In celebration of Italy’s 2026 Milano-Cortina Olympics, here’s some Fall 2025 research (suitable any time of year) into the science of spaghetti and the cooking of a pasta sauce, cacio e pepe.
Why does your spaghetti (especially the gluten-free kind) get mushy?
Associate professor Andrea Scotti explains,
An October 28, 2025 news item on ScienceDaily announced research into spaghetti from Sweden’s Lund University,
Why doesn’t spaghetti fall apart when it’s boiled? According to new scientific findings, the key is gluten. The amount of salt added to the cooking water also plays a surprisingly important part in keeping pasta firm and intact.
Using advanced techniques, researchers examined the internal structure of regular and gluten-free spaghetti – straight off the shelf. The results show that gluten has a crucial role in protecting the structure of pasta during cooking.
“We were able to show that the gluten in regular spaghetti acts as a safety net that preserves the starch. The gluten-free pasta, which contains an artificial matrix, only works optimally under exactly the right cooking conditions – otherwise the structure easily falls apart,” says Andrea Scotti, senior lecturer in physical chemistry at Lund University.
Scotti used both small-angle neutron scattering and X-rays in the research. These methods make it possible to study foods at the microscopic level – down to a billionth of a metre – and link these findings to product characteristics such as texture, shelf life and glycaemic index.
The new study also concludes that the salt in the pasta water plays a role in the end result.
“Our results show that regular pasta has higher tolerance, or better structural resistance, to less optimal cooking conditions such as being cooked for too long or too much salt being added to the water. So, cooking pasta with the right amount of salt is not just a matter of taste – it also affects the microstructure of the pasta and thus the whole dining experience,” says Andrea Scotti.
The researchers now plan to continue their work by studying more types of pasta and different manufacturing conditions, as well as replicating what happens to the pasta once it is in the stomach, to see what effect digestion has on its chemical structure.
“With demand for gluten-free alternatives increasing, we hope that our methods can help develop more durable and nutritious products that stand up to the demands placed on them by both the cooking process and by consumers,” says Andrea Scotti.
The research was conducted together with Judith Houston, lead instrument scientist for the LoKI instrument at the European Spallation Source (ESS) in Lund, Sweden, and collaborators from the Institut Laue-Langevin in France and the Diamond Light Source and ISIS Neutron and Muon Source in the UK [United Kingdom].
The UK’s Diamond Light Source facility issued a September 5, 2025 press release taking a more technical approach to describing the work,
Want tastier gluten-free spaghetti? Small angle scattering shows you need to follow the instructions.
Using small angle neutron and X-ray scattering, researchers from the European Spallation Source and RWTH Aachen University have compared the nanostructure of gluten-free and normal spaghetti, finding that the kind with gluten is much more forgiving to varied cooking conditions.
Andrea Scotti from RWTH Aachen University and Judith Houston from the European Spallation Source (ESS) have worked with Nathan Cowieson from Diamond’s B21 beamline and Greg Smith from ISIS Neutron and Muon Source as well as collaborators from the Institut Laue-Langevin to study the nanostructure of spaghetti. More specifically, they were looking at the different structures created by gluten-free spaghetti, in comparison to gluten-containing spaghetti.
Normal pasta is made up mostly of starch and gluten. Starch forms ball-like structures that expand when the pasta is boiled. Gluten, however, is more of a stringy mesh. It tangles around the balls of starch, preventing them from falling to pieces upon expansion. Gluten-free options need to overcome this problem through other means. Currently, these tend to leave the pasta with a strange chewy texture, for a generally less appealing experience in comparison to gluten-containing options.
Aiming to improve this mouthfeel, these researchers used small angle scattering to investigate the nanostructure of spaghetti. Their X-ray experiments involved comparing spaghetti when it was raw, boiled for a variety of cooking times, and boiled with salt.
They also saw that salt affects not only the taste but also the structural integrity. Adding salt preserved the structure of the spaghetti, but only when used at the right concentration, and if the pasta was cooked for the right length of time.
For their neutron experiments, the researchers cooked the spaghetti in D2O in one of the ISIS labs, slicing the spaghetti into tiny pieces before loading it into the sample chamber. By using different mixtures of H2O and D2O, they could make samples that each highlighted a different component of interest, with others appearing invisible to the neutron beam. This meant they could separate the effect of cooking on the starches and the gluten.
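The contrast-variation trick described above can be sketched numerically. The scattering length densities (SLDs) of pure H2O and D2O are standard tabulated values, and an H2O/D2O mixture's SLD varies linearly between them; the component SLDs below are illustrative assumptions only, since the releases give no figures for pasta starch or gluten:

```python
# Contrast matching for small angle neutron scattering: a component
# becomes "invisible" when the solvent's scattering length density (SLD)
# equals its own. For an H2O/D2O mixture the solvent SLD interpolates
# linearly, so the matching D2O volume fraction is:
#   f_D2O = (rho - rho_H2O) / (rho_D2O - rho_H2O)

RHO_H2O = -0.56  # SLD of pure H2O, in 1e-6 / Angstrom^2 (standard value)
RHO_D2O = 6.36   # SLD of pure D2O, in 1e-6 / Angstrom^2 (standard value)

def d2o_match_fraction(rho_component):
    """D2O volume fraction at which a component of the given SLD vanishes."""
    return (rho_component - RHO_H2O) / (RHO_D2O - RHO_H2O)

# Illustrative SLDs only: the real values for pasta starch and gluten
# depend on composition and hydrogen/deuterium exchange.
for name, rho in [("starch (assumed)", 1.6), ("gluten (assumed)", 2.5)]:
    print(f"{name}: match point at {d2o_match_fraction(rho):.0%} D2O")
```

Mixing to the match point of one component leaves only the other visible to the beam, which is how the researchers could follow starch and gluten separately.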
They found that the starch granules swell upon cooking, and tend to disperse, whereas the gluten proteins become insoluble and coagulate into a network. This has the effect of trapping the starch and retaining the structure of the pasta.
In the gluten-free spaghetti, this network is missing, which means the starch granules can over-swell. This is why gluten-free pasta can fall apart or become sticky during cooking, especially if cooked for longer than the manufacturer’s instructions.
The researchers plan to continue their work, using small angle neutron and X-ray experiments to study pasta varieties of different shapes and manufacturing conditions, as well as replicating what happens to the pasta once it’s inside the stomach and seeing what effect that has on the structure.
The Ig Nobel Prize honors research that first makes people laugh, then makes them think. Its 35th award ceremony possibly also makes people hungry: ISTA [Institute of Science and Technology Austria] physicist Fabrizio Olmeda and colleagues researched the secret of a perfect cacio e pepe pasta sauce. They received the popular award for their findings on Thursday evening [September 18, 2025] in Boston [at Boston University].
ISTA postdoc Fabrizio Olmeda chose statistical physics in the field of complex systems as his research area because it allowed him to apply theoretical physics to a wide range of disciplines, from biology to sociology. “My motivation will always be to investigate phenomena that fascinate me, even if they lie outside my field of expertise, which is the physics of single-cell genomics,” says the newly awarded Ig Nobel Prize winner. “Despite increasing specialization, I believe that even in my usual field of research, it can be beneficial to take some time to explore something unusual. I think this award reflects this idea, because its motto, ‘First laugh, then think,’ can inspire people to take an interest in science.”
Martin Hetzer, president of ISTA, emphasizes this: “A mentor once told me: As long as you’re having fun, you’re doing it right. The Ig Nobel Prize is a wonderful tribute to this credo. At first, the question of how to prepare the perfect Cacio e Pepe pasta may sound funny. But real curiosity-driven research brings together creativity, perseverance, precision, and fun. And it always leads to discoveries that have the potential to improve our world a little bit—on a large scale with innovations or on a small scale on our plates.”
What’s simmering in the lab? The recipe for delicious research
And that is the essence of the peer-reviewed study published in the scientific journal Physics of Fluids and now honored with the award: Simply mixing the usual ingredients – Pecorino cheese, pasta water, pepper, and pasta – often results in a lumpy, mozzarella-like sauce. Why? The starch in the pasta water is supposed to help emulsify and stabilize the sauce, but it is rarely enough on its own. When the temperature rises above 65 degrees Celsius, the cheese proteins denature and clump together, causing the mixture to break down.
The researchers found that the key to the perfect sauce is the right amount of starch. Simply stir starch powder (2–3% of the cheese mass) into the water until the water becomes clear and thickens. Now mix this gel with the cheese at a low temperature so that the starch binds with the proteins and prevents lumps. Then season with pepper as usual. Mix the pasta with the sauce in the pan and add a little pasta water if necessary to achieve the right consistency.
Ingredients:
4 g starch (potato or corn starch)
40 ml water (to mix the starch)
160 g Pecorino Romano
240 g pasta (ideally tonnarelli)
Pasta cooking water
Black pepper and salt (to taste)
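As a back-of-envelope check on the ratios above — a sketch only, where the function name is mine and the 10:1 water-to-starch ratio is inferred from the ingredient list rather than stated as a rule in the paper:

```python
def cacio_e_pepe_amounts(cheese_g, starch_fraction=0.025):
    """Scale the starch-stabilized sauce from the cheese mass.

    starch_fraction: 2-3% of the cheese mass, per the Physics of Fluids
    study. The ~10:1 water-to-starch ratio for the gel is inferred from
    the ingredient list (40 ml water for 4 g starch).
    """
    if not 0.02 <= starch_fraction <= 0.03:
        raise ValueError("study recommends 2-3% starch relative to cheese")
    starch_g = cheese_g * starch_fraction
    gel_water_ml = starch_g * 10  # water to dissolve the starch into a gel
    return {"starch_g": round(starch_g, 1), "gel_water_ml": round(gel_water_ml)}

# The listed 160 g of Pecorino Romano gives back the listed 4 g of starch
# and 40 ml of water at the midpoint (2.5%) of the recommended range.
print(cacio_e_pepe_amounts(160))
```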
Also at the VISTA Science Experience Center
The research question about the perfect pasta is just one of countless questions and topics that have been and continue to be pursued at ISTA in Klosterneuburg by outstanding scientists from around 80 countries. Visitors will soon be able to learn about a selection of these topics—including the now award-winning pasta research—at the VISTA Science Experience Center. The center will open in the heart of the ISTA campus with a festival from October 3 to 5, 2025. Admission is free.
Here’s a link to and a citation for the paper,
Phase behavior of Cacio e Pepe sauce by G. Bartolucci, D. M. Busiello, M. Ciarchi, A. Corticelli, I. Di Terlizzi, F. Olmeda, D. Revignas, and V. M. Schimmenti. Physics of Fluids, Volume 37, Issue 4, 044122 (April 2025). DOI: https://doi.org/10.1063/5.0255841
This paper is open access.
For anyone who wants to know more about the 2025 Ig Nobels, there’s (1) a September 18, 2025 article by Jennifer Ouellette for Ars Technica,
Does alcohol enhance one’s foreign language fluency? Do West African lizards have a preferred pizza topping? And can painting cows with zebra stripes help repel biting flies? These and other unusual research questions were honored tonight in a virtual ceremony to announce the 2025 recipients of the annual Ig Nobel Prizes. Yes, it’s that time of year again, when the serious and the silly converge—for science.
Established in 1991, the Ig Nobels are a good-natured parody of the Nobel Prizes; they honor “achievements that first make people laugh and then make them think.” The unapologetically campy awards ceremony features miniature operas, scientific demos, and the 24/7 lectures whereby experts must explain their work twice: once in 24 seconds and again in just seven words.
Acceptance speeches are limited to 60 seconds. And as the motto implies, the research being honored might seem ridiculous at first glance, but that doesn’t mean it’s devoid of scientific merit. In the weeks following the ceremony, the winners will also give free public talks, which will be posted on the Improbable Research website.
Without further ado, here are the winners of the 2025 Ig Nobel prizes.
…
There’s also (2) this undated posting on Improbable Research, which provides videos of the event and more, as well as a list of in-person events continuing right into January 2026 in Tokyo and a request,
…
We Ask for Your Help: Donate to the Ig!
As per unfortunate tradition, the Ig Nobel Prize ceremony is funded on a thread of a shoestring. If you or your organization would like to help the Ig thrive, please donate to the Ig!
Caption: Image captured by an electron microscope of a single nanowire memristor (highlighted in colour to distinguish it from other nanowires in the background image). Blue: silver electrode, orange: nanowire, yellow: platinum electrode. Blue bubbles are dispersed over the nanowire. They are made up of silver ions and form a bridge between the electrodes which reduces the resistance. Credit: Forschungszentrum Jülich
Not a popsicle but a representation of a device (memristor) scientists claim mimics a biological nerve cell according to a December 5, 2018 news item on ScienceDaily,
Scientists from Jülich [Germany] together with colleagues from Aachen [Germany] and Turin [Italy] have produced a memristive element made from nanowires that functions in much the same way as a biological nerve cell. The component is able to both save and process information, as well as receive numerous signals in parallel. The resistive switching cell made from oxide crystal nanowires is thus proving to be the ideal candidate for use in building bioinspired “neuromorphic” processors, able to take over the diverse functions of biological synapses and neurons.
Computers have learned a lot in recent years. Thanks to rapid progress in artificial intelligence they are now able to drive cars, translate texts, defeat world champions at chess, and much more besides. In doing so, one of the greatest challenges lies in the attempt to artificially reproduce the signal processing in the human brain. In neural networks, data are stored and processed to a high degree in parallel. Traditional computers on the other hand rapidly work through tasks in succession and clearly distinguish between the storing and processing of information. As a rule, neural networks can only be simulated in a very cumbersome and inefficient way using conventional hardware.
Systems with neuromorphic chips that imitate the way the human brain works offer significant advantages. Experts in the field describe this type of bioinspired computer as being able to work in a decentralised way, having at its disposal a multitude of processors, which, like neurons in the brain, are connected to each other by networks. If a processor breaks down, another can take over its function. What is more, just like in the brain, where practice leads to improved signal transfer, a bioinspired processor should have the capacity to learn.
“With today’s semiconductor technology, these functions are to some extent already achievable. These systems are however suitable for particular applications and require a lot of space and energy,” says Dr. Ilia Valov from Forschungszentrum Jülich. “Our nanowire devices made from zinc oxide crystals can inherently process and even store information, as well as being extremely small and energy efficient,” explains the researcher from Jülich’s Peter Grünberg Institute.
For years memristive cells have been ascribed the best chances of being capable of taking over the function of neurons and synapses in bioinspired computers. They alter their electrical resistance depending on the intensity and direction of the electric current flowing through them. In contrast to conventional transistors, their last resistance value remains intact even when the electric current is switched off. Memristors are thus fundamentally capable of learning.
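The behaviour described here — resistance set by the history of current flow, and retained when the drive is removed — can be sketched with the textbook linear-drift memristor model of Strukov et al. (2008). This is a generic idealization with made-up parameter values, not a model of the Jülich nanowire device:

```python
# Linear-drift memristor sketch: an internal boundary w (0..D) mixes a
# low resistance R_ON and a high resistance R_OFF; current through the
# device moves the boundary, and the boundary (hence the resistance)
# stays put when the current stops. All parameter values are illustrative.
R_ON, R_OFF = 100.0, 16_000.0   # ohms
D = 10e-9                        # device thickness, m
MU = 1e-14                       # ion mobility, m^2 / (V s)
dt = 1e-5                        # time step, s

def resistance(w):
    """Instantaneous resistance for boundary position w."""
    return R_ON * (w / D) + R_OFF * (1 - w / D)

w = 0.1 * D                      # initial boundary position
trace = []
for step in range(100_000):
    v = 1.0 if step < 50_000 else 0.0   # 0.5 s write pulse, then voltage off
    i = v / resistance(w)
    w += MU * (R_ON / D) * i * dt        # ionic drift moves the boundary
    w = min(max(w, 0.0), D)              # boundary stays inside the device
    trace.append(resistance(w))

print(f"after pulse:  {trace[49_999]:.0f} ohms")
print(f"0.5 s later:  {trace[-1]:.0f} ohms (state retained with no current)")
```

During the pulse the resistance drops as charge flows; once the voltage is removed, no current flows, the boundary stops moving, and the last resistance value persists — the "memory" in memristor.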
In order to create these properties, scientists at Forschungszentrum Jülich and RWTH Aachen University used a single zinc oxide nanowire, produced by their colleagues from the polytechnic university in Turin. Measuring approximately one ten-thousandth of a millimeter in size, this type of nanowire is over a thousand times thinner than a human hair. The resulting memristive component not only takes up a tiny amount of space, but also is able to switch much faster than flash memory.
Nanowires offer promising novel physical properties compared to other solids and are used among other things in the development of new types of solar cells, sensors, batteries and computer chips. Their manufacture is comparatively simple. Nanowires result from the evaporation deposition of specified materials onto a suitable substrate, where they practically grow of their own accord.
In order to create a functioning cell, both ends of the nanowire must be attached to suitable metals, in this case platinum and silver. The metals function as electrodes, and in addition, release ions triggered by an appropriate electric current. The metal ions are able to spread over the surface of the wire and build a bridge to alter its conductivity.
Components made from single nanowires are, however, still too isolated to be of practical use in chips. Consequently, the next step being planned by the Jülich and Turin researchers is to produce and study a memristive element, composed of a larger, relatively easy to generate group of several hundred nanowires offering more exciting functionalities.
The Italians have also written about the work in a December 4, 2018 news item for the Politecnico di Torino’s in-house magazine, PoliFlash. I like the image they’ve used better as it offers a bit more detail and looks less like a popsicle. First, the image,
Courtesy: Politecnico di Torino
Now, the news item, which includes some historical information about the memristor (Note: There is some repetition and links have been removed),
Emulating and understanding the human brain is one of the most important challenges for modern technology: on the one hand, the ability to artificially reproduce the processing of brain signals is one of the cornerstones for the development of artificial intelligence, while on the other the understanding of the cognitive processes at the base of the human mind is still far away.
And the research published in the prestigious journal Nature Communications by Gianluca Milano and Carlo Ricciardi, PhD student and professor, respectively, of the Applied Science and Technology Department of the Politecnico di Torino, represents a step forward in these directions. In fact, the study entitled “Self-limited single nanowire systems combining all-in-one memristive and neuromorphic functionalities” shows how it is possible to artificially emulate the activity of synapses, i.e. the connections between neurons that regulate the learning processes in our brain, in a single “nanowire” with a diameter thousands of times smaller than that of a hair.
It is a crystalline nanowire that takes the “memristor”, the electronic device able to artificially reproduce the functions of biological synapses, to a more performing level. Thanks to the use of nanotechnologies, which allow the manipulation of matter at the atomic level, it was for the first time possible to combine into one single device the synaptic functions that were individually emulated through specific devices. For this reason, the nanowire allows an extreme miniaturisation of the “memristor”, significantly reducing the complexity and energy consumption of the electronic circuits necessary for the implementation of learning algorithms.
Starting from the theorisation of the “memristor” in 1971 by Prof. Leon Chua – now visiting professor at the Politecnico di Torino, who was conferred an honorary degree by the University in 2015 – this new technology will not only allow smaller and more performing devices to be created for the implementation of increasingly “intelligent” computers, but is also a significant step forward for the emulation and understanding of the functioning of the brain.
“The nanowire memristor – said Carlo Ricciardi – represents a model system for the study of physical and electrochemical phenomena that govern biological synapses at the nanoscale. The work is the result of the collaboration between our research team and the RWTH University of Aachen in Germany, supported by INRiM, the National Institute of Metrological Research, and IIT, the Italian Institute of Technology.”
This is kind of fascinating. A German research team based at JARA (Jülich Aachen Research Alliance) is suggesting that memristive theory be extended beyond passive components in their paper about Resistive Memory Cells (ReRAM) which was recently published in Nature Communications. From the Apr. 26, 2013 news item on Azonano,
Resistive memory cells (ReRAM) are regarded as a promising solution for future generations of computer memories. They will dramatically reduce the energy consumption of modern IT systems while significantly increasing their performance.
Unlike the building blocks of conventional hard disk drives and memories, these novel memory cells are not purely passive components but must be regarded as tiny batteries. This has been demonstrated by researchers of Jülich Aachen Research Alliance (JARA), whose findings have now been published in the prestigious journal Nature Communications. The new finding radically revises the current theory and opens up possibilities for further applications. The research group has already filed a patent application for their first idea on how to improve data readout with the aid of battery voltage.
The Apr. 23, 2013 JARA news release, which originated the news item, provides some background information about data memory before going on to discuss the ReRAMs,
Conventional data memory works on the basis of electrons that are moved around and stored. However, even by atomic standards, electrons are extremely small. It is very difficult to control them, for example by means of relatively thick insulator walls, so that information will not be lost over time. This does not only limit storage density, it also costs a great deal of energy. For this reason, researchers are working feverishly all over the world on nanoelectronic components that make use of ions, i.e. charged atoms, for storing data. Ions are some thousands of times heavier than electrons and are therefore much easier to ‘hold down’. In this way, the individual storage elements can almost be reduced to atomic dimensions, which enormously improves the storage density.
Here’s how the ions behave in ReRAMs (from the news release),
In resistive switching memory cells (ReRAMs), ions behave on the nanometre scale in a similar manner to a battery. The cells have two electrodes, for example made of silver and platinum, at which the ions dissolve and then precipitate again. This changes the electrical resistance, which can be exploited for data storage. Furthermore, the reduction and oxidation processes also have another effect. They generate electric voltage. ReRAM cells are therefore not purely passive systems – they are also active electrochemical components. Consequently, they can be regarded as tiny batteries whose properties provide the key to the correct modelling and development of future data storage.
In complex experiments, the scientists from Forschungszentrum Jülich and RWTH Aachen University determined the battery voltage of typical representatives of ReRAM cells and compared them with theoretical values. This comparison revealed other properties (such as ionic resistance) that were previously neither known nor accessible. “Looking back, the presence of a battery voltage in ReRAMs is self-evident. But during the nine-month review process of the paper now published we had to do a lot of persuading, since the battery voltage in ReRAM cells can have three different basic causes, and the assignment of the correct cause is anything but trivial,” says Dr. Ilia Valov, the electrochemist in Prof. Rainer Waser’s research group.
This discovery could lead to optimizing ReRAMs and exploiting them in new applications (from the news release),
“The new findings will help to solve a central puzzle of international ReRAM research,” says Prof. Rainer Waser, deputy spokesman of the collaborative research centre SFB 917 ‘Nanoswitches’ established in 2011. In recent years, these puzzling aspects include unexplained long-term drift phenomena or systematic parameter deviations, which had been attributed to fabrication methods. “In the light of this new knowledge, it is possible to specifically optimize the design of the ReRAM cells, and it may be possible to discover new ways of exploiting the cells’ battery voltage for completely new applications, which were previously beyond the reach of technical possibilities,” adds Waser, whose group has been collaborating for years with companies such as Intel and Samsung Electronics in the field of ReRAM elements.
The part I found most interesting, given my interest in memristors, is this bit about extending the memristor theory, from the news release,
The new finding is of central significance, in particular, for the theoretical description of the memory components. To date, ReRAM cells have been described with the aid of the concept of memristors – a portmanteau word composed of “memory” and “resistor”. The theoretical concept of memristors can be traced back to Leon Chua in the 1970s. It was first applied to ReRAM cells by the IT company Hewlett-Packard in 2008. It aims at the permanent storage of information by changing the electrical resistance. The memristor theory leads to an important restriction. It is limited to passive components. “The demonstrated internal battery voltage of ReRAM elements clearly violates the mathematical construct of the memristor theory. This theory must be expanded to a whole new theory – to properly describe the ReRAM elements,” says Dr. Eike Linn, the specialist for circuit concepts in the group of authors. [emphases mine] This also places the development of all micro- and nanoelectronic chips on a completely new footing.
Michael Krämer of the RWTH Aachen University (Germany) muses about philosophy, the Higgs Boson, and more in a Mar. 24, 2013 posting on Jon Butterworth’s Life and Physics blog (Guardian science blogs; Note: A link has been removed),
Many of the great physicists of the 20th century have appreciated the importance of philosophy for science. Einstein, for example, wrote in a letter in 1944:
I fully agree with you about the significance and educational value of methodology as well as history and philosophy of science. So many people today—and even professional scientists—seem to me like somebody who has seen thousands of trees but has never seen a forest.
At the same time, physics has always played a vital role in shaping ideas in modern philosophy. It appears, however, that we are now faced with the ruins of this beautiful marriage between physics and philosophy. Stephen Hawking has claimed recently that philosophy is “dead” because philosophers have not kept up with science …
Krämer is part of an interdisciplinary (physics and philosophy) project at the LHC (Large Hadron Collider at CERN [European Particle Physics Laboratory]), The Epistemology of the Large Hadron Collider. From the project home page (Note: A link has been removed),
This research collaboration works at the crossroads of physics, philosophy of science, and contemporary history of science. It aims at an epistemological analysis of the recently launched new accelerator experiment at CERN, the Large Hadron Collider (LHC). Central themes are (i) the mechanisms of generating the masses of the particles of the standard model, especially the Higgs-mechanism and the Higgs-particle the LHC has set out to detect; (ii) the ongoing research process with special emphasis on the interaction between a large experiment and a community of theoreticians; and (iii) the implications of an experiment that is characterized by its enormous complexity and the need to be highly selective in data gathering. With the heading “Epistemology of the LHC” the research group intends both a philosophical analysis of the theoretical structures and of the conditions of knowledge production, among them the criteria of acceptance, and a real-time monitoring of the ongoing physical development from the perspective of the history of science. The research group has emerged from a collaboration between a High Energy Working group and the Interdisciplinary Centre for Science and Technology Studies and is based in Wuppertal but also involves external members and collaborators.
Krämer shares some of his ideas and the type of thinking generated when physicists and philosophers collide (I plead guilty to the word play; from Butterworth’s Guardian science blog),
… The relationship between experiment and theory (what impact does theoretical prejudice have on empirical findings?) or the role of models (how can we assess the uncertainty of a simplified representation of reality?) are scientific issues, but also issues from the foundation of philosophy of science. In that sense they are equally important for both fields, and philosophy may add a wider and critical perspective to the scientific discussion. And while not every particle physicist may be concerned with the ontological question of whether particles or fields are the more fundamental objects, our research practice is shaped by philosophical concepts. We do, for example, demand that a physical theory can be tested experimentally and thereby falsified, a criterion that has been emphasized by the philosopher Karl Popper already in 1934. The Higgs mechanism can be falsified, because it predicts how Higgs particles are produced and how they can be detected at the Large Hadron Collider.
On the other hand, some philosophers tell us that falsification is strictly speaking not possible: What if a Higgs property does not agree with the standard theory of particle physics? How do we know it is not influenced by some unknown and thus unaccounted factor, like a mysterious blonde walking past the LHC experiments and triggering the Higgs to decay? (This was an actual argument given in the meeting!)
The first international conference and kick-off meeting of the German Society for Philosophy of Science/Gesellschaft für Wissenschaftsphilosophie (GWP) will take place from 11-14 March 2013 at the University of Hannover under the title:
How Much Philosophy in the Philosophy of Science?
Krämer then highlights some of the discussion that most interested him (Note: A link has been removed),
… It is very hard for a philosopher to keep up with scientific progress, and how could one integrate various fields without having fully appreciated the essential features of the individual sciences? As Margaret Morrison from the University of Toronto pointed out in her talk, if philosophy steps back too far from the individual sciences, the account becomes too general and isolated from scientific practice. On the other hand, if philosophy is too close to an individual science, it may not be philosophy any longer.
I think philosophy of science should not consider itself primarily as a service to science, but rather identify and answer questions within its own domain. I certainly would not be concerned if my own research went unnoticed by biologists, chemists, or philosophers, as long as it advances particle physics. On the other hand, as Morrison pointed out, science does generate its own philosophical problems, and philosophy may provide some kind of broader perspective for understanding those problems.
It’s well worth reading Krämer’s full post for anyone who’s interested in how physicists (or Krämer) think about the role that philosophy could play (or not) in the field of physics.
The reference to Margaret Morrison from the University of Toronto (U of T) reminded me of the Bubble Chamber blog which is written by U of T historians and philosophers of science. Here’s a July 10, 2012 posting by Mike Thicke about the Higgs Boson and his response to philosopher Wayne Myrvold’s (University of Western Ontario) explanation of the statistics claims being made about the particle at that time,
We can all agree that reasoning and decision making in science is complicated. Scientists reason in many different contexts: in the lab, in their published papers, as career-minded professionals, as interested consumers of science, and as people going about their lives. It’s plausible to think that they reason in different ways in all of these contexts. When we’re discussing their reasoning as scientists, I believe distinguishing between the first three contexts is especially important. While Wayne’s explanation of the statistics behind the Higgs Boson discovery is very interesting, informative, and as far as I can tell correct, I think there are some confusions arising from his failure to make these distinctions.
Thicke does advise reading Myrvold’s July 4, 2012 posting before tackling his riposte.