Researchers at the University of Pennsylvania School of Engineering and Applied Science are revisiting a familiar material and finding unexpected connections that extend far beyond materials science. Led by Prof. John C. Crocker, with Prof. Robert A. Riggleman and collaborators, the team has shown that the internal dynamics of foam follow the same mathematical principles used to train modern artificial intelligence systems. Their work suggests that learning, when viewed through mathematics rather than biology or software, may be a general organizing process shared by physical, biological, and computational systems.
Thirumalaiswamy, A., Rodríguez-Cruz, C., Riggleman, R. A., & Crocker, J. C. (2025). Slow relaxation and landscape-driven dynamics in viscous ripening foams. Proceedings of the National Academy of Sciences, 122(47). https://doi.org/10.1073/pnas.2518994122
Foams are common and deceptively simple materials, appearing in products ranging from soaps and foods to industrial emulsions. For decades, they have been treated as glass-like systems, where microscopic bubbles become trapped in disordered but essentially frozen arrangements. This assumption fit well with how foams appear to behave at large scales, where they hold their shape and respond elastically to small deformations.
Prof. John C. Crocker of the University of Pennsylvania stated:
“Why the mathematics of deep learning accurately characterizes foams is a fascinating question. It hints that these tools may be useful far outside of their original context, opening the door to entirely new lines of inquiry.”
Using detailed simulations of wet foams, the Penn researchers found that this picture is incomplete. Even when a foam appears stable, the bubbles within it continue to rearrange. Instead of settling into a single low-energy configuration, the system moves through many nearly equivalent arrangements over time. This slow but persistent motion had been hinted at in earlier experiments, but until recently there was no clear mathematical framework to describe it.
That framework turned out to be familiar from a very different field. Modern deep learning relies on optimization methods that gradually adjust millions or billions of parameters during training. Rather than converging to a single perfect solution, effective models tend to operate in broad, flat regions of their optimization landscape, where many configurations perform similarly well. This behavior helps AI systems generalize beyond their training data.
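The flat-minimum behavior described above can be illustrated with a toy example (not taken from the paper): gradient descent on a one-dimensional loss whose floor is broad and nearly flat, so many nearby parameter values achieve almost the same loss. The loss function and learning rate here are invented for illustration.

```python
import numpy as np

def loss(x):
    # Quartic well flattened by a log, giving a wide floor where many
    # nearby x values yield nearly identical loss (a "flat minimum").
    return np.log(1.0 + (x**2 - 1.0)**2)

def grad(x, eps=1e-6):
    # Central finite difference keeps the sketch self-contained.
    return (loss(x + eps) - loss(x - eps)) / (2 * eps)

x = 2.5    # start on a steep slope of the landscape
lr = 0.05  # learning rate (illustrative choice)
for _ in range(500):
    x -= lr * grad(x)

# Descent settles into the flat floor near x = 1, where gradients are tiny
# and many neighboring configurations perform almost equally well.
print(round(x, 3), round(loss(x), 6))
```

In a flat region the gradient nearly vanishes, so training effectively wanders among many near-equivalent solutions rather than locking into one sharp optimum, which is the property the researchers found mirrored in foams.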
When the researchers analyzed foam dynamics using the same mathematical tools, the resemblance was clear. Foam bubbles do not roll downhill into deep, stable energy minima. Instead, they remain within extended regions of configuration space where small rearrangements cost little energy. The same equations that explain why deep learning works also describe why foams continue to reorganize without losing their overall structure.
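The kind of motion described above can be sketched with a minimal overdamped-Langevin simulation on an invented two-dimensional energy landscape (steep in one direction, nearly flat in the other); the specific energy function, noise level, and step size are assumptions for illustration, not the simulation method used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(p):
    # Long, nearly flat valley along x with steep walls in y:
    # motion along x costs almost no energy.
    x, y = p
    return 5.0 * y**2 + 0.001 * x**2

def grad(p):
    x, y = p
    return np.array([0.002 * x, 10.0 * y])

p = np.array([0.0, 1.0])   # start partway up a steep wall
dt, noise = 0.01, 0.3      # time step and thermal noise (illustrative)
energies, xs = [], []
for _ in range(5000):
    # Overdamped Langevin step: drift down the gradient plus random kicks.
    p = p - dt * grad(p) + noise * np.sqrt(dt) * rng.standard_normal(2)
    energies.append(energy(p))
    xs.append(p[0])

# The particle wanders far along the flat direction while its energy stays
# low: continual rearrangement without climbing out of the low-energy region.
print(f"x range explored: {max(xs) - min(xs):.1f}, "
      f"mean energy: {np.mean(energies[100:]):.3f}")
```

The trajectory keeps moving through distinct configurations at essentially constant, low energy, which is a cartoon of bubbles rearranging within an extended flat region of the energy landscape rather than sitting in a deep minimum.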
This insight helps resolve a long-standing inconsistency in foam physics. Traditional models treated bubbles as heavy objects moving toward fixed resting points, an approach that could not fully explain experimental observations. By shifting the focus from static equilibria to landscape-driven dynamics, the new work provides a more accurate description of how these materials behave over long timescales.
The implications extend beyond foams. Many biological systems rely on internal structures that must constantly reorganize while maintaining global stability. One example is the cytoskeleton, the network of filaments inside cells that gives them shape and mechanical strength. Like foam, the cytoskeleton is neither rigid nor fully fluid, and its function depends on continuous internal motion. The researchers suggest that similar mathematical descriptions may apply there as well.
From an engineering perspective, the results point toward new ways of thinking about adaptive materials. If stability does not require complete rest, materials could be designed to remain flexible and responsive without losing integrity. The same principles might guide the development of systems that adjust to changing conditions in a controlled way, borrowing ideas from both physics and machine learning.
More broadly, the study highlights how tools developed for artificial intelligence can inform fundamental questions in physics and biology. Rather than viewing learning as a property exclusive to brains or algorithms, the work suggests it may reflect a deeper mathematical principle that governs how complex systems explore possibilities while remaining functional.

Adrian graduated with a Master's degree (1st Class Honours) in Chemical Engineering from Chester University, along with Harris. His master's research aimed to develop a standardised clean-water oxygen transfer procedure for testing the bubble diffusers currently used in the commercial wastewater industry. He has also undertaken placements in both the US and China, primarily within R&D departments, and is an associate member of the Institution of Chemical Engineers (IChemE).

