
In this post Francesco discusses a highly experimental concept in the world of AI: VortexNet. Unlike other learning systems, VortexNet isn’t inspired by neuroscience or statistics, but by fluid dynamics – flows of information that swirl, oscillate, and resonate like water around an obstacle. Could this radical new concept transcend the limitations of today’s neural networks?
Artificial intelligence is no stranger to bold ideas, but every once in a while a concept appears that feels genuinely alien, something that doesn’t just tweak existing methods, but questions their very foundations. VortexNet is one of those ideas. Inspired not by neuroscience or statistics, but by fluid dynamics, this experimental neural architecture proposes a striking shift in how we think about learning systems: not as chains of static mathematical operations, but as flows of information that swirl, oscillate, and resonate like water around an obstacle. At first glance, it sounds almost too eccentric to take seriously. Yet behind the poetic metaphor lies a rigorous mathematical framework grounded in modified Navier–Stokes equations, the same equations used to describe the motion of fluids. The proposal comes from researcher Samin Winer, and while still at an early, largely theoretical stage, it raises provocative questions about some of the most stubborn limitations in modern AI, and in particular in large language models.
WHAT’S BROKEN IN TODAY’S NEURAL NETWORKS?
Despite their remarkable successes, today’s deep learning systems remain constrained by well-known structural problems.
One of the oldest is the vanishing gradient problem. As training signals propagate backwards through deep networks, they often shrink to near zero, leaving early layers with little ability to learn. The consequence is a kind of informational decay: the deeper the network, the harder it becomes to preserve meaningful learning signals at its foundations.
A second challenge is the problem of long-range dependencies. While transformers and attention mechanisms have improved the ability to link distant parts of a sequence, the solution is expensive. Attention scales quadratically with input length, making truly long-context reasoning computationally prohibitive.
Then there is the issue of multiscale processing. Human cognition effortlessly balances fine-grained details with high-level context, switching seamlessly between letters, words, and full meanings. Neural networks struggle to achieve this balance without elaborate architectural tricks.
These problems are not unsolved as much as they are mitigated with workarounds, often at significant computational cost. VortexNet enters precisely at this point of tension, proposing that the solution may come not from more layers or more parameters (as seems to be the trend nowadays), but from physics itself.
Instead of treating information as something that flows linearly from layer to layer, VortexNet treats it as something that moves, rotates, and interacts dynamically, much like a fluid. The guiding inspiration is the von Kármán vortex street, a phenomenon observed when fluid flows past an obstacle, producing alternating vortices in its wake. These vortices transfer energy and information across scales in complex, structured patterns.
Viscosity controls how quickly information diffuses. Convection determines how activations carry information through the system. Forcing represents the external input. Together, these components allow learning signals to circulate and resonate, rather than merely propagate.
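The paper does not publish reference code, but the interplay of these three terms can be sketched numerically. The snippet below is a minimal, hypothetical illustration (the function name, coefficients, and 1D discretisation are my own assumptions, not VortexNet's actual formulation) of one update of a periodic "activation field" combining diffusion (viscosity), convection, and external forcing, in the spirit of a discretised Navier–Stokes step:

```python
import numpy as np

def vortex_step(u, dt=0.01, nu=0.1, forcing=None):
    """One hypothetical update of a 1D periodic activation field u.
    Combines diffusion (viscosity), convection, and forcing."""
    # Diffusion: nu * d2u/dx2 smooths activations (viscosity term).
    diffusion = nu * (np.roll(u, -1) - 2.0 * u + np.roll(u, 1))
    # Convection: -u * du/dx lets activations carry themselves along.
    convection = -u * 0.5 * (np.roll(u, -1) - np.roll(u, 1))
    # Forcing: external input injected into the field.
    f = forcing if forcing is not None else np.zeros_like(u)
    return u + dt * (diffusion + convection + f)

u = np.sin(np.linspace(0, 2 * np.pi, 64, endpoint=False))
for _ in range(100):
    u = vortex_step(u)
```

With a small time step the field stays bounded: the viscosity term dissipates what the convection term steepens, which is exactly the balance the fluid metaphor relies on.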
A particularly elegant adaptation is the introduction of a Strouhal neural number, inspired by the Strouhal number in fluid mechanics, which predicts vortex shedding frequencies. In the neural context, this ratio governs how activations oscillate across layers, helping the network discover its own natural resonant frequencies, much like pushing a swing at just the right rhythm to amplify motion with minimal effort.
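For a concrete reference point, the classical Strouhal relation ties shedding frequency to flow velocity and obstacle size, with St roughly 0.2 for a cylinder wake. How VortexNet maps these quantities onto layer activations is not spelled out here, so the snippet only illustrates the fluid-mechanics relation itself:

```python
def strouhal_frequency(st: float, velocity: float, length: float) -> float:
    """Vortex-shedding frequency from the Strouhal relation f = St * U / L."""
    return st * velocity / length

# Cylinder wake, St ~ 0.2: a 0.1 m obstacle in a 1 m/s flow
# sheds vortices at about 2 Hz.
f = strouhal_frequency(0.2, velocity=1.0, length=0.1)  # → 2.0
```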
To prevent uncontrolled oscillations, VortexNet introduces an adaptive damping mechanism, continuously adjusting stability during training. The goal is to keep the system at the so-called ‘edge of chaos’: stable enough to learn, yet expressive enough to model complex dynamics.
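The exact mechanism is not described in detail, but the idea of adaptive damping can be sketched with a toy system: a driven oscillator whose damping coefficient is nudged online toward a target amplitude, so the oscillation neither explodes nor dies out. Every name and constant below is an assumption for illustration only:

```python
import numpy as np

def run_adaptive_damping(steps=5000, dt=0.01, target_amp=1.0):
    """Toy sketch of adaptive damping: a driven oscillator
    x'' = F(t) - gamma * x' - w0^2 * x, where gamma is adjusted
    online so the amplitude tracks a set-point."""
    x, v, gamma = 0.0, 0.0, 0.1
    w0 = 2.0 * np.pi                      # natural frequency (assumed)
    for t in range(steps):
        drive = np.cos(w0 * t * dt)       # resonant external forcing
        a = drive - gamma * v - w0**2 * x
        v += dt * a                        # semi-implicit Euler step
        x += dt * v
        # Too much energy -> damp harder; too little -> relax the damping.
        gamma = max(0.0, gamma + 0.5 * (abs(x) - target_amp) * dt)
    return x, gamma

x, gamma = run_adaptive_damping()
```

The oscillator is forced at resonance, which would normally amplify it far past the set-point; the adapting `gamma` pulls the amplitude back, keeping the system responsive but bounded, which is the qualitative behaviour the ‘edge of chaos’ framing asks for.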
SO, WHY COULD THIS MATTER?
If VortexNet’s theoretical promises hold, the implications are substantial.
First, vanishing gradients may be alleviated through resonant pathways that allow learning signals to bypass rigid, layer-by-layer attenuation. Instead of information fading as it travels, it can be reinforced through distributed oscillations.
Second, the model introduces the idea of implicit attention. Unlike transformers, which explicitly compare every token to every other token, vortices influence one another naturally through their shared physics. This could provide a way to model long-range dependencies without quadratic computational cost.
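The cost argument can be illustrated independently of VortexNet's (unpublished) mechanism: purely local coupling, applied repeatedly, lets distant sequence positions influence one another at O(n) work per step, whereas explicit attention pays O(n²) in a single step. A diffusion toy model makes the point:

```python
import numpy as np

# A pulse at one end of a "sequence" of 256 positions, each coupled
# only to its immediate neighbours (periodic boundary).
n = 256
u = np.zeros(n)
u[0] = 1.0

# Repeated local diffusion steps: O(n) work each, no pairwise comparisons.
for _ in range(2000):
    u = u + 0.25 * (np.roll(u, -1) - 2.0 * u + np.roll(u, 1))

# A position halfway across the sequence has nonetheless been influenced.
influence = u[n // 2]
```

The trade-off, of course, is latency: local coupling needs many steps to span the sequence, while attention does it in one. The hope behind implicit attention is that resonant dynamics propagate influence faster than plain diffusion.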
Third, VortexNet inherently supports multi-scale processing.
Just as large vortices spawn smaller ones in turbulent flows, the architecture can, in principle, represent fine details and global structure simultaneously.
Finally, the system exhibits a form of dynamic memory. Stable oscillatory patterns, known in dynamical systems as attractors, can encode persistent information without requiring explicit memory buffers. Memory, in this view, is not stored statically but maintained through motion.
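The attractor idea can be illustrated with a textbook example, the Hopf (limit-cycle) oscillator; this is standard dynamical-systems material, not code from the VortexNet paper. A trajectory started near rest relaxes onto a cycle of fixed radius, so the oscillation, and anything encoded in its phase or amplitude, persists without any stored buffer:

```python
import math

def hopf_step(x, y, dt=0.01, mu=1.0, omega=2.0):
    """One Euler step of a Hopf oscillator. Trajectories converge to a
    stable limit cycle of radius sqrt(mu) rotating at rate omega."""
    r2 = x * x + y * y
    dx = (mu - r2) * x - omega * y
    dy = (mu - r2) * y + omega * x
    return x + dt * dx, y + dt * dy

x, y = 0.05, 0.0           # start close to the (unstable) rest state
for _ in range(5000):
    x, y = hopf_step(x, y)

radius = math.hypot(x, y)  # relaxes to ~1.0: the motion sustains itself
```

Perturb the state and it falls back onto the same cycle: memory maintained through motion, exactly as the paragraph above describes.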
YET ANOTHER SHIFT TOWARD PHYSICS-INSPIRED COMPUTING
VortexNet belongs to a wider movement in AI research that seeks to embed physical principles directly into computation. This includes physics-informed neural networks (PINNs), neural ordinary differential equations, and neuromorphic hardware. What unites these approaches is a departure from purely symbolic or statistical abstractions toward continuous, dynamical systems.
Traditional neural networks are fundamentally digital: discrete layers, discrete operations, discrete transitions. VortexNet is explicitly analogue in spirit. Information does not hop. It flows.
This perspective resonates with what we know about biological brains. Neurons do not merely switch on and off; they oscillate, synchronise, and form transient assemblies. Computation in the brain is not a clean sequence of steps but a dynamic field of interacting signals. Vortex dynamics may capture part of that deeper structure.
The proposed applications are ambitious: long-sequence modelling for genomics and medical records, time-series prediction in finance and climate science, and multimodal learning that naturally integrates vision, sound, and text across scales.
Yet realism is essential. VortexNet remains highly experimental. Its current demonstrations are limited to basic benchmarks such as MNIST digit recognition, and there are formidable challenges ahead.
The framework touches chaos theory, dynamical systems, and computational physics, all fields rich with insight but notoriously difficult to tame.
Whether VortexNet itself becomes a mainstream architecture is impossible to predict. But its deeper significance may lie elsewhere. It challenges the assumption that progress in AI must come from bigger models, longer contexts, and more parameters. Instead, it asks a more radical question: What if the very way we move information through a network is wrong?
Today’s AI landscape is dominated by transformer architectures and next-token prediction. VortexNet reminds us that this is not the only possible future. There are entire scientific domains (fluid dynamics, thermodynamics, nonlinear systems) that remain largely untapped as sources of computational inspiration.
Perhaps the next generation of intelligent systems will not resemble ever-larger language models. Perhaps they will look more like turbulent fluids, where information moves in swirling, resonant patterns rather than straight lines.
It is too early to say whether VortexNet will succeed. But as a provocation, a reminder that artificial intelligence can still be reimagined from first principles, it is one of the most intriguing ideas to surface in recent years.
