I write for business readers; if you would like the math and science, I have linked the papers here and here, and at the end of the article.
Imagine Your Brain as a Hot Cup of Tea
Now imagine trying to use that cup of tea to train Artificial Intelligence to a level currently thought to be impossible.
Entirely Absurd? Perhaps.
But such is the elegant strangeness of thermodynamic computing, a realm where heat, randomness, and physical fluctuation are harnessed to solve some of the thorniest challenges in artificial intelligence and mathematics. This is not about simulating intelligence. It's about embodying it, using the physics of the system itself.
What if your computer didn’t need a processor, because the laws of physics were the processor?
Welcome to the curious and promising frontier of thermodynamic computation.
The Problem Statement: AI’s Growing Appetite vs. Classical Hardware
Modern AI is rapidly outgrowing the capabilities of traditional computing systems.
- Training deep learning models? Monumentally power-hungry.
- Inverting large matrices? Time-consuming and inefficient.
- Sampling from complex probability distributions? Painfully slow on conventional chips.
GPUs have carried us this far, but as Moore’s Law quietly exits stage left, we need new hardware that aligns the mathematics of AI with the physics of the machine.
Enter thermodynamic computing, part physicist, part poet, and entirely unlike anything in your laptop.
What Is Thermodynamic Computing?
This new approach draws inspiration from nature, specifically from how physical systems tend toward equilibrium.
Picture marbles spilled onto a bumpy surface:
- They roll, bounce, and collide chaotically.
- Eventually, they settle into the deepest valleys.
Now imagine those valleys are the solutions to your problem. Rather than calculate an answer line by line, you construct a physical system that naturally evolves to the correct state.
That’s thermodynamic computing in essence: encode your problem into a physical medium, allow it to settle, and read out the solution once equilibrium is reached.
This is not theory. It works.
How It Works — The Oscillatory Orchestra
Let us begin with Oscillatory Neural Networks (ONNs).
Imagine a network of tiny, rhythmically beating elements, like digital metronomes, each with a “phase” akin to the swing of a pendulum. These oscillators are coupled together, influencing each other’s timing and movement.
By carefully designing their interconnections, you can craft a system that “dances” its way toward a solution. The physics underpinning this behaviour is modelled by the Kuramoto equations, which describe how synchronised systems, like blinking fireflies or neuronal networks, evolve over time.
Within an ONN:
- Each oscillator’s phase encodes part of your problem.
- The coupling strength reflects the structure of a matrix.
- As the system evolves, it converges on a stable, low-energy state, which, miraculously, is the solution.
This is not poetic metaphor. It is grounded in mathematics.
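For readers who want to see the mathematics move, the synchronisation the Kuramoto equations describe can be sketched in a few lines of Python. This is a toy model of my own, not the hardware's actual dynamics: n oscillators with random natural frequencies, each pulled toward its neighbours' phases by a coupling strength K.

```python
import numpy as np

def kuramoto_order(K, n=20, steps=4000, dt=0.01, seed=0):
    """Simulate n coupled Kuramoto oscillators and return the final
    order parameter r (r close to 1 means the phases have locked)."""
    rng = np.random.default_rng(seed)
    omega = rng.normal(0.0, 0.1, n)        # natural frequencies
    theta = rng.uniform(0, 2 * np.pi, n)   # random initial phases
    for _ in range(steps):
        # each oscillator is nudged toward the phases of the others
        pull = np.sin(theta[None, :] - theta[:, None]).sum(axis=1)
        theta += dt * (omega + (K / n) * pull)
    return abs(np.exp(1j * theta).mean())

print(kuramoto_order(K=0.0))  # weak coupling: phases stay scattered
print(kuramoto_order(K=2.0))  # strong coupling: phases lock together
```

With the coupling switched off, the phases drift independently; past a critical coupling strength they lock into step, which is exactly the kind of collective settling an ONN exploits to compute.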
Under the Hood - Energy, Noise, and Natural Solutions
The Core Mechanism: Energy Minimisation
Many computational problems, such as matrix inversion, can be expressed as an energy function. The system wants to evolve to minimise this energy, to find its lowest possible value. This is where thermodynamic computing may just excel: it lets the physical system naturally minimise this function through its dynamics.
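The maths behind that sentence is compact enough to show. Solving a linear system A x = b, for instance, is the same as minimising the energy E(x) = ½ xᵀA x − bᵀx, whose lowest point sits exactly at x = A⁻¹b. Below is a minimal Python sketch of my own, with plain gradient descent standing in for the physical relaxation:

```python
import numpy as np

# Solve A x = b by letting a state vector "roll downhill" on the
# energy E(x) = 0.5 * x^T A x - b^T x.  For positive definite A,
# the bottom of that landscape is exactly x = A^{-1} b.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])   # positive definite "problem" matrix
b = np.array([1.0, 0.0])

x = np.zeros(2)              # start anywhere on the landscape
lr = 0.1                     # how quickly the state relaxes
for _ in range(500):
    grad = A @ x - b         # slope of the energy landscape at x
    x -= lr * grad           # move downhill

print(x)                     # ≈ np.linalg.solve(A, b)
```

A thermodynamic device performs the analogue of this loop for free: its state drifts down the energy landscape because the physics says it must.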
Embracing Noise as a Computational Ally
In most digital systems, and in quantum systems, noise is the enemy. But in thermodynamic computing, it's the secret weapon.
Picture yourself dropped in the middle of a vast, foggy moor.
Your mission? Simple on paper: find the cosy village pub, hidden somewhere among the hills and valleys.
Now, if you were a classical computer, you’d start plotting a grid, moving one square at a time, tediously checking each position: “Is this the pub? No. Move on.”
If you were a quantum computer, you might try every route simultaneously, using quantum tunnelling and spooky entanglement to “cheat” the landscape. That’s assuming, of course, the wind doesn’t blow your wavefunction apart; quantum computing is famously averse to noise.
But if you’re a thermodynamic computer, your approach is gloriously different.
You roll downhill.
You’re a slightly tipsy traveller, rolling, meandering, stumbling about the moor. You’re not marching with a map; you’re letting gravity (i.e., the energy landscape) guide you.
Imagine, if you will, that the moor is riddled with bogs and false trails: local minima. If you were purely logical, you might get stuck in the first ditch that feels comfy and think, “This is the answer.”
But because you’re also buffeted by a bit of random breeze blowing over the moor — thermal noise — every now and then you get nudged out of the bog and back on the path. These nudges don’t break you - they help you keep moving toward deeper, better valleys.
Eventually, your roll stops when you are nestled in the lowest, cosiest dip in the moor, beside the warm fire of the village pub. That, roughly, is how it works for the non-technical reader.
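The tipsy traveller can be put in code. The sketch below is my own illustration, not the hardware: walkers descend a double-well energy landscape where the ditch at x ≈ +1 is shallow and the “pub” at x ≈ −1 is the true minimum. Without noise every walker gets stuck; with a random breeze that slowly dies down (simulated annealing), most find the deeper valley.

```python
import numpy as np

# Toy moor: E(x) = (x**2 - 1)**2 + 0.3*x has a shallow ditch near
# x = +1 and a deeper "pub" valley near x = -1.
def grad(x):
    return 4 * x * (x**2 - 1) + 0.3        # slope of the landscape

def settle(noise0, walkers=20, steps=20000, dt=0.01, seed=1):
    """Let walkers roll downhill while a fading random breeze
    (scaled by noise0) occasionally nudges them out of ditches."""
    rng = np.random.default_rng(seed)
    x = np.ones(walkers)                   # everyone starts in the ditch
    for k in range(steps):
        breeze = noise0 * (1 - k / steps)  # the breeze slowly dies down
        x += -grad(x) * dt + breeze * np.sqrt(dt) * rng.standard_normal(walkers)
    return x

print(np.mean(settle(noise0=0.0) < 0))  # 0.0: no noise, all stuck at +1
print(np.mean(settle(noise0=1.5) < 0))  # most walkers reach the deep valley
```

The nudges are doing real computational work here: they are the only thing separating “stuck in the first comfy ditch” from “asleep by the pub fire”.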
From Chaotic Motion to Real Results
Once the system reaches equilibrium (the comfy pub):
- The phases or voltages of components represent the solution.
- Their covariance encodes the inverse of a matrix.
From this, you can train AI models, sample from distributions, or even solve differential equations, all by watching the system settle.
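The “covariance encodes the inverse” trick deserves a concrete sketch. A caveat: what follows is a software simulation of an idealised noisy circuit (an Ornstein–Uhlenbeck process), my own illustration rather than the SPU’s published equations. A state driven by dx = −A x dt + √2 dW settles into a Gaussian whose covariance is exactly A⁻¹, so simply averaging x xᵀ at equilibrium reads out the matrix inverse:

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[2.0, 0.5],
              [0.5, 1.0]])          # symmetric, positive definite

dt, burn, steps = 0.01, 10_000, 400_000
x = np.zeros(2)
acc = np.zeros((2, 2))
for k in range(burn + steps):
    # noisy relaxation: drift pulls x toward 0, noise keeps it jiggling
    x += -A @ x * dt + np.sqrt(2 * dt) * rng.standard_normal(2)
    if k >= burn:
        acc += np.outer(x, x)       # accumulate x x^T at equilibrium

A_inv_est = acc / steps
print(A_inv_est)                    # ≈ np.linalg.inv(A)
```

No inversion algorithm ever runs: the inverse simply appears in the statistics of the fluctuations, which is what lets a physical device “compute” it by being watched.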
How Does It Compare? Ising Machines, Quantum Annealing, and the Classical Canon
Thermodynamic computing is not alone in this quest. It shares ancestry and aspirations with both Ising machines and quantum annealers, pursued by companies like D-Wave and Fujitsu, but diverges in crucial ways.
Real Hardware - The Stochastic Processing Unit (SPU)
This isn’t just theory. A team of researchers recently built a working prototype: the Stochastic Processing Unit (SPU).
It consists of just eight RLC circuits, coupled together on a simple printed circuit board. No exotic materials. No need for cryogenic cooling or rare helium-3. Just elegant circuitry and thoughtful design.
This little marvel can:
- Sample from Gaussian distributions - the smooth, bell-shaped curves that show how values cluster around an average, with fewer values appearing further out.
- Invert small matrices - crucial because matrix inversion lets us solve systems of equations, a core operation in AI, physics, and data modelling.
- Execute key linear algebra operations - the backbone of model training, data transformation, and prediction in machine learning.
And all of this is achieved using physical fluctuations in charge and voltage, not binary code.
It’s a quiet revolution, happening not in data centres, but in the hum of electrical components finding harmony.
Why It Matters - and What Still Needs Solving
The Advantages
- Low Power: Operates at near-thermal limits. Minimal energy, maximal intelligence.
- Speed: Continuous-time evolution; no step-by-step instruction set required.
- Resilience: Tolerates noise, even thrives on it.
- Probabilistic Superpower: Perfectly suited for AI tasks that involve uncertainty, distribution sampling, and fuzziness.
The Challenges
- Equilibration Time: The system must be allowed to settle (find the pub) before useful data can be retrieved.
- Precision: As an analogue system, it’s not designed for exact numerical calculations - but that’s often unnecessary in AI workloads.
- Engineering Hurdles: Components like inductors are bulky; scaling will demand design innovation.
- Digital Integration: These systems must eventually plug into our digital world.
When Physics Becomes Computation
Thermodynamic computing isn’t here to replace CPUs, GPUs, or even quantum computers, at least probably not.
Like quantum, it’s a new class of processor, not for everything, but perfect for certain AI tasks: matrix operations, probabilistic sampling, energy-based learning, and beyond.
Think of it not as a rival, but as a co-processor for a probabilistic world.
If your business depends on data, optimisation, or AI, this isn’t just scientific novelty. It’s a glimpse of what’s next. Not silicon logic, but thermodynamic intelligence.
A new kind of computer is arriving, and it doesn’t think in bits. It thinks in wiggles, noise, and equilibrium.
Who is working on this? Companies working on annealing are certainly looking into thermodynamic computing, while firms like Normal Computing and Extropic are early-stage innovators building dedicated thermodynamic hardware.
But don’t swallow the marketing: although they are working on chipsets, they are not the inventors. The notion of oscillatory computing goes back decades: in the 1950s, von Neumann and Goto experimented with parametrons, circuits using oscillatory behaviour for logic operations.
Todd Hylton’s 2019 “thermodynamic neural network” model uses fluctuation, dissipation, and adaptation in a thermodynamic setting.
But for AI, the recent papers using coupled oscillatory neural networks (ONNs) to solve linear algebra problems are a breakthrough, with enormous potential impact if we can now move the science from lab to fab.
Thermodynamic AI could outperform quantum in several practical dimensions:
- Ambient-temperature operation (vs. cryogenic quantum systems)
- Massive parallelism using naturally occurring fluctuations
- Intrinsic probabilistic reasoning - just like AI
- Orders-of-magnitude lower energy per operation
In effect, computation becomes a self-organising physical process - more akin to biology than mathematics.
Sources:
- Tsormpatzoglou et al., Thermodynamics-Inspired Computing with Oscillatory Neural Networks for Inverse Matrix Computation, arXiv:2507.22544v1, 2025.
- Melanson et al., Thermodynamic Computing System for AI Applications, Nature Communications, 2025. https://doi.org/10.1038/s41467-025-59011-x
Research
- DARPA Thermodynamic Computing (TC) – phase II results expected 2026
- University of Tokyo & RIKEN – spintronics-based Ising solvers 2024
- MemComputing Inc. – deterministic chaos computing prototypes
- IBM & Stanford – neuromorphic “liquid state” thermodynamic circuits
- Oxford / Cambridge – biological thermodynamics applied to active inference AI