Built for equilibrium, tasked with steady-state: How the theoretical foundations of molecular dynamics hold up
Molecular dynamics (MD) simulations are traditionally built on the ergodic hypothesis and the principles of equilibrium statistical mechanics. However, many real-world processes—such as water flowing through a membrane—occur under steady-state, non-equilibrium conditions. This raises a key question: how can MD, originally developed for equilibrium systems, yield reliable predictions for non-equilibrium steady states?
Let’s begin by reviewing the core principles of MD at equilibrium, and then examine how these adapt to non-equilibrium scenarios.
Equations of Motion: The MD engine
MD uses Newton’s laws of motion to simulate how atoms and molecules move over time:
\[F = ma \quad \text{or} \quad \frac{d\mathbf{q}}{dt} = \frac{\partial H}{\partial \mathbf{p}}, \quad \frac{d\mathbf{p}}{dt} = -\frac{\partial H}{\partial \mathbf{q}}\]Here, \(\mathbf{q}\) and \(\mathbf{p}\) are positions and momenta, and \(H\) is the Hamiltonian — the total energy of the system.
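To make this concrete, here is a minimal sketch (a single particle in a 1D harmonic potential, with illustrative parameters rather than any particular MD engine) of the quantities Hamilton’s equations ask us to evaluate at every timestep:

```python
# Minimal sketch: Hamilton's equations for a 1D harmonic oscillator,
# H(q, p) = p^2 / (2m) + k q^2 / 2.  Parameters are illustrative only.
m, k = 1.0, 1.0

def hamiltonian(q, p):
    return p**2 / (2 * m) + 0.5 * k * q**2

def dq_dt(q, p):
    return p / m       # dq/dt =  dH/dp  (velocity)

def dp_dt(q, p):
    return -k * q      # dp/dt = -dH/dq  (force)
```

An MD engine does nothing more exotic than repeatedly evaluating these derivatives for a far more complicated \(H\) and stepping the system forward in time.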
This microscopic, deterministic evolution fully specifies the positions and momenta of every atom at each instant, effectively tracing the system’s trajectory through phase space. Each frame in the trajectory corresponds to a microstate—a complete point in the system’s high-dimensional phase space. Macroscopic properties such as temperature, pressure, and diffusion coefficients are computed by applying statistical mechanical averaging schemes to these microstates. For statistical mechanics to supply reliable predictions, the way systems explore their phase space must adhere to certain principles.
Equilibrium: The key to reliable predictions
Molecular dynamics simulations rely on the system reaching equilibrium — a balanced state where the material’s overall behavior becomes stable enough to predict its properties accurately. While molecules continue to move and chemical reactions may still occur, there is no net change in the macroscopic state variables. For example, in a chemical reaction at equilibrium, the forward and reverse reaction rates are equal. Although energy and particles constantly fluctuate at the microscopic level, there are no net macroscopic flows.
This equilibrium state is crucial because it allows us to apply statistical mechanics principles, which provide a framework for understanding how microscopic interactions lead to macroscopic properties.
The Boltzmann Distribution: Probability at equilibrium
At equilibrium, the system’s microscopic states are distributed according to well-defined statistical laws — laws that remain constant over time — resulting in a stable probability distribution over the possible configurations (microstates) explored by the system. The probability of finding the system in any microscopic state at equilibrium is set by the Boltzmann distribution: \(P(\mathbf{q}, \mathbf{p}) = \frac{1}{Z} e^{-\beta H(\mathbf{q}, \mathbf{p})}\)
where \(Z\) is the partition function, \(\beta = \frac{1}{k_B T}\), \(k_B\) is Boltzmann’s constant, \(T\) is the absolute temperature, and \(H(\mathbf{q}, \mathbf{p})\) is the Hamiltonian of the system.
The key point here is that while individual molecules constantly move and interact, the overall statistical “recipe” (Boltzmann distribution) describing the system’s state remains unchanged. This fundamental property allows us to compute (constant) macroscopic properties like temperature, pressure, and chemical potential from (varying) microstates.
We should also note that the Boltzmann distribution isn’t an arbitrary mathematical convenience but emerges from fundamental principles of statistical mechanics. When a system is in contact with a heat bath at temperature T, the probability of finding it in any microstate with energy E is given by the Boltzmann factor, \(e^{-E/k_{B}T}\). This reflects the balance between energy and entropy: lower energy states are more probable, but higher energy states are also accessible due to thermal fluctuations.
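As a back-of-the-envelope illustration (the energy gap below is made up), the Boltzmann factor directly gives the relative population of two microstates:

```python
import numpy as np

kB = 1.380649e-23          # Boltzmann constant, J/K
T = 300.0                  # temperature, K
dE = 2.0 * kB * T          # hypothetical energy gap between two microstates

# Relative probability of occupying the higher-energy state
ratio = np.exp(-dE / (kB * T))
print(f"P(high) / P(low) = {ratio:.3f}")   # ~0.135: rarer, but still thermally accessible
```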
Liouville’s Theorem: Phase-space preservation
How does the probability distribution over microscopic states remain stable as the system evolves dynamically? What if the dynamics (coming from numerical integration) somehow bias certain states or distort the probabilities we’re supposed to sample?
To understand this concern, imagine an ensemble of many molecular dynamics (MD) simulations run simultaneously, each differing slightly in the initial atomic positions or velocities. Together, these systems form a “cloud” of points in phase space (the multidimensional space of all possible atomic positions and momenta) that moves as the simulation progresses.
Liouville’s theorem states that the phase-space volume occupied by these states remains constant as they evolve:
\[\frac{d\rho}{dt} = 0 \quad \text{along a trajectory}\]where \(\rho(\mathbf{q}, \mathbf{p}, t)\) is the probability density in phase space.
A helpful analogy is kneading dough: you can stretch, fold, or twist it, but its total volume doesn’t change. Similarly, the “cloud” of points representing system states may deform in shape, but its overall volume in phase space is conserved.
This preservation is crucial because molecular dynamics simulations fundamentally depend on maintaining the correct statistical distribution over microscopic states. The entire reliability of MD predictions rests on the assumption that our simulation samples from the same probability distribution that governs the real physical system at equilibrium—the Boltzmann distribution.
To see why this matters for actual property calculations, consider how we compute the equilibrium temperature. The ensemble average temperature in equilibrium is defined as:
\[\langle T \rangle_{ensemble} = \int d\mathbf{q} \, d\mathbf{p} \; \rho_{eq}(\mathbf{q}, \mathbf{p}) \, T(\mathbf{p})\]where \(\rho_{eq}\) is the Boltzmann distribution and \(T(\mathbf{p}) = \frac{1}{3Nk_B} \sum_{i=1}^{N} \frac{p_i^2}{m_i}\).
In molecular dynamics, we estimate this using a time average along our trajectory:
\[\langle T \rangle_{time} = \lim_{t \to \infty} \frac{1}{t} \int_0^{t} T(\mathbf{p}(t')) \, dt'\]Why should these two averages be equal? The key insight is that temperature is a weighted average of kinetic energies, where the weights come from how much time the simulation spends in different momentum states. Without Liouville’s theorem, the probability density \(\rho(\mathbf{q}, \mathbf{p}, t)\) describing our system’s statistical state could evolve arbitrarily during the simulation. If the dynamics somehow caused probability inflation in high-momentum regions (more time spent there than equilibrium dictates), the time average would overweight high-energy configurations, yielding \(\langle T \rangle_{time} > \langle T \rangle_{ensemble}\). Conversely, probability depletion in high-energy regions would systematically underestimate temperature.
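In code, the time-average estimator is nothing more than a running mean of the instantaneous kinetic temperature over the sampled frames. Here is a sketch, assuming the per-frame momenta and masses are available as NumPy arrays (array names are illustrative):

```python
import numpy as np

kB = 1.380649e-23  # Boltzmann constant, J/K

def instantaneous_temperature(p, masses):
    """T(p) = (1 / (3 N kB)) * sum_i |p_i|^2 / m_i for one frame.

    p      : (N, 3) array of momenta for the N atoms in this frame
    masses : (N,)   array of atomic masses
    """
    N = len(masses)
    kinetic = np.sum(np.sum(p**2, axis=1) / masses)
    return kinetic / (3 * N * kB)

def time_average_temperature(p_frames, masses):
    """MD estimate of <T>: average the instantaneous value over all frames."""
    return np.mean([instantaneous_temperature(p, masses) for p in p_frames])
```

Whether this estimator converges to the ensemble average is precisely what Liouville’s theorem and ergodicity guarantee.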
Liouville’s theorem prevents this corruption by ensuring that probability density is conserved as it flows through phase space—like an incompressible fluid. If a region initially contains a certain “amount” of probability, it will retain exactly that amount as it evolves, even as its shape changes dramatically. This means that if your simulation starts with the correct equilibrium distribution, the statistical weights that determine ensemble averages are preserved during time evolution. The dynamics cannot create artificial concentrations or depletions of probability that would bias your computed properties.
Now, how do MD engines ensure this preservation? They use symplectic integrators (like velocity-Verlet or leapfrog algorithms) that exactly preserve the phase-space volume at each timestep. These integrators are specifically designed to maintain the geometric structure of Hamiltonian dynamics, ensuring that \(\frac{d\rho}{dt} = 0\) is satisfied numerically. This guarantees that the time average faithfully represents the equilibrium ensemble average…provided that the phase space is explored sufficiently.
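To see the difference this makes, the sketch below (a 1D harmonic oscillator with illustrative parameters) evolves a small ring of phase-space points with velocity-Verlet and with a naive explicit Euler scheme, then compares the enclosed area before and after: the symplectic integrator keeps it essentially constant, while the non-symplectic one inflates it.

```python
import numpy as np

# Harmonic oscillator, H = p^2/(2m) + k q^2/2; parameters are illustrative.
m, k, dt, nsteps = 1.0, 1.0, 0.05, 2000

def force(q):
    return -k * q

def verlet_step(q, p):
    # velocity-Verlet: symplectic, preserves phase-space volume
    p_half = p + 0.5 * dt * force(q)
    q_new = q + dt * p_half / m
    return q_new, p_half + 0.5 * dt * force(q_new)

def euler_step(q, p):
    # explicit Euler: not symplectic, phase-space volume drifts
    return q + dt * p / m, p + dt * force(q)

def ring_area(q, p):
    # shoelace formula for the area enclosed by the ring of points (q_i, p_i)
    return 0.5 * abs(np.dot(q, np.roll(p, -1)) - np.dot(p, np.roll(q, -1)))

theta = np.linspace(0, 2 * np.pi, 400, endpoint=False)
q_v, p_v = 1.0 + 0.1 * np.cos(theta), 0.1 * np.sin(theta)   # small ring around (1, 0)
q_e, p_e = q_v.copy(), p_v.copy()
area0 = ring_area(q_v, p_v)

for _ in range(nsteps):
    q_v, p_v = verlet_step(q_v, p_v)
    q_e, p_e = euler_step(q_e, p_e)

print("velocity-Verlet area ratio:", round(ring_area(q_v, p_v) / area0, 4))  # stays ~1
print("explicit Euler area ratio: ", round(ring_area(q_e, p_e) / area0, 4))  # grows far above 1
```

The ring of states deforms in both cases; only the symplectic scheme keeps its phase-space “volume” fixed, which is exactly the property Liouville’s theorem demands.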
The Ergodic Hypothesis and chaos: One trajectory is enough
How do you ensure one simulation samples sufficiently widely, instead of getting stuck? The ergodic hypothesis provides the answer:
- If you simulate a system long enough, it will visit all the relevant parts of its phase space.
- The time average of a quantity (like energy or density) becomes equal to the ensemble average over many systems: \(\lim_{t \to \infty} \frac{1}{t} \int_0^{t} A(\mathbf{q}(t'), \mathbf{p}(t')) \, dt' = \langle A \rangle_{ensemble}\) for an observable \(A\).
Liouville’s theorem guarantees that the phase-space volume is preserved along trajectories, but to extract meaningful averages, the system’s trajectory must explore phase space thoroughly. This exploration comes from ergodicity, enabled by chaotic dynamics.
One way to quantify chaos is through the Lyapunov exponent:
\[\lambda = \lim_{t \to \infty} \frac{1}{t} \ln\left(\frac{\delta(t)}{\delta_0}\right)\]where \(\delta(t)\) and \(\delta_0\) are the separations between two trajectories at time \(t\) and initially, respectively.
In practice, positive Lyapunov exponents indicate that the system’s dynamics are chaotic, effectively erasing memory of initial conditions over time. This chaotic behavior acts like a mixing spoon, continuously stirring the system so that it explores its accessible phase space thoroughly and efficiently. Such thorough exploration is essential for the reliability of molecular dynamics: it ensures that time averages converge to meaningful, ensemble-representative values.
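A back-of-the-envelope version of this estimate is sketched below. Rather than a full MD system, it uses a simple chaotic toy (the logistic map at r = 4, whose exact Lyapunov exponent is ln 2), together with the standard trick of renormalizing the separation at each step so it never saturates; all parameters are illustrative.

```python
import numpy as np

r, n_steps, delta0 = 4.0, 10_000, 1e-9
x, y = 0.3, 0.3 + delta0    # two trajectories with a tiny initial separation

log_growth = 0.0
for _ in range(n_steps):
    x, y = r * x * (1 - x), r * y * (1 - y)       # advance both trajectories
    delta = abs(y - x)
    log_growth += np.log(delta / delta0)          # accumulate the stretching factor
    y = x + delta0 * (1.0 if y >= x else -1.0)    # renormalize the separation to delta0

lam = log_growth / n_steps
print(f"estimated Lyapunov exponent: {lam:.3f}   (exact for r = 4: ln 2 = {np.log(2):.3f})")
```

The same recipe applies to an MD trajectory: perturb the initial coordinates slightly, rerun, and track how fast the two trajectories diverge.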
Detailed Balance: The signature of equilibrium
Simply sampling all accessible states is not sufficient to guarantee equilibrium. A fundamental microscopic criterion must also be met — the condition of detailed balance. This principle asserts that at equilibrium, every microscopic process is exactly balanced by its reverse process, occurring with equal probability and rate. It is this precise balance of forward and backward transitions that underpins the stationarity of the Boltzmann distribution and the stability of equilibrium.
Mathematically, for any two microscopic states \(i\) and \(j\): \(P_\text{eq}(i) W_{ij} = P_\text{eq}(j) W_{ji}\)
where \(P_\text{eq}\) is the equilibrium probability distribution and \(W_{ij}\) is the transition rate from state \(i\) to state \(j\).
Imagine a molecular “turnstile”: for every particle crossing from left to right, another crosses from right to left, perfectly balancing the books. One nice visualization of detailed balance is available here (Credits to @keenanisalive).
Detailed balance ensures there are no net currents at equilibrium. It’s this invisible equilibrium bookkeeping that makes the time averages converge to the ensemble averages, allowing us to compute stable macroscopic properties like temperature, pressure, and chemical potential.
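A quick numerical check makes the condition tangible. The sketch below uses a hypothetical three-state system with made-up energies and a common equilibrium-compatible choice of rates, \(W_{ij} \propto e^{-\beta (E_j - E_i)/2}\), and verifies that \(P_\text{eq}(i) W_{ij} = P_\text{eq}(j) W_{ji}\) holds for every pair:

```python
import numpy as np

# Hypothetical 3-state system; energies are made up, in units of kB*T (beta = 1).
E = np.array([0.0, 1.0, 2.5])
beta = 1.0

P_eq = np.exp(-beta * E)
P_eq /= P_eq.sum()                                   # Boltzmann distribution

# Rates of the form W_ij proportional to exp(-beta * (E_j - E_i) / 2)
W = np.exp(-0.5 * beta * (E[None, :] - E[:, None]))
np.fill_diagonal(W, 0.0)

# Detailed balance: P_eq(i) * W_ij == P_eq(j) * W_ji for every pair (i, j)
flows = P_eq[:, None] * W
print(np.allclose(flows, flows.T))                   # True: every flow is balanced by its reverse
```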
The Paradox: MD Needs Equilibrium — But We Want to Simulate Flow!
Molecular dynamics (MD) simulations fundamentally rely on equilibrium principles—such as detailed balance and ergodicity—that ensure the microscopic configurations sampled reflect macroscopic thermodynamic behavior reliably.
However, many real-world processes—like reverse osmosis driven by pressure gradients—occur far from equilibrium, in what is known as a steady state. Steady states exhibit macroscopic properties that remain constant over time, yet they often involve continuous net flows of matter or energy: inflows and outflows balance perfectly, so the system neither accumulates nor loses mass or energy overall. Unlike equilibrium, steady states sustain directional fluxes, reflecting microscopic imbalances. For example, in reverse osmosis, external pressure pushes water molecules persistently across the membrane from the concentrated (high-solute) side to the dilute side, against the osmotic gradient. Although the system reaches a steady state—with stable macroscopic properties like density and flux—it is not at equilibrium because of the net flow.
In this context, two key assumptions of equilibrium break down:
- Detailed balance: There are net flows of particles or energy, so forward and reverse processes no longer balance pairwise: \(P(i) \, W_{ij} \neq P(j) \, W_{ji}\) for at least some pairs of states \(i\) and \(j\).
- Boltzmann distribution: The probability distribution the system samples from is no longer the familiar equilibrium form. \(P \neq \frac{1}{Z} e^{-\beta H}\)
Instead, the system settles into a new, steady-state distribution that depends on the ongoing flows and driving forces.
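A toy illustration of such a distribution is sketched below: a three-state cycle with made-up, deliberately asymmetric rates standing in for the external driving. The master-equation dynamics still reach a stationary distribution, but detailed balance fails and a net probability current circulates around the cycle.

```python
import numpy as np

# Toy 3-state cycle driven out of equilibrium; W[i, j] is the (made-up) rate
# from state i to state j, biased in the "forward" direction 0 -> 1 -> 2 -> 0.
W = np.array([
    [0.0, 2.0, 0.5],
    [0.3, 0.0, 1.5],
    [1.0, 0.4, 0.0],
])

# Master equation dP/dt = K @ P; the stationary state is the null vector of K.
K = W.T - np.diag(W.sum(axis=1))
eigvals, eigvecs = np.linalg.eig(K)
P_ss = np.real(eigvecs[:, np.argmin(np.abs(eigvals))])
P_ss /= P_ss.sum()

print("stationary distribution:", np.round(P_ss, 3))                  # time-invariant, but not Boltzmann
print("net current 0 -> 1:", P_ss[0] * W[0, 1] - P_ss[1] * W[1, 0])   # nonzero: detailed balance broken
```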
Despite this, much of the rest of MD’s theoretical foundation survives:
- Liouville’s theorem continues to hold because Newtonian (Hamiltonian) dynamics remain unchanged, so phase-space volume is conserved.
- Lyapunov exponents remain positive, signaling ongoing chaos and ensuring the system does not languish in a narrow subset of states but continues exploring its accessible phase space.
- Ergodicity and mixing can still hold: as long as the system is chaotic and mixing well, time averages computed in simulation remain meaningful, now reflecting the new steady-state distribution rather than the equilibrium one.
How do we reconcile these changes? The key is to shift focus: instead of assuming the system samples from equilibrium (Boltzmann) distributions governed by detailed balance, we embrace the fact that under steady, ongoing driving, the simulation explores a non-equilibrium steady-state (NESS) distribution. This distribution is stationary—its statistical properties do not change over time, even though detailed balance and Boltzmann statistics no longer apply:
\[\frac{\partial P_{\text{NESS}}}{\partial t} = 0\]meaning statistical properties do not change with time, allowing stable, reproducible averages to be computed.
For example, in reverse osmosis, water driven across a membrane no longer obeys detailed balance or the equilibrium distribution: the forward crossing rate of water molecules exceeds the backward rate, creating a persistent flux \(J\).
But this flux \(J\) stabilizes at a constant value over time, marking the system’s attainment of a non-equilibrium steady state. Achieving this steady state signifies that the microscopic dynamics have thoroughly “mixed” the system under the sustained driving conditions, preserving ergodic behavior despite the continuous external forcing. In other words, the system continues to robustly explore its accessible phase space—chaotic dynamics remain active, as reflected by positive Lyapunov exponents—and the probability distribution over states becomes stationary (time-invariant), although it no longer corresponds to equilibrium.
This persistent mixing and ergodicity ensure that time averages of observables, such as flux, converge to stable, physically meaningful values. As a result, even far from equilibrium, one can confidently analyze such steady fluxes to extract fundamental transport properties—like permeability or diffusion coefficients—with the same rigorous reliability rooted in the core theoretical framework of molecular dynamics.
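As a sketch of how such a flux is extracted in practice, the plane-crossing bookkeeping below counts net crossings of the membrane plane and returns a running flux estimate that should plateau once the steady state is reached. The function, its arguments, and the planar-membrane geometry are assumptions for this illustration, not any specific package’s API.

```python
import numpy as np

def running_flux(z_traj, z_plane, area, dt):
    """Running estimate of the net flux J across the plane z = z_plane.

    z_traj : (n_frames, n_molecules) z coordinates of the tracked water
             molecules (assumed not to wrap across the plane via periodic images)
    z_plane: position of the counting plane, e.g. the membrane midplane
    area   : membrane cross-sectional area
    dt     : time between stored frames
    """
    below_now = z_traj[:-1] < z_plane
    below_next = z_traj[1:] < z_plane
    forward = np.sum(below_now & ~below_next, axis=1)    # crossings in +z, per frame
    backward = np.sum(~below_now & below_next, axis=1)   # crossings in -z, per frame
    net = np.cumsum(forward - backward)
    time = dt * np.arange(1, len(net) + 1)
    return net / (area * time)   # plateaus at the steady-state flux J
```

Once this curve flattens, the plateau value is the steady-state flux from which transport coefficients such as permeability follow.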
Conclusion
In summary, while molecular dynamics simulations are built on the foundations of equilibrium statistical mechanics, they can still yield reliable predictions for non-equilibrium steady states. The key lies in recognizing that even when detailed balance and Boltzmann statistics do not apply, the principles of ergodicity, Liouville’s theorem, and chaotic dynamics remain intact. These principles ensure that the system explores its accessible phase space thoroughly, allowing for stable time averages that reflect the macroscopic properties of the non-equilibrium steady state.