Molecular Dynamics

In the statistical mechanics chapter, we introduced the Boltzmann distribution and showed how the partition function encodes all equilibrium thermodynamics. In the MCMC chapter, we saw that Hamiltonian Monte Carlo uses leapfrog integration to propose distant states with high acceptance probability — and noted that those trajectories look suspiciously like a physics simulation. In this chapter, we make that connection precise.

Molecular dynamics (MD) is a deterministic method for simulating the time evolution of a physical system. Rather than proposing random moves and accepting or rejecting them, MD directly integrates Newton's equations of motion. This produces a trajectory through phase space — the system evolves in time, and thermodynamic observables emerge as time averages along the trajectory.

We will see how the Hamiltonian formalism from the statistical mechanics chapter translates into equations of motion, how to integrate them numerically while preserving the structure of phase space, and how coupling the system to a heat bath (thermostatting) produces samples from the Boltzmann distribution.

Hamilton's equations of motion

Recall from the statistical mechanics chapter that the state of a classical system is described by positions $q$ and momenta $p$, and the dynamics are governed by the Hamiltonian

$$H(q, p) = \sum_i \frac{p_i^2}{2 m_i} + U(q).$$

Hamilton's equations give the time evolution of the system:

$$\dot{q}_i = \frac{\partial H}{\partial p_i} = \frac{p_i}{m_i}, \qquad \dot{p}_i = -\frac{\partial H}{\partial q_i} = -\frac{\partial U}{\partial q_i}.$$

The first equation says that velocity is momentum divided by mass. The second is Newton's second law: the force on each particle equals the negative gradient of the potential energy. Together, they define a flow on phase space — given an initial condition $(q_0, p_0)$, the system follows a unique trajectory.

Key property: Hamiltonian dynamics conserves the total energy $H$ and preserves the phase-space volume (Liouville's theorem). These properties are crucial — they ensure that the dynamics does not artificially compress or expand regions of phase space, and that the trajectory stays on a surface of constant energy.

Numerical integration: the velocity-Verlet algorithm

In practice, Hamilton's equations cannot be solved analytically for interesting systems. We discretize time into steps of size $\Delta t$ and use a numerical integrator to advance the state. The standard choice in MD is the velocity-Verlet algorithm:

The velocity-Verlet algorithm:

Given positions $x_n$, velocities $v_n$, and forces $F_n = F(x_n) = -\nabla U(x_n)$:

  1. Half-step velocity update: $v_{n+1/2} = v_n + \frac{\Delta t}{2m} F_n$
  2. Full position update: $x_{n+1} = x_n + \Delta t \, v_{n+1/2}$
  3. Compute new forces: $F_{n+1} = F(x_{n+1})$
  4. Complete velocity update: $v_{n+1} = v_{n+1/2} + \frac{\Delta t}{2m} F_{n+1}$

This is the same leapfrog integrator we encountered in HMC, just written in a different form. It has a crucial property: it is symplectic, meaning it exactly preserves the phase-space volume at each step. While it does not conserve $H$ exactly, the energy error remains bounded and oscillates around zero — there is no systematic energy drift, even over very long simulations.
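The four update steps above map almost line-for-line onto code. Here is a minimal self-contained sketch in Python — our own illustration, not the chapter's interactive implementation — applied to a harmonic oscillator with $U(x) = \frac{1}{2} k x^2$:

```python
import numpy as np

def velocity_verlet(x, v, force, dt, n_steps, m=1.0):
    """Integrate Newton's equations with velocity-Verlet.

    `force(x)` must return -dU/dx.  Returns an (n_steps + 1, 2) array of (x, v).
    """
    traj = [(x, v)]
    f = force(x)
    for _ in range(n_steps):
        v = v + 0.5 * dt / m * f   # 1. half-step velocity update
        x = x + dt * v             # 2. full position update
        f = force(x)               # 3. compute new forces
        v = v + 0.5 * dt / m * f   # 4. complete the velocity update
        traj.append((x, v))
    return np.array(traj)

# Harmonic oscillator: U(x) = 0.5 k x^2, so the force is F(x) = -k x.
k = 1.0
traj = velocity_verlet(x=1.0, v=0.0, force=lambda x: -k * x, dt=0.05, n_steps=2000)
energy = 0.5 * traj[:, 1] ** 2 + 0.5 * k * traj[:, 0] ** 2
print(energy.max() - energy.min())  # the energy error stays bounded: no drift
```

Over 2000 steps the total energy oscillates within a narrow band but never drifts away — the signature of a symplectic integrator.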

NVE molecular dynamics

When we integrate Hamilton's equations without any external coupling, the system evolves at approximately constant energy. This is the microcanonical (NVE) ensemble — fixed particle number $N$, volume $V$, and energy $E$.

Click anywhere on the energy landscape below to launch an NVE trajectory. The particle starts at the clicked position with random initial velocities. Watch how it rolls downhill, accelerates through minima, and decelerates as it climbs toward higher-energy regions — exactly like a ball on a surface.

Energy conservation

The hallmark of a good NVE integrator is that it conserves total energy. The plot below shows the potential energy (blue), kinetic energy (orange), and total energy (green) along the trajectory you just launched. Click on the landscape above to update.

Notice how kinetic and potential energy oscillate as the particle moves through minima and over barriers — energy is constantly exchanged between the two. But the total energy remains nearly constant, fluctuating only within a narrow band set by the integrator error. This is the symplecticity of velocity-Verlet at work.

Try increasing the time step $\Delta t$ above. At some point the energy error becomes large and the trajectory becomes unstable — the integrator "explodes". This critical time step depends on the curvature of the potential: steep, narrow wells require a smaller $\Delta t$ for stability.
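The blow-up is easy to reproduce in a few lines. For a harmonic well, velocity-Verlet is stable only for $\Delta t < 2/\omega$ with $\omega = \sqrt{k/m}$; the sketch below (our own illustration) measures the peak energy error on either side of that threshold:

```python
def max_energy_error(dt, k=1.0, m=1.0, n_steps=100):
    """Velocity-Verlet on U(x) = 0.5 k x^2; return the largest |E(t) - E(0)|."""
    x, v = 1.0, 0.0
    f = -k * x
    e0 = 0.5 * m * v**2 + 0.5 * k * x**2
    worst = 0.0
    for _ in range(n_steps):
        v += 0.5 * dt / m * f
        x += dt * v
        f = -k * x
        v += 0.5 * dt / m * f
        e = 0.5 * m * v**2 + 0.5 * k * x**2
        worst = max(worst, abs(e - e0))
    return worst

print(max_energy_error(0.1))  # well below the threshold: tiny, bounded error
print(max_energy_error(2.1))  # just past dt = 2/omega: the integrator explodes
```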

The microcanonical ensemble and the sampling problem

A long NVE trajectory samples the microcanonical ensemble — the system visits all states compatible with its total energy $E$. This is useful in some contexts, but it is not the same as sampling the canonical (NVT) Boltzmann distribution

$$p(x) = \frac{1}{Z} e^{-U(x)/k_B T}$$

that we introduced in the statistical mechanics chapter. In the canonical ensemble, the system can exchange energy with a heat bath, and the probability of visiting a configuration depends on its energy and the temperature. To sample the Boltzmann distribution with MD, we need to couple the system to a thermostat.

The Andersen thermostat

The simplest thermostat is the Andersen thermostat. To understand why it works, recall from the statistical mechanics chapter that the full canonical distribution on phase space factorizes:

$$p(x, v) = \frac{1}{Z} e^{-H(x, v)/k_B T} = \frac{1}{Z_x} e^{-U(x)/k_B T} \cdot \frac{1}{Z_v} e^{-K(v)/k_B T}.$$

The momentum part is a product of independent Gaussians — the Maxwell-Boltzmann distribution:

$$p(v) = \prod_{i=1}^{d} \sqrt{\frac{m}{2\pi k_B T}} \, e^{-m v_i^2 / 2 k_B T},$$

where $d$ is the number of degrees of freedom. This distribution has a well-known consequence — the equipartition theorem: each quadratic degree of freedom carries an average kinetic energy of $\frac{1}{2} k_B T$, so

$$\langle K \rangle = \sum_{i=1}^{d} \left\langle \frac{1}{2} m v_i^2 \right\rangle = \frac{d}{2} k_B T.$$
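The factorization means we can test equipartition without any dynamics at all: draw velocities directly from the Maxwell-Boltzmann distribution and average the kinetic energy. (The units and parameter values below are our own choices for illustration.)

```python
import numpy as np

rng = np.random.default_rng(0)

kT = 0.5       # k_B T in reduced units
m = 1.0        # particle mass
d = 3          # number of degrees of freedom
n = 100_000    # number of independent samples

# Each velocity component is Gaussian with zero mean and variance k_B T / m.
v = rng.normal(0.0, np.sqrt(kT / m), size=(n, d))

# Equipartition predicts <K> = (d/2) k_B T = 0.75 in these units.
mean_K = np.mean(0.5 * m * np.sum(v**2, axis=1))
print(mean_K)  # ≈ 0.75
```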

The Andersen thermostat exploits this factorization. At random times, it interrupts the deterministic Hamiltonian dynamics and replaces the particle's velocity with a fresh draw from the Maxwell-Boltzmann distribution at temperature $T$. Between collisions, the dynamics is purely Hamiltonian and preserves the energy surface; at each collision, the kinetic energy is reset to a value drawn from the correct thermal distribution.

The Andersen thermostat:

At each time step, with probability $\nu \, \Delta t$:

  1. Replace all velocity components with fresh samples from the Maxwell-Boltzmann distribution, $v_i \sim \mathcal{N}(0, k_B T / m)$,

where $\nu$ is the collision frequency controlling how often the thermostat acts.

Why does this produce the correct canonical distribution? The argument has two parts. First, Hamiltonian dynamics preserves the canonical distribution — if $(x, v)$ is distributed according to $p(x, v) \propto e^{-H(x, v)/k_B T}$, then so is the time-evolved state, because $H$ is conserved and the flow preserves phase-space volume. Second, the velocity resampling step independently draws momenta from the Maxwell-Boltzmann distribution, which is the correct marginal. Since both steps leave the canonical distribution invariant, their alternation does too — and the stochastic collisions ensure ergodicity by allowing the system to jump between different energy surfaces.

The collision frequency $\nu$ interpolates between two limits: as $\nu \to 0$ the collisions vanish and we recover deterministic NVE dynamics; as $\nu \to \infty$ the velocities are randomized so often that the trajectory degenerates into a random walk with no coherent dynamics.

In practice, $\nu$ is chosen to be large enough that the system equilibrates at the target temperature, but small enough that the trajectory retains meaningful dynamical information between collisions.
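An Andersen-thermostatted MD loop is just velocity-Verlet with an occasional velocity redraw. Here is a sketch for a single particle in a harmonic well, with toy parameters of our own choosing — canonical sampling predicts $\langle x^2 \rangle = k_B T / k$:

```python
import numpy as np

rng = np.random.default_rng(1)

def andersen_md(n_steps, dt=0.05, nu=2.0, kT=0.5, k=1.0, m=1.0):
    """Velocity-Verlet on U(x) = 0.5 k x^2 plus Andersen collisions."""
    x, v = 0.0, 0.0
    f = -k * x
    xs = np.empty(n_steps)
    for i in range(n_steps):
        # Deterministic velocity-Verlet step.
        v += 0.5 * dt / m * f
        x += dt * v
        f = -k * x
        v += 0.5 * dt / m * f
        # Collision with probability nu * dt: redraw v from Maxwell-Boltzmann.
        if rng.random() < nu * dt:
            v = rng.normal(0.0, np.sqrt(kT / m))
        xs[i] = x
    return xs

xs = andersen_md(200_000)
print(np.var(xs))  # ≈ kT / k = 0.5, the canonical prediction
```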

NVT molecular dynamics

Now let us see the thermostat in action. The visualization below runs NVT MD with the Andersen thermostat. Click on the landscape to start a trajectory — notice how the particle's behavior changes qualitatively compared to NVE. The occasional velocity resampling events allow the particle to gain or lose energy, enabling it to cross barriers and explore the full landscape at temperature $T$.

Energy fluctuations under the thermostat

Unlike NVE, the total energy now fluctuates — each thermostat collision injects or removes kinetic energy. The plot below shows the energy traces for the trajectory you just launched. Click on the landscape above to update.

Compare this to the NVE trace above: the total energy (green) now wanders freely, driven by the thermostat collisions. The potential energy distribution reflects the Boltzmann weight at the target temperature — at low $T$, the system stays near the minima; at high $T$, it explores broadly.

Try adjusting the collision rate $\nu$. At $\nu = 0$, this reduces to NVE dynamics and the total energy is conserved. As $\nu$ increases, the energy fluctuations become more rapid and the trajectory thermalizes faster.

From trajectories to the Boltzmann distribution

The fundamental promise of NVT MD is the ergodic hypothesis: time averages along a sufficiently long trajectory equal ensemble averages over the Boltzmann distribution. In the language of the statistical mechanics chapter, this means

$$\lim_{\tau \to \infty} \frac{1}{\tau} \int_0^{\tau} A(x(t)) \, dt = \langle A \rangle = \frac{1}{Z} \int A(x) \, e^{-U(x)/k_B T} \, dx$$

for any observable $A$. The left side is what MD computes; the right side is the statistical-mechanical expectation we derived from the partition function.
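We can check this identity numerically for a one-dimensional double well: run an Andersen-thermostatted trajectory, accumulate the time average of $x^2$, and compare against the ensemble average computed by direct quadrature over the Boltzmann weight. The potential and all parameters below are our own illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(2)
kT = 1.0

def U(x):
    return (x**2 - 1.0)**2          # double well with minima at x = ±1

def F(x):
    return -4.0 * x * (x**2 - 1.0)  # force = -dU/dx

# Time average of x^2 along an Andersen-thermostatted trajectory.
dt, nu, n_steps = 0.02, 5.0, 400_000
x, v, f = 0.0, 0.0, F(0.0)
acc = 0.0
for _ in range(n_steps):
    v += 0.5 * dt * f
    x += dt * v
    f = F(x)
    v += 0.5 * dt * f
    if rng.random() < nu * dt:
        v = rng.normal(0.0, np.sqrt(kT))
    acc += x**2
time_avg = acc / n_steps

# Ensemble average <x^2> by quadrature over the Boltzmann weight.
grid = np.linspace(-3.0, 3.0, 4001)
w = np.exp(-U(grid) / kT)
ensemble_avg = float(np.sum(grid**2 * w) / np.sum(w))

print(time_avg, ensemble_avg)  # the two averages should agree closely
```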

Let us verify this directly. The plot below compares two things: the analytical Boltzmann distribution $p(x) \propto e^{-U(x)/k_B T}$ (as contour fill), and the positions visited by a long NVT MD trajectory (as scatter points). If the thermostat is working correctly, the sampled points should be dense where $p(x)$ is large and sparse where it is small.

The left panel shows the analytical Boltzmann distribution $p(x)$, and the right panel shows the density of positions visited by the MD trajectory. At low temperature, both concentrate around the global minimum. As you increase $T$, both spread out — and the MD samples faithfully reproduce the analytical distribution. This is the ergodic hypothesis in action.

NPT molecular dynamics: the barostat

So far we have sampled the canonical (NVT) ensemble at fixed volume. In many applications — particularly in materials science — we want to control pressure instead. This is the isothermal-isobaric (NPT) ensemble from the statistical mechanics chapter, where the volume becomes a dynamical degree of freedom.

Making volume dynamical

In the NVT ensemble, we sample configurations weighted by the Boltzmann factor $e^{-U(x)/k_B T}$. In the NPT ensemble, the volume $V$ joins the configuration space, and the relevant energy becomes the enthalpy

$$\mathcal{H}(x, V) = U(x) + P V.$$

The $PV$ term penalizes large volumes — at high pressure, the system is pushed toward compact configurations. The NPT partition function is

$$Z_{NPT} = \int dV \int dx \; e^{-(U(x) + P V)/k_B T},$$

and the corresponding probability density is $p(x, V) \propto e^{-(U(x) + P V)/k_B T}$.

To sample this distribution with MD, we need a barostat — a mechanism that couples the volume to a pressure bath, just as the thermostat couples velocities to a heat bath. The simplest approach, introduced by Andersen (1980), treats the volume as an additional degree of freedom with its own conjugate momentum $p_V$ and fictitious mass $M_V$. The extended Hamiltonian becomes

$$H_{\mathrm{ext}} = \sum_i \frac{p_i^2}{2 m_i} + U(x; V) + \frac{p_V^2}{2 M_V} + P V,$$

and Hamilton's equations now include an equation of motion for the volume:

$$\dot{V} = \frac{p_V}{M_V}, \qquad \dot{p}_V = P_{\mathrm{int}} - P,$$

where $P_{\mathrm{int}}$ is the internal (virial) pressure of the system. The volume expands when the internal pressure exceeds the external pressure, and contracts when it falls below — driving the system toward mechanical equilibrium.
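To see the volume equation of motion in action, consider the simplest possible system — an ideal gas, whose internal pressure is $P_{\mathrm{int}} = N k_B T / V$. The sketch below is our own illustration: the friction term $\gamma$ is not part of the Hamiltonian barostat, but a crude stand-in for thermostat coupling so that the piston settles instead of oscillating forever.

```python
# Andersen-style volume dynamics for an ideal gas (illustrative parameters).
N, kT, P_ext = 100, 1.0, 2.0
M_V = 50.0     # fictitious piston mass
gamma = 0.05   # damping on the piston momentum (our addition, see above)
dt = 0.01

V, p_V = 20.0, 0.0
for _ in range(50_000):
    P_int = N * kT / V                            # ideal-gas internal pressure
    p_V += dt * ((P_int - P_ext) - gamma * p_V)   # dp_V/dt = P_int - P_ext
    V += dt * p_V / M_V                           # dV/dt   = p_V / M_V

print(V)  # settles near mechanical equilibrium, V = N kT / P_ext = 50
```

Starting from a compressed state, the piston overshoots, oscillates, and relaxes toward the volume at which the internal and external pressures balance.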

The toy model revisited

For our toy model — two particles in a periodic box with interparticle distance $r$ and box length $L$ (playing the role of volume) — the enthalpy is simply

$$\mathcal{H}(r, L) = U(r) + P L.$$

Since $L$ is already a configuration coordinate, we can sample the NPT ensemble by running thermostatted MD on the enthalpy surface directly — the volume dynamics emerge naturally. Click on the enthalpy landscape below to launch an NPT trajectory. Try different pressures and observe how the equilibrium shifts between expanded and compact configurations.

NPT energy traces

The enthalpy plays the same role in the NPT ensemble that the potential energy $U$ plays in NVT. The plot below shows the potential energy, the $PL$ contribution, and the total enthalpy along the trajectory. Click on the landscape above to update.

At low pressure, the system explores large volumes (large $L$) where the $PL$ cost is small. Increasing the pressure tilts the enthalpy surface, confining the trajectory to compact configurations. The thermostat ensures that the trajectory samples the correct NPT Boltzmann distribution at each pressure.

Sampling the NPT Boltzmann distribution

Just as in the NVT case, we can verify that the thermostatted NPT trajectory produces the correct distribution. The panels below compare the analytical NPT Boltzmann density with the density of MD samples on the enthalpy surface.

Connection to Hamiltonian Monte Carlo

If you have read the MCMC chapter, you will have noticed a striking similarity between MD and HMC. This is no coincidence — HMC is MD, wrapped in a Metropolis accept/reject step.

The key differences are:

                 Molecular dynamics           HMC
  Momenta        Evolve continuously          Resampled fresh at each proposal
  Trajectory     One long continuous run      Short segments ($L$ leapfrog steps)
  Acceptance     None (deterministic)         Metropolis criterion on $\Delta H$
  Energy errors  Bounded (symplectic)         Corrected by accept/reject
  Ensemble       NVE (without thermostat)     Canonical (by construction)

In HMC, the accept/reject step corrects for the finite-$\Delta t$ integration error, guaranteeing exact sampling from the target distribution even with an imperfect integrator. MD with a thermostat achieves the same goal differently — the stochastic velocity resampling ensures ergodicity and the correct temperature, while the deterministic dynamics between collisions provides efficient exploration.

The common thread: Both MD and HMC exploit the geometry of the energy landscape through Hamiltonian dynamics. Rather than proposing random jumps (as in Metropolis-Hastings), they follow the natural flow of the system — accelerating downhill, decelerating uphill — to generate distant, low-energy configurations efficiently. This geometric awareness is what makes gradient-based sampling methods essential for high-dimensional systems.

In the context of nested sampling, the connection runs even deeper. The TE-HMC variant caps the kinetic energy to enforce a hard constraint on the total energy. This is essentially constrained NVE dynamics — the system evolves on a surface of constant total energy, exploring all configurations accessible below the energy threshold. We will see in the nested sampling primer how this enables efficient likelihood-constrained prior sampling.