Classical statistical mechanics

Statistical mechanics provides the fundamental bridge between microscopic models of matter and macroscopic thermodynamic behavior. In this chapter, we introduce the key concepts of classical statistical mechanics that underpin all sampling methods discussed in this series — including nested sampling. We will see how probability distributions on phase space give rise to thermodynamic ensembles, how partition functions encode all equilibrium thermodynamics, and how response functions signal phase transitions.

Phase space and the Hamiltonian

The microstate of a classical system of $N$ particles is specified by their positions $q = (q_1, \ldots, q_{3N})$ and conjugate momenta $p = (p_1, \ldots, p_{3N})$. The full state space is called phase space, and it is equipped with a natural reference measure, the Liouville measure

$$ d\Gamma = \prod_{i=1}^{3N} dq_i \, dp_i, $$

which is preserved under Hamiltonian time evolution (Liouville's theorem).

The energetics of the system are encoded in the Hamiltonian

$$ H(q, p) = K(p) + V(q) = \sum_{i=1}^{3N} \frac{p_i^2}{2m_i} + V(q), $$

which separates into a kinetic energy $K(p)$ (depending only on momenta) and a potential energy $V(q)$ (depending only on positions). This additive structure is crucial: it means we can often integrate out the momenta analytically and work entirely in configuration space, the space of positions alone.

The Boltzmann distribution

The central object of equilibrium statistical mechanics is the probability density on phase space. For a system in thermal contact with a heat bath at temperature $T$, the probability of finding the system in the microstate $(q, p)$ is given by the Boltzmann distribution

$$ p(q, p) = \frac{e^{-\beta H(q, p)}}{Z(\beta)}, $$

where $\beta = 1/T$ is the inverse temperature (in natural units where $k_B = 1$). This is the canonical ensemble.

The Boltzmann factor $e^{-\beta H}$ assigns exponentially higher probability to low-energy states. The temperature controls how sharply the distribution is peaked: at low $T$ (large $\beta$) the probability concentrates tightly around the energy minima, while at high $T$ (small $\beta$) it spreads toward a nearly uniform distribution.

Let's visualize this on a 2D potential energy surface. We use Hosaki's function, which has a local minimum at $(1, 2)$ and a global minimum at $(4, 2)$. Drag the temperature slider and observe how the Boltzmann probability density redistributes.

At very low temperatures, the probability is concentrated almost entirely in the global minimum at $(4, 2)$. As you increase $T$, the local minimum at $(1, 2)$ becomes populated too, and eventually the distribution becomes nearly flat: the system explores the full configuration space.
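The same redistribution can be reproduced numerically. Below is a minimal sketch (assuming NumPy and a uniform grid over $[0, 5] \times [0, 6]$, a common evaluation domain for Hosaki's function) that measures how much probability mass sits near each minimum at two temperatures:

```python
import numpy as np

def hosaki(x, y):
    """Hosaki's 2D test function: local minimum near (1, 2),
    global minimum near (4, 2)."""
    poly = 1 - 8*x + 7*x**2 - (7/3)*x**3 + 0.25*x**4
    return poly * y**2 * np.exp(-y)

def boltzmann_density(T, n=200):
    """Discrete Boltzmann probabilities e^{-V/T} / Z_c on an n x n grid."""
    x = np.linspace(0.0, 5.0, n)
    y = np.linspace(0.0, 6.0, n)
    X, Y = np.meshgrid(x, y, indexing="ij")
    w = np.exp(-hosaki(X, Y) / T)
    return X, Y, w / w.sum()   # equal cell areas, so normalizing the sum suffices

def mass_near(p, X, Y, x0, y0, r=0.7):
    """Total probability within distance r of the point (x0, y0)."""
    return p[(X - x0)**2 + (Y - y0)**2 < r**2].sum()

X, Y, p_cold = boltzmann_density(T=0.1)
_, _, p_hot = boltzmann_density(T=2.0)

print(mass_near(p_cold, X, Y, 4, 2))   # low T: nearly all mass in the global minimum
print(mass_near(p_hot, X, Y, 4, 2))    # high T: mass spreads across the surface
print(mass_near(p_hot, X, Y, 1, 2))    # ...including the local minimum
```

The window radius `r=0.7` is an arbitrary choice, just large enough to capture a basin without overlapping the other minimum.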

Partition functions

The Boltzmann distribution must be normalized to define a proper probability density. The normalization constant is the partition function, which for the canonical (NVT) ensemble reads

$$ Z(\beta) = \frac{1}{N! \, h^{3N}} \int e^{-\beta H(q, p)} \, d\Gamma. $$

Thanks to the additive structure of the Hamiltonian $H = K(p) + V(q)$, the Gaussian integral over the momenta can be carried out explicitly. This yields a factorization

$$ Z(\beta) = Z_k(\beta) \, Z_c(\beta), $$

where $Z_k(\beta)$ collects the kinetic and quantum-mechanical constants, and

$$ Z_c(\beta) = \int e^{-\beta V(q)} \, dq $$

is the configurational partition function. This is the quantity that matters for sampling methods like nested sampling, which operate entirely in configuration space.
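For completeness, the momentum integral behind this factorization is a standard Gaussian; for a single momentum component of mass $m$,

$$ \int_{-\infty}^{\infty} e^{-\beta p^2 / 2m} \, dp = \sqrt{\frac{2\pi m}{\beta}}, $$

so that, for $3N$ components of equal mass,

$$ Z_k(\beta) = \frac{1}{N! \, h^{3N}} \left( \frac{2\pi m}{\beta} \right)^{3N/2}. $$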

The corresponding configurational probability density is simply

$$ p(q) = \frac{e^{-\beta V(q)}}{Z_c(\beta)}. $$

The NVT ensemble: Helmholtz free energy

The logarithm of the partition function defines the Helmholtz free energy

$$ F(N, V, T) = -\frac{1}{\beta} \ln Z(\beta), $$

which encodes all equilibrium thermodynamics of the canonical ensemble. This is a remarkable statement: if we know $F$ as a function of its natural variables $(N, V, T)$, we can derive every thermodynamic observable by differentiation.

The power of this becomes clear when we write out the derivatives explicitly; each macroscopic quantity corresponds to a microscopic average over the Boltzmann distribution:

$$ U = -\frac{\partial \ln Z}{\partial \beta} = \langle H \rangle, \qquad S = -\frac{\partial F}{\partial T} = \ln Z + \beta \langle H \rangle, \qquad C_V = \frac{\partial U}{\partial T} = \beta^2 \left( \langle H^2 \rangle - \langle H \rangle^2 \right). $$

This is the bridge between phenomenological thermodynamics and microscopic statistical mechanics. The average energy $U$ is the expectation of $H$ under the Boltzmann distribution; the entropy splits into a phase-space volume term $\ln Z$ and an energetic contribution $\beta \langle H \rangle$; and the heat capacity $C_V$ measures the variance of the energy: the macroscopic thermal response is determined entirely by the magnitude of microscopic energy fluctuations at equilibrium.
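These identities are easy to verify numerically. The sketch below uses a hypothetical 1D double-well potential on a grid (any $V(q)$ works the same way) and checks that the fluctuation formula for $C_V$ agrees with a finite-difference derivative of $U$:

```python
import numpy as np

# Hypothetical 1D double-well potential on a dense grid
q = np.linspace(-3.0, 3.0, 2001)
V = (q**2 - 1)**2                      # minima at q = -1 and q = +1

def U_and_Cv(T):
    """Average energy <V> and heat capacity from energy fluctuations (k_B = 1)."""
    beta = 1.0 / T
    w = np.exp(-beta * (V - V.min()))  # shifted for numerical stability
    w /= w.sum()                       # discrete Boltzmann probabilities
    U = np.sum(w * V)
    var = np.sum(w * V**2) - U**2      # <V^2> - <V>^2
    return U, beta**2 * var

T, dT = 0.5, 1e-4
U_minus, _ = U_and_Cv(T - dT)
U_plus, _ = U_and_Cv(T + dT)
_, C_fluct = U_and_Cv(T)
C_fd = (U_plus - U_minus) / (2 * dT)   # C_V = dU/dT by central differences

print(C_fluct, C_fd)                   # the two routes agree
```

The same two-route check (analytic fluctuation formula vs. numerical temperature derivative) is a useful sanity test for any ensemble code.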

For our 2D Hosaki potential, the configurational partition function takes the explicit form

$$ Z_c(\beta) = \int e^{-\beta V_{\text{Hosaki}}(x, y)} \, dx \, dy, $$

which we evaluate numerically on a grid. The plots below show $F$, $U$, $S$, and $C_V$ as functions of temperature. The dashed line marks the current temperature from the Boltzmann visualization above.

Notice the peak in $C_V$: it signals a temperature where the system undergoes a qualitative change, a phase transition. For Hosaki's function, this peak corresponds to the temperature where the system transitions from being localized in the global minimum to populating both minima. As we derived above, $C_V$ is proportional to the energy variance, so the peak marks the temperature of maximal energy fluctuations, where the system is "torn" between competing configurations.
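The peak can be located with the same fluctuation formula. A minimal sketch, again assuming a uniform grid over $[0, 5] \times [0, 6]$:

```python
import numpy as np

def hosaki(x, y):
    """Hosaki's 2D test function."""
    poly = 1 - 8*x + 7*x**2 - (7/3)*x**3 + 0.25*x**4
    return poly * y**2 * np.exp(-y)

# Evaluate the potential once on a grid
x = np.linspace(0.0, 5.0, 300)
y = np.linspace(0.0, 6.0, 300)
X, Y = np.meshgrid(x, y, indexing="ij")
V = hosaki(X, Y)

def heat_capacity(T):
    """C_V = beta^2 Var(V) under the configurational Boltzmann distribution."""
    beta = 1.0 / T
    w = np.exp(-beta * (V - V.min()))   # shifted for numerical stability
    w /= w.sum()
    mean = np.sum(w * V)
    return beta**2 * (np.sum(w * V**2) - mean**2)

Ts = np.linspace(0.05, 2.0, 200)
Cs = np.array([heat_capacity(T) for T in Ts])
T_peak = Ts[Cs.argmax()]
print(T_peak)    # temperature of maximal energy fluctuations
```

The peak sits at a finite temperature between the frozen low-$T$ regime and the nearly flat high-$T$ regime.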

The NPT ensemble: Gibbs free energy

In many practical situations, particularly in materials science, both temperature and pressure are controlled externally. This defines the isothermal-isobaric (NPT) ensemble. The relevant energy becomes the enthalpy $E + PV$, and the partition function generalizes to

$$ \Delta(\beta, P) = \int_0^\infty e^{-\beta P V} \, Z(\beta, V) \, dV. $$

The corresponding thermodynamic potential is the Gibbs free energy

$$ G(N, P, T) = -\frac{1}{\beta} \ln \Delta(\beta, P), $$

which is related to the Helmholtz free energy by a Legendre transformation: $G = F + PV$. Just as in the NVT case, derivatives of $G$ connect macroscopic thermodynamics to microscopic averages:

$$ \langle V \rangle = \frac{\partial G}{\partial P}, \qquad \langle E + PV \rangle = \frac{\partial (\beta G)}{\partial \beta}, \qquad C_P = \beta^2 \left( \langle (E + PV)^2 \rangle - \langle E + PV \rangle^2 \right). $$

Again, each thermodynamic derivative has a direct microscopic interpretation as a statistical average — the average volume, the average enthalpy, and the enthalpy variance — computed over the Boltzmann distribution of the NPT ensemble.

The toy model

To illustrate the NPT ensemble, we use a toy model introduced in [Ref. 1]: two particles in a one-dimensional periodic box, interacting via a pair potential $V(r)$ with a repulsive core and an attractive well. The system is fully described by the interparticle distance $r$ and the box length $L$ (which plays the role of the volume $V$). Due to periodic boundary conditions, only the region $0 \le r \le L/2$ is physical. The NPT partition function for this model is

$$ \Delta(\beta, P) = \int_0^\infty dL \, e^{-\beta P L} \int_0^{L/2} dr \, e^{-\beta V(r)}, $$

where the triangular integration domain $0 \le r \le L/2$ reflects the periodic boundary constraint.
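The double integral over the triangular domain is easy to evaluate on a grid. The sketch below uses a Lennard-Jones-style pair potential as a stand-in (an assumption; [Ref. 1] may use a different functional form) and computes the average box length at two pressures:

```python
import numpy as np

def pair_potential(r):
    """Stand-in pair potential with a repulsive core and an attractive well
    (Lennard-Jones form; an assumed substitute for the reference's potential)."""
    r = np.maximum(r, 1e-9)
    return 4.0 * ((1.0 / r)**12 - (1.0 / r)**6)

def npt_average_L(T, P, L_max=12.0, n=400):
    """<L> over the triangular domain 0 <= r <= L/2 of the toy model."""
    beta = 1.0 / T
    L = np.linspace(0.5, L_max, n)         # box lengths (avoid L ~ 0)
    r = np.linspace(1e-3, L_max / 2, n)    # interparticle distances
    LL, RR = np.meshgrid(L, r, indexing="ij")
    H = pair_potential(RR) + P * LL        # configurational enthalpy
    # zero weight outside the physical (triangular) region
    w = np.where(RR <= LL / 2, np.exp(-beta * (H - H.min())), 0.0)
    w /= w.sum()
    return np.sum(w * LL)

L_low_P = npt_average_L(T=0.3, P=0.1)
L_high_P = npt_average_L(T=0.3, P=2.0)
print(L_low_P, L_high_P)    # higher pressure compresses the box
```

The finite upper cutoff `L_max` truncates the volume integral; it only needs to be large compared with the thermal tail $\sim T/P$ of the box-length distribution.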

The enthalpy surface $V(r) + PL$ depends on the applied pressure $P$. Increasing $P$ tilts the surface, penalizing large volumes and favoring compact configurations.

Now, what does the Boltzmann distribution look like on this enthalpy surface? At low temperature, probability concentrates in the enthalpy minimum. As temperature increases, the distribution spreads. Try adjusting both pressure and temperature to see how the probability shifts between compact (small $L$) and expanded (large $L$) configurations.

NPT thermodynamic quantities

We can now evaluate all the derivatives introduced above for the toy model. The plots below show the Gibbs free energy, average volume, heat capacity $C_P$, and entropy as functions of temperature at the current pressure.

Response functions and phase transitions

Phase transitions correspond to non-analytic behavior of thermodynamic potentials in the thermodynamic limit. For finite systems — like our toy model — these singularities are rounded, manifesting instead as pronounced peaks in response functions.

Key insight: peaks in $C_V$ or $C_P$ signal enhanced fluctuations: the system is torn between competing configurations (phases). This is the hallmark of a phase transition.

The heat capacity is a second derivative of the Gibbs free energy, $C_P = -T \, \partial^2 G / \partial T^2$. More generally, all response functions share this structure: each is a second derivative of a thermodynamic potential and, equivalently, a variance of a microscopic quantity.

| Response function | Potential | Expression |
| --- | --- | --- |
| Heat capacity $C_V$ | $F$ | $C_V = -T \, \dfrac{\partial^2 F}{\partial T^2} = \beta^2 \, \mathrm{Var}(E)$ |
| Heat capacity $C_P$ | $G$ | $C_P = -T \, \dfrac{\partial^2 G}{\partial T^2} = \beta^2 \, \mathrm{Var}(E + PV)$ |
| Compressibility $\kappa_T$ | $G$ | $\kappa_T = -\dfrac{1}{\langle V \rangle} \dfrac{\partial^2 G}{\partial P^2} = \dfrac{\beta}{\langle V \rangle} \, \mathrm{Var}(V)$ |

This structure can be understood through a Taylor expansion: near equilibrium, a thermodynamic potential such as $G$ looks like

$$ G(T + \delta T) \approx G(T) + \frac{\partial G}{\partial T} \, \delta T + \frac{1}{2} \frac{\partial^2 G}{\partial T^2} \, (\delta T)^2 + \ldots $$

The quadratic term encodes the energetic cost of perturbations. When the curvature of the free-energy surface changes dramatically and the corresponding response function diverges, we are at a phase transition.

Pressure dependence of the phase transition

Try different pressures in the toy model above and observe how the $C_P$ peak shifts. At low pressure, the system transitions from a compact to an expanded phase at relatively low temperature. As pressure increases, the transition shifts to higher temperatures: more thermal energy is needed to overcome the pressure favoring the compact state.
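The shift can be checked quantitatively by locating the $C_P$ peak at two pressures. This sketch reuses the assumed Lennard-Jones-style stand-in potential from before (not necessarily the potential of [Ref. 1]):

```python
import numpy as np

def pair_potential(r):
    # assumed Lennard-Jones-style form (stand-in for the reference's potential)
    r = np.maximum(r, 1e-9)
    return 4.0 * ((1.0 / r)**12 - (1.0 / r)**6)

# Grid over the toy model's (L, r) configuration space
n = 300
L = np.linspace(0.5, 12.0, n)
r = np.linspace(1e-3, 6.0, n)
LL, RR = np.meshgrid(L, r, indexing="ij")
mask = RR <= LL / 2            # periodic-boundary (triangular) constraint
V = pair_potential(RR)

def heat_capacity_P(T, P):
    """C_P = beta^2 Var(V + P L) over the physical domain."""
    beta = 1.0 / T
    H = V + P * LL
    w = np.where(mask, np.exp(-beta * (H - H[mask].min())), 0.0)
    w /= w.sum()
    mean = np.sum(w * H)
    return beta**2 * (np.sum(w * H**2) - mean**2)

Ts = np.linspace(0.05, 3.0, 150)

def peak_T(P):
    Cs = np.array([heat_capacity_P(T, P) for T in Ts])
    return Ts[Cs.argmax()]

print(peak_T(0.1), peak_T(1.0))   # the transition shifts to higher T with pressure
```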

Connection to nested sampling

Nested sampling provides a unique approach to statistical mechanics: rather than sampling at a fixed temperature, it reconstructs the full free-energy landscape from a single simulation. By systematically compressing the configuration space, removing high-energy states one at a time, NS directly computes the partition function $Z_c(\beta)$ as a function of $\beta$, giving access to all temperatures simultaneously.
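As a preview, here is a minimal nested-sampling sketch on a 1D toy problem (uniform prior on $[-1, 1]$, $V(q) = q^2$; rejection sampling stands in for the constrained moves a real implementation would use). A single run yields an energy sequence and compression factors from which the prior-normalized $Z_c(\beta)$ can then be evaluated at any $\beta$:

```python
import numpy as np

rng = np.random.default_rng(0)

def V(q):
    return q**2                       # toy potential; uniform prior on [-1, 1]

K, n_iter = 500, 2500                 # live points, NS iterations
live = rng.uniform(-1, 1, K)
E_live = V(live)

E_seq = []                            # recorded "worst" energies E_1 > E_2 > ...
for _ in range(n_iter):
    i = np.argmax(E_live)             # highest-energy live point
    E_seq.append(E_live[i])
    while True:                       # resample under the constraint V(q) < E_i
        qnew = rng.uniform(-1, 1)     # (rejection sampling; fine in 1D)
        if V(qnew) < E_seq[-1]:
            break
    live[i], E_live[i] = qnew, V(qnew)
E_seq = np.array(E_seq)

# Expected compression of the prior mass: X_i = (K / (K+1))^i
X = (K / (K + 1.0)) ** np.arange(1, n_iter + 1)
dX = -np.diff(np.concatenate([[1.0], X]))        # shell weights X_{i-1} - X_i

def Z_ns(beta):
    """Partition function estimate at any beta from one NS run."""
    remainder = X[-1] * np.mean(np.exp(-beta * E_live))  # leftover live points
    return np.sum(dX * np.exp(-beta * E_seq)) + remainder

def Z_ref(beta):
    """Direct quadrature of the prior-weighted integral, for comparison."""
    qs = np.linspace(-1, 1, 20001)
    return np.exp(-beta * V(qs)).sum() * (qs[1] - qs[0]) / 2.0

print(Z_ns(1.0), Z_ref(1.0))
print(Z_ns(5.0), Z_ref(5.0))
```

Note that the temperature enters only at the very end, in `Z_ns(beta)`: the sampling itself is temperature-free.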

This makes NS particularly powerful for studying phase transitions: the peaks in response functions, the competing free-energy sheets of different phases, and the complete equation of state all emerge naturally from a single NS run. We will explore this in detail in the nested sampling primer.