The Microcanonical Ensemble
Entropy as the logarithm of the number of microstates, equal a priori probability, and the foundation of statistical mechanics for isolated systems
Historical Context
Ludwig Boltzmann, working in the 1870s, forged the revolutionary connection between the macroscopic quantity entropy and the microscopic count of accessible states. His tombstone in Vienna bears the inscription \(S = k \ln W\), arguably the most profound equation in all of physics. J. Willard Gibbs independently developed the ensemble formalism in his 1902 masterwork “Elementary Principles in Statistical Mechanics,” coining the term “microcanonical ensemble” for an isolated system at fixed energy.
Max Planck later refined Boltzmann's formula and introduced the constant \(k_B\) that now bears Boltzmann's name. The microcanonical ensemble remains the conceptual bedrock upon which all of statistical mechanics is built.
1. The Fundamental Postulate of Statistical Mechanics
Consider an isolated system with fixed energy \(E\), volume \(V\), and particle number \(N\). The fundamental postulate (equal a priori probability) states:
Equal A Priori Probability Postulate
For an isolated system in equilibrium, all accessible microstates consistent with the macroscopic constraints are equally probable. If \(\Omega(E, V, N)\) is the total number of microstates with energy in \([E, E + \delta E]\), then each microstate has probability:
\[ p_i = \frac{1}{\Omega(E, V, N)} \]
This postulate cannot be derived from mechanics alone; it is an additional axiom motivated by ergodic theory and Liouville's theorem. It implies that, over long observation periods, the system spends equal time in each accessible microstate.
The density matrix for the microcanonical ensemble is:
\[ \hat{\rho} = \frac{1}{\Omega} \sum_{i} |i\rangle\langle i| \]
where the sum runs over all eigenstates \(|i\rangle\) of the Hamiltonian with energies in the specified shell.
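As a concrete illustration, a few lines of Python can enumerate the microstates of a small spin system and confirm that each accessible state in the energy shell carries probability \(1/\Omega\) (the system size and energy here are illustrative choices):

```python
from itertools import product

# Enumerate all microstates of N = 4 two-level spins (energy 0 or epsilon
# per spin, here epsilon = 1) and keep those with total energy E = 2.
N, E = 4, 2
shell = [s for s in product((0, 1), repeat=N) if sum(s) == E]
Omega = len(shell)        # number of accessible microstates in the shell
p = 1.0 / Omega           # equal a priori probability of each

print(f"Omega = {Omega}, p_i = {p:.4f}")   # Omega = 6, p_i = 0.1667
```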
2. Boltzmann Entropy: \(S = k_B \ln \Omega\)
Derivation 1: Entropy from Information Theory
The Shannon (Gibbs) entropy for any probability distribution \(\{p_i\}\) is:
\[ S = -k_B \sum_i p_i \ln p_i \]
For the microcanonical ensemble, where \(p_i = 1/\Omega\) for all accessible states:
\[ S = -k_B \sum_{i=1}^{\Omega} \frac{1}{\Omega} \ln \frac{1}{\Omega} = k_B \ln \Omega \]
Derivation 2: Entropy Maximization
We can derive the equal-probability postulate by maximizing entropy subject to the normalization constraint \(\sum_i p_i = 1\). Using Lagrange multipliers:
\[ \frac{\partial}{\partial p_j} \left[ -k_B \sum_i p_i \ln p_i - \lambda \left( \sum_i p_i - 1 \right) \right] = 0 \quad \Longrightarrow \quad p_j = e^{-1 - \lambda/k_B} \]
The right-hand side is independent of \(j\), so every accessible state carries the same probability.
Since all probabilities are equal and must sum to 1, we get \(p_j = 1/\Omega\). Because the entropy is a concave function of \(\{p_i\}\), this stationary point is the maximum: the uniform distribution maximizes entropy among all distributions over \(\Omega\) states.
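This maximization can be checked numerically: among randomly drawn normalized distributions over \(\Omega\) states, none exceeds the entropy of the uniform distribution (a brute-force sketch, with \(\Omega = 8\) chosen for illustration):

```python
import math
import random

random.seed(0)
Omega = 8

def shannon_entropy(p):
    # S/k_B = -sum p_i ln p_i (terms with p_i = 0 contribute nothing)
    return -sum(x * math.log(x) for x in p if x > 0)

uniform = [1.0 / Omega] * Omega
s_uniform = shannon_entropy(uniform)          # equals ln(Omega)

# No randomly drawn normalized distribution beats the uniform one.
for _ in range(1000):
    raw = [random.random() for _ in range(Omega)]
    total = sum(raw)
    p = [x / total for x in raw]
    assert shannon_entropy(p) <= s_uniform + 1e-12

print(f"max S/k_B = {s_uniform:.4f} = ln {Omega}")
```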
Key Insight: Additivity of Entropy
For two independent subsystems A and B, the total number of microstates is \(\Omega_{AB} = \Omega_A \cdot \Omega_B\). The logarithm converts this product into a sum:
\[ S_{AB} = k_B \ln(\Omega_A \Omega_B) = k_B \ln \Omega_A + k_B \ln \Omega_B = S_A + S_B \]
This extensivity is the reason Boltzmann chose the logarithm. It ensures that entropy is an extensive (additive) thermodynamic quantity.
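A quick numerical check of the additivity, building the joint state space of two independent subsystems as a Cartesian product (subsystem sizes chosen arbitrarily):

```python
import math
from itertools import product

# Two independent subsystems: joint microstates are pairs (a, b),
# so Omega_AB = Omega_A * Omega_B and the log-entropies add.
states_A = range(6)       # Omega_A = 6
states_B = range(10)      # Omega_B = 10
joint = list(product(states_A, states_B))

S_A = math.log(len(states_A))   # entropies in units of k_B
S_B = math.log(len(states_B))
S_AB = math.log(len(joint))

print(f"S_AB = {S_AB:.4f}, S_A + S_B = {S_A + S_B:.4f}")
```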
3. The Ideal Gas: Counting Microstates
Derivation 3: Volume of a Hypersphere in Phase Space
For \(N\) non-interacting classical particles in a box of volume \(V\), the phase space is \(6N\)-dimensional. The constraint \(\sum_{i=1}^{N} \frac{p_i^2}{2m} = E\) defines a hypersphere of radius \(R = \sqrt{2mE}\) in the \(3N\)-dimensional momentum space. The volume of a \(d\)-dimensional sphere of radius \(R\) is:
\[ V_d(R) = \frac{\pi^{d/2}}{\Gamma\left(\frac{d}{2} + 1\right)} R^d \]
For our \(3N\)-dimensional momentum sphere with \(d = 3N\):
\[ \Phi(E) = \frac{V^N}{N!\, h^{3N}} \cdot \frac{\pi^{3N/2}}{\Gamma\left(\frac{3N}{2} + 1\right)} (2mE)^{3N/2} \]
where \(\Phi(E)\) is the total phase-space volume up to energy \(E\). The number of microstates in the energy shell is:
\[ \Omega(E) = \frac{\partial \Phi}{\partial E}\, \delta E \]
The factor \(1/N!\) accounts for the indistinguishability of identical particles (Gibbs correction), and \(h^{3N}\) sets the quantum of phase space volume.
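The counting formula can be evaluated in log form with `math.lgamma`, which avoids overflow from the astronomically large factorials and powers involved (the mass, temperature, and density below are illustrative values, roughly helium at ambient conditions):

```python
import math

# Log of the ideal-gas phase-space volume
#   Phi(E) = V^N / (N! h^{3N}) * pi^{3N/2} / Gamma(3N/2 + 1) * (2 m E)^{3N/2}
def log_phi(E, V, N, m, h=6.626e-34):
    d = 3 * N
    return (N * math.log(V) - math.lgamma(N + 1) - d * math.log(h)
            + (d / 2) * math.log(math.pi) - math.lgamma(d / 2 + 1)
            + (d / 2) * math.log(2 * m * E))

kB = 1.381e-23
N = 100
E = 1.5 * N * kB * 300        # E = (3/2) N k_B T at T = 300 K
m = 6.6e-27                   # ~helium atomic mass in kg (assumed value)
V = N * 4.1e-26               # ~k_B T / P per particle at 1 atm (assumed)

# S/(N k_B) ~ ln Phi / N; the shell vs. full volume differ only by
# sub-extensive terms, negligible in the thermodynamic limit.
s = log_phi(E, V, N, m) / N
print(f"S/(N k_B) ≈ {s:.2f}")
```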
Derivation 4: The Sackur-Tetrode Equation
Taking the logarithm and using Stirling's approximation \(\ln N! \approx N \ln N - N\):
\[ S = N k_B \left[ \ln\left( \frac{V}{N} \left( \frac{4\pi m E}{3 N h^2} \right)^{3/2} \right) + \frac{5}{2} \right] \]
This is the celebrated Sackur-Tetrode equation. Using \(E = \frac{3}{2} N k_B T\), we can rewrite it as:
\[ S = N k_B \left[ \frac{5}{2} + \ln\left( \frac{V}{N \lambda_{dB}^3} \right) \right] \]
where \(\lambda_{dB} = h/\sqrt{2\pi m k_B T}\) is the thermal de Broglie wavelength. This result was verified experimentally for noble gases with remarkable precision.
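A sketch of that experimental check, evaluating the Sackur-Tetrode formula for one mole of argon at 298 K and 1 bar (the constants are standard values; the tabulated standard molar entropy of argon is about 155 J/(mol K)):

```python
import math

# Sackur-Tetrode entropy S = N k_B [5/2 + ln(V / (N lambda_dB^3))]
# for one mole of argon gas at T = 298 K, P = 1 bar.
h, kB, NA = 6.626e-34, 1.381e-23, 6.022e23
m = 39.95 * 1.661e-27            # argon atomic mass in kg
T, P = 298.0, 1.0e5

lam = h / math.sqrt(2 * math.pi * m * kB * T)   # thermal de Broglie wavelength
v_per_particle = kB * T / P                     # V/N from the ideal-gas law
S_molar = NA * kB * (2.5 + math.log(v_per_particle / lam**3))

print(f"S(argon, 298 K) ≈ {S_molar:.1f} J/(mol K)")
```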
4. Temperature and the Laws of Thermodynamics
Derivation 5: Temperature from Entropy
The microcanonical definition of temperature emerges from requiring thermal equilibrium between two subsystems A and B in thermal contact, with fixed total energy \(E = E_A + E_B\):
\[ \Omega_{\text{total}}(E_A) = \Omega_A(E_A)\, \Omega_B(E - E_A) \]
Maximizing \(\Omega_{\text{total}}\) (or equivalently \(S_{\text{total}}\)) with respect to \(E_A\):
\[ \frac{\partial S_A}{\partial E_A} = \frac{\partial S_B}{\partial E_B} \]
This motivates the definition of temperature:
\[ \frac{1}{T} \equiv \left( \frac{\partial S}{\partial E} \right)_{V, N} \]
Similarly, pressure and chemical potential emerge as:
\[ \frac{P}{T} = \left( \frac{\partial S}{\partial V} \right)_{E, N}, \qquad \frac{\mu}{T} = -\left( \frac{\partial S}{\partial N} \right)_{E, V} \]
The Fundamental Thermodynamic Relation
Combining these definitions gives the fundamental relation:
\[ dS = \frac{1}{T}\, dE + \frac{P}{T}\, dV - \frac{\mu}{T}\, dN \]
or equivalently \(dE = TdS - PdV + \mu dN\), which is the first law of thermodynamics.
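The definition \(1/T = \partial S/\partial E\) can be verified by finite differences against the analytic result \(\partial S/\partial E = (3/2) N k_B / E\) for the monatomic ideal gas (the numerical values below are arbitrary illustrative choices):

```python
import math

kB = 1.381e-23
N = 1000

def S(E, V=1e-3, m=6.6e-27, h=6.626e-34):
    # Sackur-Tetrode entropy as a function of E, using
    # lambda_dB^3 = (3 N h^2 / (4 pi m E))^{3/2}.
    lam3 = (h**2 * 3 * N / (4 * math.pi * m * E)) ** 1.5
    return N * kB * (2.5 + math.log(V / (N * lam3)))

# Central finite difference for 1/T = dS/dE.
E, dE = 1.0e-18, 1.0e-24
inv_T = (S(E + dE) - S(E - dE)) / (2 * dE)
analytic = 1.5 * N * kB / E

print(f"numerical 1/T  = {inv_T:.4e}")
print(f"analytic 1/T   = {analytic:.4e}")
```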
5. Applications and Examples
Two-Level System
Consider \(N\) independent spins, each with energy 0 or \(\epsilon\). If the total energy is \(E = n\epsilon\) (i.e., \(n\) spins are excited), the number of microstates is:
\[ \Omega(n) = \binom{N}{n} = \frac{N!}{n!\,(N - n)!} \]
The entropy per particle in the thermodynamic limit, using Stirling's approximation and writing \(x = n/N\), is:
\[ \frac{S}{N k_B} = -x \ln x - (1 - x) \ln(1 - x) \]
The temperature is:
\[ \frac{1}{T} = \frac{\partial S}{\partial E} = \frac{1}{\epsilon} \frac{\partial S}{\partial n} = \frac{k_B}{\epsilon} \ln\left( \frac{1 - x}{x} \right) \]
Remarkably, when \(x > 1/2\) (more than half the spins excited), the temperature becomes negative. This is a genuine physical effect observed in nuclear spin systems (Purcell and Pound, 1951).
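A short script makes the sign change explicit, computing \(T\) from the discrete derivative of \(S = k_B \ln \binom{N}{n}\) (units with \(k_B = \epsilon = 1\) are assumed for simplicity):

```python
import math

kB, eps = 1.0, 1.0        # work in units with k_B = epsilon = 1
N = 1000

def S(n):
    # S/k_B = ln C(N, n), via log-gamma for numerical stability
    return math.lgamma(N + 1) - math.lgamma(n + 1) - math.lgamma(N - n + 1)

def temperature(n):
    # 1/T = dS/dE = (1/eps) dS/dn, discrete central difference
    inv_T = (S(n + 1) - S(n - 1)) / (2 * eps)
    return 1.0 / inv_T

print(f"T at n = 0.25 N: {temperature(250):+.3f}")   # positive
print(f"T at n = 0.75 N: {temperature(750):+.3f}")   # negative
```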
Einstein Solid
For \(N\) quantum harmonic oscillators sharing \(q\) energy quanta, the number of microstates is the “stars and bars” combinatorial result:
\[ \Omega(N, q) = \binom{q + N - 1}{q} = \frac{(q + N - 1)!}{q!\,(N - 1)!} \]
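The stars-and-bars count is easy to cross-check against direct enumeration for small systems:

```python
from math import comb

# "Stars and bars": Omega(N, q) = C(q + N - 1, q) ways to distribute
# q indistinguishable quanta among N oscillators.
def omega_einstein(N, q):
    return comb(q + N - 1, q)

# Brute-force cross-check: recursively assign quanta to one oscillator
# at a time and count the distributions.
def omega_brute(N, q):
    if N == 1:
        return 1
    return sum(omega_brute(N - 1, q - k) for k in range(q + 1))

print(omega_einstein(3, 4), omega_brute(3, 4))   # 15 15
```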
Harmonic Oscillators and Equipartition
For \(N\) classical harmonic oscillators of frequency \(\omega\) with total energy \(E\), the phase-space volume leads to:
\[ S = N k_B \left[ \ln\left( \frac{E}{N \hbar \omega} \right) + 1 \right] \]
Taking the energy derivative: \(1/T = \partial S/\partial E = N k_B/E\), giving \(E = N k_B T\), which is the classical equipartition result of \(k_B T/2\) per quadratic degree of freedom (here \(2N\) quadratic degrees).
6. The Gibbs Paradox
Without the \(1/N!\) factor, the entropy of an ideal gas would not be extensive. Mixing two identical gases, each with \(N\) particles in volume \(V\), by removing a partition would yield a spurious entropy increase:
\[ \Delta S_{\text{mix}} = 2 N k_B \ln 2 \]
The Gibbs correction \(\Omega \to \Omega/N!\) resolves this paradox by accounting for the indistinguishability of identical particles. With this correction:
\[ \Delta S_{\text{mix}} = 0 \quad \text{(identical gases)} \]
This was one of the earliest hints that identical particles require fundamentally different counting than distinguishable objects, foreshadowing quantum mechanics.
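The effect of the correction on extensivity can be seen numerically: at fixed energy and volume per particle, only the corrected entropy per particle is independent of system size (units with \(m = h = 1\) and unit energy and volume per particle are assumed purely for illustration):

```python
import math

# Entropy per particle with and without the 1/N! Gibbs correction, at
# fixed energy and volume per particle (constant density as N grows).
def s_per_particle(N, corrected):
    E, V = N * 1.0, N * 1.0      # e = v = 1 per particle (illustrative units)
    d = 3 * N
    ln_phi = (N * math.log(V) + (d / 2) * math.log(2 * math.pi * E)
              - math.lgamma(d / 2 + 1))
    if corrected:
        ln_phi -= math.lgamma(N + 1)      # the Gibbs 1/N! correction
    return ln_phi / N                     # S / (N k_B)

for N in (100, 1000):
    print(N, round(s_per_particle(N, True), 3),
             round(s_per_particle(N, False), 3))
# Corrected values stay ~constant; uncorrected ones grow like ln N.
```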
7. Equivalence of Ensembles
In the thermodynamic limit \(N \to \infty\), all three major ensembles (microcanonical, canonical, grand canonical) give identical results for thermodynamic quantities. This is because the energy distribution in the canonical ensemble becomes sharply peaked around \(\langle E \rangle\) with relative fluctuations:
\[ \frac{\Delta E}{\langle E \rangle} \sim \frac{1}{\sqrt{N}} \]
For macroscopic systems with \(N \sim 10^{23}\), this ratio is negligible (\(\sim 10^{-12}\)), making the ensembles effectively equivalent.
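For the monatomic ideal gas, the canonical fluctuation formula gives \(\Delta E/\langle E \rangle = \sqrt{2/(3N)}\) (from \(\langle \Delta E^2 \rangle = k_B T^2 C_V\) with \(C_V = \frac{3}{2} N k_B\)), which reproduces the quoted orders of magnitude:

```python
import math

# Relative energy fluctuation of a monatomic ideal gas in the canonical
# ensemble: Delta E / <E> = sqrt(2 / (3N)), vanishing as 1/sqrt(N).
def rel_fluctuation(N):
    return math.sqrt(2.0 / (3.0 * N))

for N in (1e3, 1e9, 1e23):
    print(f"N = {N:.0e}: Delta E/<E> ≈ {rel_fluctuation(N):.1e}")
```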
8. Computational Exploration
The following simulation explores the microcanonical ensemble for a two-level system and the ideal gas, computing entropy and temperature and demonstrating the approach to thermodynamic equilibrium.
Microcanonical Ensemble: Two-Level System, Einstein Solid, and Thermal Equilibrium
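A minimal sketch of the thermal-equilibrium part of such a simulation, assuming two Einstein solids (sizes chosen for illustration) exchanging a fixed total number of quanta: the joint multiplicity \(\Omega_A(q_A)\,\Omega_B(q_{\text{tot}} - q_A)\) is sharply peaked at the split where the two subsystem temperatures match.

```python
from math import comb

# Two Einstein solids in thermal contact share q_total quanta. Enumerating
# the joint multiplicity Omega_A(q) * Omega_B(q_total - q) over all splits
# reveals the sharp peak that defines thermal equilibrium.
def omega(N, q):
    return comb(q + N - 1, q)   # stars-and-bars multiplicity

N_A, N_B, q_total = 300, 200, 100
joint = [omega(N_A, q) * omega(N_B, q_total - q) for q in range(q_total + 1)]
q_star = max(range(q_total + 1), key=lambda q: joint[q])
total = sum(joint)

print(f"most probable split: q_A = {q_star} of {q_total}")
print(f"P(q_A = q*) = {joint[q_star] / total:.3f}")
# At the peak, d ln(Omega_A)/dq_A = d ln(Omega_B)/dq_B: equal temperatures,
# so the quanta divide in proportion to the oscillator counts (60 : 40).
```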
9. Summary and Key Results
Core Formulas
- Boltzmann entropy: \(S = k_B \ln \Omega\)
- Equal probability: \(p_i = 1/\Omega\)
- Temperature: \(1/T = \partial S/\partial E\)
- Sackur-Tetrode: \(S = Nk_B[\frac{5}{2} + \ln(V/N\lambda_{dB}^3)]\)
Physical Insights
- Entropy maximization implies equal a priori probability
- The \(1/N!\) Gibbs factor resolves the mixing paradox
- Negative temperatures arise in bounded energy systems
- Ensembles become equivalent as \(N \to \infty\)