Module 0 · Foundations
Introduction & Motivation
Course Rationale
Classical molecular dynamics relies on force-field evaluations that are either physically approximate (empirical) or prohibitively expensive (ab initio). This course develops a unified framework in which meta-learning — learning to learn — serves simultaneously as a computational accelerator for force-field adaptation and as a physically interpretable model for non-Markovian protein dynamics.
1. The Central Problem
Protein simulations sit at the intersection of three hard problems:
- Problem I: accurate energetics require QM, which is prohibitively expensive at protein scale. DFT scales as O(N³) with system size; CCSD(T) as O(N⁷).
- Problem II: each new protein is effectively a new system, so learned force fields rarely transfer. A neural network trained on one cofactor-binding pocket fails on the next.
- Problem III: quantum nuclear effects (tunneling, ZPE) are neglected by classical dynamics, despite being decisive for hydrogen-transfer reactions, where they can change rates by orders of magnitude.
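The scaling gap is worth making concrete. A quick back-of-the-envelope calculation (the exponents are the formal scalings; real-world prefactors and crossover points vary widely, so this is purely illustrative):

```python
# Rough cost growth for quantum-chemistry methods as system size grows.
# DFT ~ O(N^3), CCSD(T) ~ O(N^7); prefactors are ignored (illustration only).

def cost_ratio(exponent: int, size_factor: float) -> float:
    """Relative cost increase when the system grows by `size_factor`."""
    return size_factor ** exponent

# Doubling the number of atoms:
print(cost_ratio(3, 2))   # DFT: 8x more expensive
print(cost_ratio(7, 2))   # CCSD(T): 128x more expensive

# Going from a ~50-atom active-site model to a ~5000-atom protein (100x):
print(f"{cost_ratio(7, 100):.0e}")  # CCSD(T): 1e+14x
```

Fourteen orders of magnitude is why CCSD(T)-quality energetics at protein scale is not a matter of waiting for faster hardware.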
Neural network potentials (NNPs) address Problem I by fitting DFT-quality force fields at near-classical cost. But they remain data-hungry, typically requiring thousands of DFT single-point calculations per system. For novel proteins, rare folds, or enzyme active sites with unusual electronic structure, this data requirement is prohibitive.
Model-Agnostic Meta-Learning (MAML) addresses Problem II by learning an initialisation from which rapid adaptation to any new system requires only O(10–50) DFT calculations. This course develops MAML rigorously, then demonstrates its physical interpretation in terms of the Nakajima–Zwanzig memory kernel and the cusp catastrophe universality class. Problem III is addressed via Ring Polymer Molecular Dynamics (RPMD) run on the MAML-adapted potential.
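The bi-level structure behind this claim is developed rigorously in later modules; as a preview, here is a minimal MAML sketch on toy 1-D regression tasks, with random sine waves standing in for per-system potential-energy surfaces. All task parameters and the two-layer network are illustrative stand-ins, not the course's NequIP pipeline:

```python
import torch

torch.manual_seed(0)

def make_task():
    """A toy 'task': fit y = a*sin(x + b) for random a, b (stand-in for a new system)."""
    a, b = torch.rand(1) * 2 + 0.5, torch.rand(1) * 3
    x = torch.rand(20, 1) * 6 - 3
    return x, a * torch.sin(x + b)

net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))
meta_opt = torch.optim.Adam(net.parameters(), lr=1e-3)
inner_lr = 0.01

for step in range(100):                      # outer (meta) loop
    meta_opt.zero_grad()
    for _ in range(4):                       # batch of tasks per meta-step
        x, y = make_task()
        params = list(net.parameters())
        # Inner loop: one gradient step. create_graph=True retains the
        # second-order dependence of the adapted weights on the initialisation.
        loss = torch.nn.functional.mse_loss(net(x), y)
        grads = torch.autograd.grad(loss, params, create_graph=True)
        adapted = [p - inner_lr * g for p, g in zip(params, grads)]
        # Query loss through the adapted weights (manual forward pass).
        h = torch.tanh(x @ adapted[0].t() + adapted[1])
        pred = h @ adapted[2].t() + adapted[3]
        torch.nn.functional.mse_loss(pred, y).backward()
    meta_opt.step()
```

The key line is `create_graph=True`: the meta-gradient flows back through the inner update, which is exactly the second-order structure identified in the learning objectives below.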
2. Learning Objectives
- Derive the MAML bi-level optimisation and identify its second-order gradient structure.
- Understand E(3)-equivariant message-passing networks (NequIP, MACE) and their data-efficiency advantage.
- Implement the full MAML + NequIP meta-training and task-adaptation pipeline.
- Derive the Nakajima–Zwanzig generalised master equation via the Zwanzig projection-operator formalism.
- Establish the formal correspondence between MAML inner-loop adaptation and the NZ memory integral.
- Apply cusp catastrophe theory to identify bifurcations in the MAML loss landscape.
- Understand quantum tunneling corrections (Bell, Wigner) and their relationship to kinetic isotope effects.
- Set up and analyse RPMD simulations for H/D transfer rate calculations.
- Interpret geometric scaffold rigidity as a modulator of tunneling contribution and effective memory depth.
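Several of these objectives admit a numerical preview. The Wigner tunneling correction, κ = 1 + (ħω‡/k_BT)²/24, already yields a first estimate of an H/D kinetic isotope effect; the 1000 cm⁻¹ barrier frequency below is an illustrative textbook value, not a course dataset:

```python
import math

HBAR = 1.054571817e-34      # J*s
KB   = 1.380649e-23         # J/K
C_CM = 2.99792458e10        # speed of light, cm/s

def wigner_kappa(freq_cm: float, temp_k: float) -> float:
    """Wigner tunneling correction: kappa = 1 + (hbar*omega / kB*T)^2 / 24."""
    omega = 2 * math.pi * C_CM * freq_cm          # barrier frequency, rad/s
    u = HBAR * omega / (KB * temp_k)
    return 1 + u * u / 24

# Illustrative barrier frequency for H transfer at 300 K; for D the
# frequency scales by ~1/sqrt(2) (doubled reduced mass).
kappa_h = wigner_kappa(1000.0, 300.0)
kappa_d = wigner_kappa(1000.0 / math.sqrt(2), 300.0)
print(kappa_h, kappa_d, kappa_h / kappa_d)
```

Even this lowest-order correction nearly doubles the H-transfer rate at room temperature and contributes a measurable factor to the kinetic isotope effect; Bell corrections and RPMD, developed later, go beyond this perturbative regime.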
3. Prerequisites
Mathematics
Linear algebra (eigendecomposition, tensor products), multivariable calculus (gradient, Hessian, chain rule through compositions), basic differential equations (ODEs, Volterra integral equations), elementary group theory (rotation group SO(3), irreducible representations). Catastrophe theory is helpful but developed from scratch in Module 6.
Physics
Statistical mechanics (partition functions, free energy, transition-state theory), quantum mechanics (WKB approximation, path integrals at the level of Feynman–Hibbs), basic density-matrix formalism. Experience with molecular dynamics (Verlet integration, thermostatting) is assumed. Quantum field theory is not required.
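As a baseline for the assumed MD experience, the velocity-Verlet scheme on a 1-D harmonic oscillator (a textbook sketch, not tied to any engine used in the course):

```python
import math

def velocity_verlet(x, v, force, dt, mass=1.0, steps=1000):
    """Integrate m*x'' = force(x) with the velocity-Verlet scheme."""
    f = force(x)
    for _ in range(steps):
        x += v * dt + 0.5 * (f / mass) * dt * dt   # position update
        f_new = force(x)                           # force at new position
        v += 0.5 * (f + f_new) / mass * dt         # velocity update (averaged force)
        f = f_new
    return x, v

# Harmonic oscillator with k = m = 1: the period is 2*pi, so after one
# period the trajectory should return (nearly) to its starting point.
k = 1.0
x, v = velocity_verlet(1.0, 0.0, lambda x: -k * x, dt=2 * math.pi / 1000)
print(x, v)
```

The symplectic character of this integrator (bounded energy error over long trajectories) is what makes thermostatted variants of it the workhorse in GROMACS, OpenMM, and LAMMPS.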
Computation
Python proficiency (NumPy, PyTorch autograd), familiarity with a molecular-dynamics engine (GROMACS, OpenMM, or LAMMPS), basic experience with electronic structure codes (ORCA, Gaussian, or CP2K). Prior experience with neural network potentials is helpful but not required — the NequIP architecture is developed from first principles in Module 2.
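The "chain rule through compositions" and PyTorch-autograd prerequisites meet in MAML's second-order term, which requires differentiating a gradient. A minimal self-check that autograd supports this (a pure illustration on a scalar function):

```python
import torch

x = torch.tensor(2.0, requires_grad=True)
loss = x ** 4

# First derivative: d(x^4)/dx = 4x^3 = 32 at x = 2.
# create_graph=True keeps the graph so the gradient is itself differentiable.
(g,) = torch.autograd.grad(loss, x, create_graph=True)
print(g.item())   # 32.0

# Differentiate the gradient: d(4x^3)/dx = 12x^2 = 48 at x = 2.
(h,) = torch.autograd.grad(g, x)
print(h.item())   # 48.0
```

If this two-line pattern is unfamiliar, it is worth internalising before Module 1: the entire MAML outer loop is this pattern applied to every network parameter at once.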