Information Theory
From Shannon's 1948 paper to quantum error correction — the mathematical theory of communication, compression, and the fundamental limits of knowledge.
Shannon's Communication Model
An information source produces a message; a transmitter encodes it into a signal; the signal crosses a channel, where noise corrupts it; and a receiver decodes it for the destination. Every result in this course lives somewhere in that pipeline.
About This Course
In 1948, Claude Shannon published “A Mathematical Theory of Communication” — a paper that created an entirely new science. Shannon showed that information can be measured in bits, that every communication channel has a maximum rate at which information can be transmitted reliably, and that clever coding can approach this limit with vanishingly small error probability.
Information theory has since grown far beyond communication engineering. It now underpins data compression, cryptography, machine learning, statistical inference, thermodynamics, quantum computing, and even our understanding of black holes. The entropy formula \(H = -\sum p_i \log p_i\) is as fundamental to science as \(E = mc^2\).
This course covers the full sweep: from Shannon's original theorems through practical coding algorithms, continuous-channel theory, Kolmogorov complexity, the physics of information (Maxwell's demon, Landauer's principle, black holes), quantum information theory, and modern applications in AI and cryptography. Every chapter includes interactive Python simulations and detailed mathematical derivations.
The Central Equations
Shannon Entropy
\( H(X) = -\sum_{i=1}^{n} p_i \log_2 p_i \)
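In code, this is a one-liner over the probability vector (a minimal sketch; the function name `shannon_entropy` is ours, not from the course materials):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits; zero-probability terms contribute 0 by convention."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))    # fair coin: 1.0 bit per toss
print(shannon_entropy([0.9, 0.1]))    # biased coin: ~0.469 bits
```

A fair coin is maximally uncertain at exactly one bit per toss; a 90/10 coin carries less than half a bit, which is precisely the slack compression algorithms exploit.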
Channel Capacity
\( C = \max_{p(x)} I(X;Y) \)
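For a binary symmetric channel with crossover probability \(f\), this maximization can be carried out by brute force over the input distribution, and it recovers the closed form \(C = 1 - H(f)\) (a sketch under that setup; the function names are ours):

```python
import math

def h2(p):
    """Binary entropy function H(p) in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_mutual_information(px1, f):
    """I(X;Y) = H(Y) - H(Y|X) for a BSC with crossover f and input P(X=1) = px1."""
    py1 = px1 * (1 - f) + (1 - px1) * f
    return h2(py1) - h2(f)

f = 0.1
c_numeric = max(bsc_mutual_information(k / 1000, f) for k in range(1001))
c_closed = 1 - h2(f)
print(c_numeric, c_closed)    # both ~0.531 bits per channel use
```

The maximum lands at the uniform input \(P(X{=}1) = 1/2\), as symmetry suggests: a channel that flips 10% of bits still carries just over half a bit per use.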
Shannon-Hartley (AWGN)
\( C = B\log_2\!\left(1 + \frac{S}{N}\right) \)
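A concrete instance of the formula: a telephone-grade line with roughly 3 kHz of bandwidth at 30 dB SNR caps out near 30 kbit/s, which is why dial-up modems plateaued where they did (a minimal sketch; names are ours):

```python
import math

def awgn_capacity_bps(bandwidth_hz, snr_linear):
    """Shannon-Hartley: capacity of a band-limited AWGN channel in bits/second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

snr_db = 30
snr = 10 ** (snr_db / 10)             # 30 dB -> S/N = 1000
print(awgn_capacity_bps(3000, snr))   # ~29,900 bits/s
```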
Von Neumann Entropy
\( S(\rho) = -\text{Tr}(\rho \ln \rho) \)
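Because \(S(\rho)\) depends only on the spectrum of the density matrix, it can be computed from the eigenvalues of \(\rho\). A sketch with NumPy (assumed available; the function name is ours), using the natural log as in the formula above:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), via the eigenvalues of the density matrix."""
    evals = np.linalg.eigvalsh(rho)   # rho is Hermitian
    evals = evals[evals > 1e-12]      # 0 ln 0 = 0 by convention
    return float(-np.sum(evals * np.log(evals)))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])    # pure state |0><0|
mixed = np.eye(2) / 2                        # maximally mixed qubit
print(von_neumann_entropy(pure))     # pure states carry no entropy
print(von_neumann_entropy(mixed))    # ln 2 ~ 0.693 nats: one classical bit
```

A pure state has zero entropy regardless of basis; the maximally mixed qubit saturates at \(\ln 2\), the quantum counterpart of a fair coin flip.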
Course Structure
Foundations
Shannon entropy, the source coding theorem, mutual information, and channel capacity — the four pillars of information theory.
Coding Theory
From Huffman trees to turbo codes — the algorithms that make digital communication and storage possible.
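As a taste of where this chapter starts, Huffman's algorithm fits in a dozen lines: repeatedly merge the two lightest subtrees, prefixing 0 to the codewords on one side and 1 on the other (a minimal illustration, not the course's implementation):

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Map each symbol to a prefix-free codeword; rarer symbols get longer codes."""
    # Heap entries are [weight, tiebreak, {symbol: codeword}] -- the tiebreak
    # counter keeps heapq from ever comparing the (unorderable) dicts.
    heap = [[freq, i, {sym: ""}] for i, (sym, freq) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        light = heapq.heappop(heap)
        heavy = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in light[2].items()}
        merged.update({s: "1" + c for s, c in heavy[2].items()})
        heapq.heappush(heap, [light[0] + heavy[0], tiebreak, merged])
        tiebreak += 1
    return heap[0][2]

code = huffman_code("abracadabra")
bits = "".join(code[ch] for ch in "abracadabra")
print(code)
print(len(bits), "bits vs", 8 * len("abracadabra"), "in 8-bit ASCII")
```

The source coding theorem guarantees no prefix code can beat the source entropy; Huffman comes within one bit per symbol of that floor.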
Continuous Channels
Differential entropy, the Gaussian channel, Shannon’s capacity formula, MIMO systems, and the water-filling solution.
Algorithmic Complexity
Kolmogorov complexity, the minimum description length principle, and deep connections to Gödel’s incompleteness and Turing’s halting problem.
Information & Physics
Maxwell’s demon, Landauer’s erasure principle, Bekenstein-Hawking entropy, and the black hole information paradox.
Quantum Information
Von Neumann entropy, quantum channels, the Holevo bound, and quantum error correction — the information theory of the quantum world.
Modern Applications
Data compression algorithms, the information bottleneck in deep learning, modern cryptography, and network information theory.
Timeline Highlights
Recommended Reading
- Elements of Information Theory — Thomas Cover & Joy Thomas (2nd ed., 2006)
- Information Theory, Inference, and Learning Algorithms — David MacKay (2003, free online)
- A Mathematical Theory of Communication — Claude Shannon (1948, original paper)
- Quantum Computation and Quantum Information — Nielsen & Chuang (2010)
- The Information — James Gleick (2011, popular history)