Tensor Calculus
Tensors are mathematical objects whose components transform in a well-defined, multilinear way under coordinate transformations, making them ideal for formulating physical laws that take the same form in all reference frames.
1. Index Notation and Einstein Summation
Einstein Summation Convention
When an index appears twice in a single term (once as superscript, once as subscript), it is summed over all of its values:
$$A^\mu B_\mu \equiv \sum_{\mu=0}^{3} A^\mu B_\mu \quad \text{(in 4 dimensions)}$$
Free vs Dummy Indices
Dummy indices: Appear twice, summed over (can be renamed)
Free indices: Appear once, must match on both sides of equation
For example, in $T^\mu{}_\rho = A^\mu{}_\nu B^\nu{}_\rho$, the index $\nu$ is dummy (summed), while $\mu$ and $\rho$ are free.
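As a concrete illustration (not part of the formalism itself), the minimal NumPy sketch below evaluates $T^\mu{}_\rho = A^\mu{}_\nu B^\nu{}_\rho$ with `np.einsum`, which implements exactly this summation convention; the component arrays are arbitrary numbers chosen for demonstration.
```python
import numpy as np

# Arbitrary 4x4 component arrays, purely for illustration
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))   # A[mu, nu]  ~ components A^mu_nu
B = rng.standard_normal((4, 4))   # B[nu, rho] ~ components B^nu_rho

# Einstein summation: T^mu_rho = A^mu_nu B^nu_rho (nu is the dummy index)
T = np.einsum('mn,nr->mr', A, B)

# Writing out the sum over the dummy index gives the same result
T_explicit = np.zeros((4, 4))
for mu in range(4):
    for rho in range(4):
        for nu in range(4):
            T_explicit[mu, rho] += A[mu, nu] * B[nu, rho]

assert np.allclose(T, T_explicit)
```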
2. Vectors and Covectors
Contravariant Vectors (Upper Index)
Transform like the coordinate differentials $dx^\mu$:
$$V'^\mu = \frac{\partial x'^\mu}{\partial x^\nu}\,V^\nu$$
Example: velocity $u^\mu = dx^\mu/d\tau$, momentum $p^\mu$
Covariant Vectors (Lower Index)
Transform like gradients $\partial_\mu = \partial/\partial x^\mu$:
$$\omega'_\mu = \frac{\partial x^\nu}{\partial x'^\mu}\,\omega_\nu$$
Example: gradient of scalar $\nabla_\mu\phi = \partial_\mu\phi$
Raising and Lowering Indices
The metric tensor $g_{\mu\nu}$ converts between upper and lower indices:
$$V_\mu = g_{\mu\nu}\,V^\nu, \qquad V^\mu = g^{\mu\nu}\,V_\nu$$
where $g^{\mu\nu}$ is the inverse metric: $g^{\mu\lambda}g_{\lambda\nu} = \delta^\mu_\nu$
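A small NumPy sketch of raising and lowering, assuming the flat Minkowski metric in the $(-,+,+,+)$ signature (an illustrative choice); it also checks the inverse-metric identity $g^{\mu\lambda}g_{\lambda\nu} = \delta^\mu_\nu$.
```python
import numpy as np

# Flat Minkowski metric, signature (-, +, +, +) -- an assumption for this illustration
g = np.diag([-1.0, 1.0, 1.0, 1.0])      # g_{mu nu}
g_inv = np.linalg.inv(g)                 # g^{mu nu}

V_up = np.array([2.0, 1.0, 0.0, 3.0])    # components V^mu (arbitrary values)

# Lower the index: V_mu = g_{mu nu} V^nu
V_down = np.einsum('mn,n->m', g, V_up)

# Raising again recovers V^mu = g^{mu nu} V_nu
assert np.allclose(np.einsum('mn,n->m', g_inv, V_down), V_up)

# Inverse-metric identity: g^{mu lam} g_{lam nu} = delta^mu_nu
assert np.allclose(g_inv @ g, np.eye(4))
```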
3. General Tensors
Definition
A tensor of type $(p,q)$ has $p$ upper indices and $q$ lower indices and transforms with one Jacobian factor per index:
$$T'^{\mu_1\cdots\mu_p}{}_{\nu_1\cdots\nu_q} = \frac{\partial x'^{\mu_1}}{\partial x^{\alpha_1}}\cdots\frac{\partial x'^{\mu_p}}{\partial x^{\alpha_p}}\,\frac{\partial x^{\beta_1}}{\partial x'^{\nu_1}}\cdots\frac{\partial x^{\beta_q}}{\partial x'^{\nu_q}}\,T^{\alpha_1\cdots\alpha_p}{}_{\beta_1\cdots\beta_q}$$
Examples
- Scalar (0,0): $\phi$ (invariant under transformations)
- Vector (1,0): $V^\mu$
- Covector (0,1): $\omega_\mu$
- Metric (0,2): $g_{\mu\nu}$
- Electromagnetic field (0,2): $F_{\mu\nu}$
- Riemann tensor (1,3): $R^\rho_{\sigma\mu\nu}$
Tensor Rank
The rank is $p + q$. In $n$ dimensions, a rank-$r$ tensor has $n^r$ components. Example: in 4D spacetime, $R^\rho_{\sigma\mu\nu}$ has $4^4 = 256$ components (reduced to 20 by symmetries).
4. Tensor Operations
Addition
Tensors of the same type can be added component-wise:
$$(S + T)^{\mu}{}_{\nu} = S^{\mu}{}_{\nu} + T^{\mu}{}_{\nu}$$
Tensor Product (Outer Product)
Multiply all components of two tensors:
$$(A \otimes B)^{\mu\nu} = A^\mu B^\nu$$
Creates a tensor of type $(p_1+p_2, q_1+q_2)$ from types $(p_1,q_1)$ and $(p_2,q_2)$.
Contraction
Sum over one upper and one lower index:
$$S_\nu = T^{\mu}{}_{\mu\nu}$$
Reduces rank by 2. Example: trace of matrix $\text{tr}(M) = M^\mu_\mu$
Symmetrization and Antisymmetrization
$$T_{(\mu\nu)} = \frac{1}{2}(T_{\mu\nu} + T_{\nu\mu}) \quad \text{(symmetric part)}$$
$$T_{[\mu\nu]} = \frac{1}{2}(T_{\mu\nu} - T_{\nu\mu}) \quad \text{(antisymmetric part)}$$
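These operations map directly onto array manipulations; the sketch below (component values are arbitrary, for illustration only) forms an outer product, a contraction, and the symmetric/antisymmetric split with NumPy.
```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal(4)           # A^mu
B = rng.standard_normal(4)           # B^nu
T = rng.standard_normal((4, 4))      # T_{mu nu}
M = rng.standard_normal((4, 4))      # M^mu_nu

# Outer product: (A x B)^{mu nu} = A^mu B^nu, a (2,0) tensor
outer = np.einsum('m,n->mn', A, B)
assert outer[1, 2] == A[1] * B[2]

# Contraction of a (1,1) tensor: the trace M^mu_mu
trace = np.einsum('mm->', M)
assert np.isclose(trace, np.trace(M))

# Symmetric and antisymmetric parts of T_{mu nu}
T_sym = 0.5 * (T + T.T)
T_anti = 0.5 * (T - T.T)
assert np.allclose(T, T_sym + T_anti)
```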
5. The Metric Tensor
Definition and Properties
The metric $g_{\mu\nu}$ defines the geometry of spacetime. It is:
- Symmetric: $g_{\mu\nu} = g_{\nu\mu}$
- Non-degenerate: $\det(g) \neq 0$
- Defines inner product: $V \cdot W = g_{\mu\nu}V^\mu W^\nu$
Line Element
Infinitesimal proper time/distance:
$$ds^2 = g_{\mu\nu}\,dx^\mu dx^\nu$$
Minkowski Metric
Flat spacetime in special relativity (in the $(-,+,+,+)$ signature convention):
$$\eta_{\mu\nu} = \mathrm{diag}(-1, +1, +1, +1), \qquad ds^2 = -dt^2 + dx^2 + dy^2 + dz^2 \quad (c = 1)$$
Inverse Metric
$$g^{\mu\lambda}g_{\lambda\nu} = \delta^\mu_\nu$$
where $\delta^\mu_\nu$ is the Kronecker delta (1 if $\mu=\nu$, 0 otherwise).
6. Covariant Derivative
Why We Need It
Ordinary partial derivatives $\partial_\mu V^\nu$ don't transform as tensors in curved space. We need a covariant derivative that does.
Definition for Vectors and Covectors
$$\nabla_\mu V^\nu = \partial_\mu V^\nu + \Gamma^\nu_{\mu\lambda}\,V^\lambda, \qquad \nabla_\mu \omega_\nu = \partial_\mu \omega_\nu - \Gamma^\lambda_{\mu\nu}\,\omega_\lambda$$
where $\Gamma^\lambda_{\mu\nu}$ are Christoffel symbols (connection coefficients).
General Tensors
For a general tensor, add one $+\Gamma$ term for each upper index and one $-\Gamma$ term for each lower index, e.g. for a $(1,1)$ tensor:
$$\nabla_\mu T^{\nu}{}_{\rho} = \partial_\mu T^{\nu}{}_{\rho} + \Gamma^{\nu}_{\mu\lambda}\,T^{\lambda}{}_{\rho} - \Gamma^{\lambda}_{\mu\rho}\,T^{\nu}{}_{\lambda}$$
Christoffel Symbols
In terms of the metric (Levi-Civita connection):
$$\Gamma^\lambda_{\mu\nu} = \frac{1}{2}\,g^{\lambda\sigma}\left(\partial_\mu g_{\nu\sigma} + \partial_\nu g_{\mu\sigma} - \partial_\sigma g_{\mu\nu}\right)$$
Symmetric in lower indices: $\Gamma^\lambda_{\mu\nu} = \Gamma^\lambda_{\nu\mu}$
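As a worked illustration, the SymPy sketch below computes the Christoffel symbols from this formula for an assumed example metric, the unit 2-sphere $ds^2 = d\theta^2 + \sin^2\theta\, d\phi^2$, and then checks metric compatibility $\nabla_\lambda g_{\mu\nu} = 0$ using the covariant-derivative rule above.
```python
import sympy as sp

theta, phi = sp.symbols('theta phi')
x = [theta, phi]

# Example metric (an assumed choice): the unit 2-sphere, ds^2 = d(theta)^2 + sin^2(theta) d(phi)^2
g = sp.Matrix([[1, 0],
               [0, sp.sin(theta)**2]])
ginv = g.inv()
n = len(x)

# Gamma^lam_{mu nu} = (1/2) g^{lam sig} (d_mu g_{nu sig} + d_nu g_{mu sig} - d_sig g_{mu nu})
Gamma = [[[sp.simplify(sum(sp.Rational(1, 2) * ginv[lam, sig] *
                           (sp.diff(g[nu, sig], x[mu]) +
                            sp.diff(g[mu, sig], x[nu]) -
                            sp.diff(g[mu, nu], x[sig]))
                           for sig in range(n)))
           for nu in range(n)] for mu in range(n)] for lam in range(n)]

# Nonzero components: Gamma^theta_{phi phi} = -sin(theta)cos(theta),
#                     Gamma^phi_{theta phi} = Gamma^phi_{phi theta} = cot(theta)
print(Gamma[0][1][1], Gamma[1][0][1])

# Metric compatibility: nabla_l g_{m k} = d_l g_{m k} - Gamma^a_{l m} g_{a k} - Gamma^a_{l k} g_{m a} = 0
for l in range(n):
    for m in range(n):
        for k in range(n):
            cov = (sp.diff(g[m, k], x[l])
                   - sum(Gamma[a][l][m] * g[a, k] for a in range(n))
                   - sum(Gamma[a][l][k] * g[m, a] for a in range(n)))
            assert sp.simplify(cov) == 0
```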
7. Parallel Transport and Geodesics
Parallel Transport
A vector $V^\mu$ is parallel transported along a curve $x^\mu(\lambda)$ if:
$$\frac{dx^\nu}{d\lambda}\,\nabla_\nu V^\mu = 0$$
Expanding:
$$\frac{dV^\mu}{d\lambda} + \Gamma^\mu_{\nu\rho}\,\frac{dx^\nu}{d\lambda}\,V^\rho = 0$$
Geodesic Equation
The straightest possible path: a curve whose tangent vector is parallel transported along itself:
$$\frac{d^2 x^\mu}{d\lambda^2} + \Gamma^\mu_{\nu\rho}\,\frac{dx^\nu}{d\lambda}\,\frac{dx^\rho}{d\lambda} = 0$$
Freely falling particles follow geodesics in spacetime.
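A numerical sketch, assuming the unit 2-sphere metric from the example above and SciPy's `solve_ivp` integrator (illustrative initial data): it integrates the geodesic equation and checks that $g_{\mu\nu}\,\dot{x}^\mu\dot{x}^\nu$ stays constant along the affinely parametrized curve.
```python
import numpy as np
from scipy.integrate import solve_ivp

# Geodesics on the unit 2-sphere, ds^2 = d(theta)^2 + sin^2(theta) d(phi)^2 (assumed example metric).
# Nonzero Christoffel symbols:
#   Gamma^theta_{phi phi} = -sin(theta) cos(theta)
#   Gamma^phi_{theta phi} = Gamma^phi_{phi theta} = cos(theta) / sin(theta)

def geodesic_rhs(lam, y):
    theta, phi, dtheta, dphi = y
    ddtheta = np.sin(theta) * np.cos(theta) * dphi**2
    ddphi = -2.0 * (np.cos(theta) / np.sin(theta)) * dtheta * dphi
    return [dtheta, dphi, ddtheta, ddphi]

# Start on the equator, moving at 45 degrees to it (arbitrary illustrative initial data)
y0 = [np.pi / 2, 0.0, 0.5, 0.5]
sol = solve_ivp(geodesic_rhs, [0.0, 5.0], y0, rtol=1e-9, atol=1e-12)

# Along an affinely parametrized geodesic, g_{mu nu} (dx^mu/dlam)(dx^nu/dlam) is constant
theta, phi, dtheta, dphi = sol.y
speed2 = dtheta**2 + np.sin(theta)**2 * dphi**2
assert np.allclose(speed2, speed2[0], rtol=1e-6)
```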
8. Important Tensors in Physics
Stress-Energy Tensor $T_{\mu\nu}$
Describes energy density, momentum density, and stress. For a perfect fluid (in the $(-,+,+,+)$ signature):
$$T_{\mu\nu} = (\rho + p)\,u_\mu u_\nu + p\,g_{\mu\nu}$$
where $\rho$ is the energy density, $p$ the pressure, and $u^\mu$ the fluid four-velocity.
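A quick NumPy check of this formula, assuming flat Minkowski space with signature $(-,+,+,+)$ and arbitrary illustrative values of $\rho$ and $p$: in the fluid's rest frame the components reduce to $\mathrm{diag}(\rho, p, p, p)$.
```python
import numpy as np

# Flat Minkowski metric, signature (-, +, +, +); rho and p are arbitrary illustrative values
eta = np.diag([-1.0, 1.0, 1.0, 1.0])
rho, p = 2.0, 0.5
u_up = np.array([1.0, 0.0, 0.0, 0.0])   # rest-frame four-velocity, u^mu u_mu = -1
u_down = eta @ u_up                      # u_mu = eta_{mu nu} u^nu

# T_{mu nu} = (rho + p) u_mu u_nu + p g_{mu nu}
T = (rho + p) * np.outer(u_down, u_down) + p * eta

# In the rest frame the components reduce to diag(rho, p, p, p)
assert np.allclose(T, np.diag([rho, p, p, p]))
```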
Electromagnetic Field Tensor $F_{\mu\nu}$
Antisymmetric tensor built from the four-potential $A_\mu$; its components encode the electric and magnetic fields:
$$F_{\mu\nu} = \partial_\mu A_\nu - \partial_\nu A_\mu$$
Riemann Curvature Tensor $R^\rho_{\sigma\mu\nu}$
Measures spacetime curvature:
$$R^\rho{}_{\sigma\mu\nu} = \partial_\mu \Gamma^\rho_{\nu\sigma} - \partial_\nu \Gamma^\rho_{\mu\sigma} + \Gamma^\rho_{\mu\lambda}\Gamma^\lambda_{\nu\sigma} - \Gamma^\rho_{\nu\lambda}\Gamma^\lambda_{\mu\sigma}$$
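A SymPy sketch (same assumed unit 2-sphere metric as above) that builds the Riemann tensor from the Christoffel symbols via this formula; the well-known result $R^\theta{}_{\phi\theta\phi} = \sin^2\theta$ serves as a check.
```python
import sympy as sp

theta, phi = sp.symbols('theta phi')
x = [theta, phi]
g = sp.Matrix([[1, 0], [0, sp.sin(theta)**2]])   # unit 2-sphere metric (assumed example)
ginv = g.inv()
n = len(x)

# Christoffel symbols Gamma^a_{b c} of the Levi-Civita connection
Gamma = [[[sum(sp.Rational(1, 2) * ginv[a, d] *
               (sp.diff(g[c, d], x[b]) + sp.diff(g[b, d], x[c]) - sp.diff(g[b, c], x[d]))
               for d in range(n))
           for c in range(n)] for b in range(n)] for a in range(n)]

# R^a_{b c d} = d_c Gamma^a_{d b} - d_d Gamma^a_{c b}
#             + Gamma^a_{c e} Gamma^e_{d b} - Gamma^a_{d e} Gamma^e_{c b}
def riemann(a, b, c, d):
    expr = sp.diff(Gamma[a][d][b], x[c]) - sp.diff(Gamma[a][c][b], x[d])
    expr += sum(Gamma[a][c][e] * Gamma[e][d][b] - Gamma[a][d][e] * Gamma[e][c][b]
                for e in range(n))
    return sp.simplify(expr)

# For the unit sphere, R^theta_{phi theta phi} = sin^2(theta)
assert sp.simplify(riemann(0, 1, 0, 1) - sp.sin(theta)**2) == 0
```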
9. Practical Tips
1. Check Index Balance
Every term in an equation must have the same free indices in the same positions.
2. Avoid Index Collisions
Never let the same index appear more than twice in a single term; give each independent summation its own dummy letter.
3. Remember Symmetries
Use symmetry properties to reduce calculations: $g_{\mu\nu} = g_{\nu\mu}$, $F_{\mu\nu} = -F_{\nu\mu}$
4. Matrix vs. Tensor
Matrices are representations of (1,1) tensors in a specific basis. Tensor equations hold in all bases.
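To illustrate tip 4, a short NumPy sketch (with arbitrary illustrative components): under a change of basis, the components of a $(1,1)$ tensor transform by a similarity transformation, while the contraction $T^\mu{}_\mu$ is basis-independent.
```python
import numpy as np

rng = np.random.default_rng(2)
T = rng.standard_normal((4, 4))    # components T^mu_nu in some basis (arbitrary values)
L = rng.standard_normal((4, 4))    # change-of-basis matrix (assumed invertible)
L_inv = np.linalg.inv(L)

# Components of a (1,1) tensor transform as T'^mu_nu = L^mu_alpha T^alpha_beta (L^{-1})^beta_nu
T_new = L @ T @ L_inv

# The contraction T^mu_mu (the trace) is the same in every basis
assert np.isclose(np.trace(T), np.trace(T_new))
```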