Fisher Information
The Fisher Information I(θ) measures the amount of information that observable data carries about an unknown parameter θ.
Mathematical Definition:
$I(\theta) = \mathbb{E}\left[\left(\frac{\partial}{\partial\theta} \log L(\theta)\right)^2\right] = -\mathbb{E}\left[\frac{\partial^2}{\partial\theta^2} \log L(\theta)\right]$ where $L(\theta)$ is the likelihood function. The second equality (expressing the information as the negative expected curvature of the log-likelihood) holds under standard regularity conditions, e.g. when differentiation and integration can be interchanged.
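As a concrete check of the definition, the sketch below estimates $I(p)$ for a Bernoulli($p$) observation by Monte Carlo, averaging the squared score, and compares it to the closed form $I(p) = \frac{1}{p(1-p)}$. The parameter value and sample size are illustrative choices, not from the text.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.3          # true Bernoulli parameter (illustrative choice)
n = 200_000      # Monte Carlo sample size

# For a single observation X ~ Bernoulli(p):
#   log L(p) = x*log(p) + (1-x)*log(1-p)
# so the score (derivative of log L w.r.t. p) is x/p - (1-x)/(1-p).
x = rng.binomial(1, p, size=n)
score = x / p - (1 - x) / (1 - p)

mc_fisher = np.mean(score**2)      # E[(d/dp log L)^2], estimated by simulation
exact_fisher = 1 / (p * (1 - p))   # closed-form Fisher information for Bernoulli

print(mc_fisher, exact_fisher)
```

The score also has mean zero at the true parameter, which is another of the regularity consequences the definition relies on.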
Use Cases
Variance of estimators
Fisher Information bounds the variance of any unbiased estimator. The Cramér-Rao bound states that Var(θ̂) ≥ 1/I(θ), where I(θ) is the Fisher Information. This provides a theoretical lower bound on estimator variance: no unbiased estimator can have variance smaller than the inverse of the Fisher Information. (The maximum likelihood estimator attains this bound asymptotically under regularity conditions.) The bound only applies to unbiased estimators because biased estimators can achieve lower variance by accepting some bias (the bias-variance tradeoff).
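The bound can be illustrated by simulation. For Bernoulli($p$) data the MLE $\hat{p} = \bar{x}$ is unbiased with Var$(\hat{p}) = \frac{p(1-p)}{m} = \frac{1}{m\,I(p)}$, so it attains the Cramér-Rao bound exactly. The sketch below (with illustrative parameter choices) compares the observed variance of $\hat{p}$ across many simulated samples to the bound.

```python
import numpy as np

rng = np.random.default_rng(1)
p, m, reps = 0.3, 50, 100_000   # true p, sample size, number of repetitions

# The MLE for Bernoulli p is the sample mean, which is unbiased here.
samples = rng.binomial(1, p, size=(reps, m))
p_hat = samples.mean(axis=1)

var_hat = p_hat.var()       # observed variance of the estimator across repetitions
crlb = p * (1 - p) / m      # Cramér-Rao bound: 1 / (m * I(p)), with I(p) = 1/(p(1-p))

print(var_hat, crlb)
```

For a sample of size $m$ i.i.d. observations, the information adds up to $m\,I(p)$, which is why the bound scales as $1/m$.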