Fisher Information Matrix

From GM-RKB

A Fisher Information Matrix is a square matrix whose (i, j) entry is the Fisher information with respect to the parameter pair (θ_i, θ_j) of a vector-valued parameter θ.



References

2015

  • (Wikipedia, 2015) ⇒ http://en.wikipedia.org/wiki/Fisher_information#Matrix_form Retrieved:2015-6-25.
  • When there are N parameters, so that θ is a vector [math]\displaystyle{ \theta = \begin{bmatrix} \theta_{1}, \theta_{2}, \dots , \theta_{N} \end{bmatrix}^{\mathrm T}, }[/math] then the Fisher information takes the form of an N × N matrix, the Fisher Information Matrix (FIM), with typical element: [math]\displaystyle{ {\left(\mathcal{I} \left(\theta \right) \right)}_{i, j} = \operatorname{E} \left[\left. \left(\frac{\partial}{\partial\theta_i} \log f(X;\theta)\right) \left(\frac{\partial}{\partial\theta_j} \log f(X;\theta)\right) \right|\theta\right]. }[/math] The FIM is a positive semidefinite symmetric matrix, defining a Riemannian metric on the N-dimensional parameter space, thus connecting Fisher information to differential geometry. In that context, this metric is known as the Fisher information metric, and the topic is called information geometry.

      Under certain regularity conditions, the Fisher Information Matrix may also be written as : [math]\displaystyle{ {\left(\mathcal{I} \left(\theta \right) \right)}_{i, j} = - \operatorname{E} \left[\left. \frac{\partial^2}{\partial\theta_i \, \partial\theta_j} \log f(X;\theta) \right|\theta\right]\,. }[/math] The metric is interesting in several ways; it can be derived as the Hessian of the relative entropy; it can be understood as a metric induced from the Euclidean metric, after appropriate change of variable; in its complex-valued form, it is the Fubini–Study metric.
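As a concrete illustration (a sketch, not part of the quoted article), the two expressions above can be checked numerically. For a Gaussian with parameters θ = (μ, σ), the per-observation FIM is known in closed form to be diag(1/σ², 2/σ²); the Monte Carlo estimates from the outer-product form and the negative-Hessian form should both converge to it:

```python
import numpy as np

# Illustrative example: X ~ N(mu, sigma^2) with theta = (mu, sigma).
# The analytic per-observation FIM is diag(1/sigma^2, 2/sigma^2).
rng = np.random.default_rng(0)
mu, sigma = 1.0, 2.0
n = 200_000
x = rng.normal(mu, sigma, size=n)

# Per-sample score vector: partial derivatives of log f(x; mu, sigma).
d_mu = (x - mu) / sigma**2
d_sigma = -1.0 / sigma + (x - mu) ** 2 / sigma**3
score = np.stack([d_mu, d_sigma], axis=1)

# Definition form: I(theta)_{ij} = E[(d_i log f)(d_j log f)].
fim_outer = score.T @ score / n

# Negative-Hessian form: I(theta)_{ij} = -E[d_i d_j log f].
h_mm = np.full(n, -1.0 / sigma**2)
h_ms = -2.0 * (x - mu) / sigma**3
h_ss = 1.0 / sigma**2 - 3.0 * (x - mu) ** 2 / sigma**4
fim_hess = -np.mean(
    np.stack([np.stack([h_mm, h_ms]), np.stack([h_ms, h_ss])]), axis=2
)

# Both estimates approximate diag(1/sigma^2, 2/sigma^2) = diag(0.25, 0.5),
# a symmetric positive semidefinite matrix, as the text states.
```

With 200,000 samples both estimates agree with the analytic FIM to about two decimal places, illustrating that the outer-product and negative-Hessian forms coincide under the stated regularity conditions.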