AIC Statistic

An AIC Statistic (Akaike Information Criterion) is a statistical model selection information criterion based on Kullback-Leibler divergence.



References

2004

  • (Burnham & Anderson, 2004) ⇒ Kenneth P. Burnham and David R. Anderson (2004). “Multimodel Inference: Understanding AIC and BIC in Model Selection.” In: Sociological Methods & Research, 33(2), 261-304. [doi:10.1177/0049124104268644]
    • (...) K is the asymptotic bias correction term and is in no way arbitrary (as is sometimes erroneously stated in the literature). Akaike (1973,1974) multiplied this simple but profound result by –2 (for “historical reasons”), and this became Akaike’s information criterion:

      [math]\displaystyle{ \text{AIC} = -2\log(L(\theta \mid \text{data})) + 2K. }[/math]

      In the special case of least squares (LS) estimation with normally distributed errors, AIC can be expressed as

      [math]\displaystyle{ \text{AIC} = n\log(\sigma^2) + 2K, }[/math]
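
The two expressions quoted above can be checked against each other numerically. The following is a minimal sketch (not part of the cited paper or of this entry), assuming NumPy, simulated data, and the convention that K counts every estimated parameter including the error variance; it fits a least-squares line and computes AIC both from the maximized Gaussian log-likelihood and from the least-squares form.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative simulated data (an assumption of this sketch): y = 2 + 3x + Gaussian noise.
n = 100
x = rng.uniform(0.0, 1.0, n)
y = 2.0 + 3.0 * x + rng.normal(0.0, 0.5, n)

# Ordinary least-squares fit of a straight line.
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ beta
sigma2_hat = np.sum(residuals ** 2) / n   # MLE of the error variance
K = X.shape[1] + 1                        # intercept, slope, and sigma^2

# General form: AIC = -2 log(L(theta | data)) + 2K,
# using the maximized Gaussian log-likelihood.
log_lik = -0.5 * n * (np.log(2.0 * np.pi * sigma2_hat) + 1.0)
aic_general = -2.0 * log_lik + 2 * K

# Least-squares form: AIC = n log(sigma^2) + 2K.
# It omits the additive constant n * (log(2*pi) + 1), which is identical for
# every model fit to the same data, so model rankings are unchanged.
aic_ls = n * np.log(sigma2_hat) + 2 * K

print(aic_general, aic_ls, aic_general - aic_ls)  # difference equals n * (log(2*pi) + 1)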

1974

  • (Akaike, 1974) ⇒ Hirotugu Akaike (1974). “A New Look at the Statistical Model Identification.” In: IEEE Transactions on Automatic Control, 19(6), 716-723. [doi:10.1109/TAC.1974.1100705]