Minimize Chi-Square Estimation Algorithm

From GM-RKB

A Minimize Chi-Square Estimation Algorithm is a point estimation method that finds the parameter values that make the chi-square test statistic as small as possible.
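As a minimal illustration (not taken from the cited sources), the sketch below fits a Poisson rate parameter to hypothetical binned count data by minimizing the Pearson chi-square statistic over the cells; the observed counts, cell layout, and search bounds are all assumptions made for the example.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import poisson

# Hypothetical observed frequencies of counts 0, 1, 2, 3, and 4-or-more (illustrative data).
observed = np.array([35.0, 38.0, 17.0, 7.0, 3.0])
n = observed.sum()
ks = np.arange(len(observed))

def pearson_chi_square(lam):
    """Chi-square statistic sum((O_i - E_i)^2 / E_i) under a Poisson(lam) model."""
    probs = poisson.pmf(ks, lam)
    probs[-1] = poisson.sf(ks[-1] - 1, lam)  # lump the upper tail into the last cell
    expected = n * probs
    return np.sum((observed - expected) ** 2 / expected)

# Minimum chi-square estimate: the rate that makes the statistic as small as possible.
result = minimize_scalar(pearson_chi_square, bounds=(0.1, 10.0), method="bounded")
lam_hat = result.x
print(lam_hat, pearson_chi_square(lam_hat))
```

A bounded scalar minimizer is used here only because a single parameter is being estimated; with several parameters, a general-purpose optimizer or a grid search over the parameter space would play the same role.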



References

2018

  • (Wikipedia, 2018) ⇒ https://en.wikipedia.org/wiki/Minimum_chi-square_estimation Retrieved:2018-3-7.
    • In statistics, minimum chi-square estimation is a method of estimation of unobserved quantities based on observed data.

      In certain chi-square tests, one rejects a null hypothesis about a population distribution if a specified test statistic is too large, when that statistic would have approximately a chi-square distribution if the null hypothesis is true. In minimum chi-square estimation, one finds the values of parameters that make that test statistic as small as possible.

      Among the consequences of its use is that the test statistic actually does have approximately a chi-square distribution when the sample size is large. Generally, one reduces by 1 the number of degrees of freedom for each parameter estimated by this method.
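To make that degrees-of-freedom adjustment concrete, the short sketch below refers a minimized statistic to a chi-square distribution with (number of cells − 1 − number of estimated parameters) degrees of freedom; the cell count, parameter count, and minimized value are hypothetical.

```python
from scipy.stats import chi2

# Hypothetical goodness-of-fit check after minimum chi-square estimation:
# 5 cells and 1 parameter estimated, so df = 5 - 1 - 1 = 3.
num_cells = 5
num_params_estimated = 1
dof = num_cells - 1 - num_params_estimated

chi2_min = 2.8                      # illustrative minimized chi-square statistic
p_value = chi2.sf(chi2_min, dof)    # survival function gives the goodness-of-fit p-value
print(dof, p_value)
```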

2017

  • Veselina Kalinova. (2017). “Lecture notes on Regression: Markov Chain Monte Carlo (MCMC).”
    • QUOTE: Here the optimum model is the one that gives a satisfactory fit for the available degrees of freedom, and corresponds to the minimization of the [math]\displaystyle{ \chi^2 }[/math] function (see Fig. 2, left). It is often used in astronomy when we do not have realistic estimates of the data uncertainties.
      [math]\displaystyle{ \chi^2 = \sum^N_{i=1} \frac{(O_i - E_i)^2}{E_i} }[/math], (8)
      where [math]\displaystyle{ O_i }[/math] is the observed value and [math]\displaystyle{ E_i }[/math] is the expected value. If we have the [math]\displaystyle{ \chi^2 }[/math] for two parameters, the best fit of the model can be represented as a contour plot (see Fig. 2, right).
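As a rough sketch of the contour-plot idea in the quote, the code below evaluates the chi-square statistic of a simple two-parameter straight-line model over a grid of (intercept, slope) values and draws the contours; the data, model, and grid ranges are assumptions made for illustration, not part of the cited lecture notes.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical data roughly following y = 1 + 2*x (illustrative values only).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
o = np.array([3.1, 4.9, 7.2, 8.8, 11.1])

# Grid over the two parameters: intercept a and slope b.
a_grid = np.linspace(0.0, 2.0, 200)
b_grid = np.linspace(1.0, 3.0, 200)
A, B = np.meshgrid(a_grid, b_grid)

# Chi-square surface: sum over data points of (O_i - E_i)^2 / E_i, with E_i = a + b*x_i.
chi2_surface = np.zeros_like(A)
for xi, oi in zip(x, o):
    ei = A + B * xi
    chi2_surface += (oi - ei) ** 2 / ei

# The best-fit (a, b) sits at the minimum of the surface; contours visualize it.
plt.contour(A, B, chi2_surface, levels=30)
plt.xlabel("a (intercept)")
plt.ylabel("b (slope)")
plt.title(r"$\chi^2$ surface for two parameters")
plt.show()
```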