# Chi-Squared Probability Function (χ2)

A Chi-Squared Probability Function (χ2) is a Gamma probability function from a Chi-squared distribution family (based on a sum of squares of [math]\displaystyle{ k }[/math] independent standard normal random variables).

**AKA:** [math]\displaystyle{ \chi^2 }[/math].

**Context:**
- It has a mode of [math]\displaystyle{ 0 }[/math] when [math]\displaystyle{ v \leq 2 }[/math], and of [math]\displaystyle{ (v - 2) }[/math] otherwise.
- It has a mean of [math]\displaystyle{ v }[/math].
- It has a variance of [math]\displaystyle{ 2v }[/math].
- Its shape depends on the degrees of freedom, with more degrees of freedom resulting in reduced skew.
- It is a Continuous Probability Function, with support ranging from [math]\displaystyle{ 0 }[/math] to [math]\displaystyle{ \infty }[/math].
- It can correspond to an Exponential Probability Function (when [math]\displaystyle{ v=2 }[/math]).
- It can be referenced by a Chi-Squared Statistic.
- It can be used directly or indirectly in many tests of significance that approximate its distribution (e.g. tests of deviations between theoretically expected and observed frequencies in one-way tables, and tests of the relationship between categorical variables in contingency tables).
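The mean, variance, and mode facts above can be sanity-checked numerically. The sketch below is a minimal, stdlib-only illustration (the function names and the integration grid are our own choices, not from any particular library): it evaluates the chi-squared density and estimates the first two moments by a crude Riemann sum for [math]\displaystyle{ v = 5 }[/math].

```python
import math

def chi2_pdf(x, v):
    """Chi-squared density with v degrees of freedom, for x > 0."""
    return (x ** (v / 2 - 1) * math.exp(-x / 2)) / (2 ** (v / 2) * math.gamma(v / 2))

def moment(v, power, upper=200.0, n=200000):
    """Crude Riemann-sum estimate of E[X^power] under the chi-squared(v) density.

    The upper cutoff and grid size are arbitrary but generous; the tail
    beyond x = 200 is negligible for small v.
    """
    h = upper / n
    total = 0.0
    for i in range(1, n):  # skip x = 0, where the density diverges when v < 2
        x = i * h
        total += (x ** power) * chi2_pdf(x, v)
    return total * h

v = 5
mean = moment(v, 1)
var = moment(v, 2) - mean ** 2
print(mean, var)  # close to v = 5 and 2v = 10
```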

**Example(s):**- [math]\displaystyle{ \chi^2(k=1,x=0.5) = \frac{0.5^{(1/2 - 1)} \; e^{-0.5/2}}{2^{1/2} \, \Gamma(1/2)} = 0.43939129... }[/math]
- [math]\displaystyle{ \chi^2(k=1,x=1) = \frac{1.0^{(1/2 - 1)} \; e^{-1.0/2}}{2^{1/2} \, \Gamma(1/2)} = 0.24197072... }[/math]
- [math]\displaystyle{ \chi^2(k=1,x=2) = \frac{2.0^{(1/2 - 1)} \; e^{-2.0/2}}{2^{1/2} \, \Gamma(1/2)} = 0.10377687... }[/math]
- …
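These density values correspond to one degree of freedom, where the normalizing denominator is [math]\displaystyle{ 2^{1/2}\,\Gamma(1/2) = \sqrt{2\pi} }[/math]. They can be reproduced with a few lines of plain Python (the helper name `chi2_pdf` is ours):

```python
import math

def chi2_pdf(x, v):
    """Chi-squared density: x^(v/2 - 1) * exp(-x/2) / (2^(v/2) * Gamma(v/2))."""
    return (x ** (v / 2 - 1) * math.exp(-x / 2)) / (2 ** (v / 2) * math.gamma(v / 2))

for x in (0.5, 1.0, 2.0):
    print(f"f({x}; v=1) = {chi2_pdf(x, 1):.8f}")
# f(0.5; v=1) = 0.43939129
# f(1.0; v=1) = 0.24197072
# f(2.0; v=1) = 0.10377687
```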

**Counter-Example(s):**

**See:** Chi-Squared Test.

## References

### 2011

- (Wikipedia, 2011) ⇒ http://en.wikipedia.org/wiki/Chi-square_distribution
- QUOTE: In probability theory and statistics, the **chi-square distribution** (also **chi-squared** or ***χ*²-distribution**) with *k* degrees of freedom is the distribution of a sum of the squares of *k* independent standard normal random variables. It is one of the most widely used probability distributions in inferential statistics, e.g., in hypothesis testing or in construction of confidence intervals. When there is a need to contrast it with the noncentral chi-square distribution, this distribution is sometimes called the **central chi-square distribution**. The chi-square distribution is used in the common chi-square tests for goodness of fit of an observed distribution to a theoretical one, the independence of two criteria of classification of qualitative data, and in confidence interval estimation for a population standard deviation of a normal distribution from a sample standard deviation. Many other statistical tests also use this distribution, like Friedman's analysis of variance by ranks. The chi-square distribution is a special case of the gamma distribution.


### 2008

- (Upton & Cook, 2008) ⇒ Graham Upton, and Ian Cook. (2008). “A Dictionary of Statistics, 2nd edition revised.” Oxford University Press. ISBN:0199541450
- QUOTE: Chi-Squared [math]\displaystyle{ (\chi^2) }[/math] Distribution: If [math]\displaystyle{ Z_1, Z_2, \ldots, Z_v }[/math] are [math]\displaystyle{ v }[/math] independent standard normal variables, and if [math]\displaystyle{ Y }[/math] is defined by [math]\displaystyle{ Y = \sum_{j=1}^{v}Z_j^2 }[/math] then [math]\displaystyle{ Y }[/math] has a chi-squared distribution with [math]\displaystyle{ v }[/math] degrees of freedom (written as [math]\displaystyle{ \chi_v^2 }[/math]). The probability density function [math]\displaystyle{ f }[/math] is given by [math]\displaystyle{ f(y) = { e^{-\frac{1}{2}y} \; y^{\frac{1}{2}v - 1} \over 2^{\frac{1}{2}v} \, \Gamma(\frac{1}{2}v) } }[/math] where [math]\displaystyle{ \Gamma }[/math] is the gamma function. The form of the distribution was first given by Abbe in 1863 and was independently derived by Helmert in 1875 and Karl Pearson in 1900. It was Pearson who gave the distribution its current name. The chi-squared distribution has mean [math]\displaystyle{ v }[/math] and variance [math]\displaystyle{ 2v }[/math]. For [math]\displaystyle{ v \leq 2 }[/math] the mode is at 0; otherwise it is at [math]\displaystyle{ (v - 2) }[/math]. A chi-squared distribution is a special case of a gamma distribution. The case [math]\displaystyle{ (v = 2) }[/math] corresponds to the exponential distribution. Percentage points for chi-squared distributions are given in Appendix X. … All chi-squared distributions have ranges from 0 to [math]\displaystyle{ \infty }[/math]. Their shape is determined by the value of [math]\displaystyle{ v }[/math]. If [math]\displaystyle{ (v \gt 2) }[/math] then the distribution has a mode at [math]\displaystyle{ (v - 2) }[/math]; otherwise the mode is at 0.
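The special case noted in the quote, that [math]\displaystyle{ v = 2 }[/math] corresponds to the exponential distribution, follows because the density then reduces to [math]\displaystyle{ \tfrac{1}{2}e^{-y/2} }[/math], i.e. an exponential density with rate [math]\displaystyle{ \tfrac{1}{2} }[/math]. A minimal sketch, assuming only the Python standard library (helper names are ours):

```python
import math

def chi2_pdf(y, v):
    """Density from Upton & Cook's formula: e^{-y/2} y^{v/2-1} / (2^{v/2} Gamma(v/2))."""
    return math.exp(-y / 2) * y ** (v / 2 - 1) / (2 ** (v / 2) * math.gamma(v / 2))

def exp_pdf(y, rate):
    """Exponential density: rate * e^{-rate * y}."""
    return rate * math.exp(-rate * y)

# For v = 2, the chi-squared density coincides with Exponential(rate = 1/2).
for y in (0.1, 1.0, 5.0):
    assert math.isclose(chi2_pdf(y, 2), exp_pdf(y, 0.5))
print("chi-squared(v=2) matches Exponential(rate=1/2)")
```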

### 2001

- (Leung & Malik, 2001) ⇒ Thomas Leung and Jitendra Malik. (2001). “Representing and Recognizing the Visual Appearance of Materials using Three-dimensional Textons.” In: International Journal of Computer Vision, 43(1). doi:10.1023/A:1011126920638
- QUOTE: … The significance for a certain chi-square distance is given by the chi-square probability function: P(χ² | ν). … Each element in the matrix [math]\displaystyle{ e_{ij} }[/math] is given by the chi-square probability function (Figure 9).

### 1999

- (Lane, 1999) ⇒ D. Lane. (1999). “HyperStat Online Textbook.” Chapter 16: Chi Square.