Degrees of Freedom (Statistics)


A Degrees of Freedom is the number of independent values in a sampling distribution minus the number of constraints imposed on it.



  • (Wikipedia, 2016) ⇒
    • In statistics, the number of degrees of freedom is the number of values in the final calculation of a statistic that are free to vary.[1]

      The number of independent ways by which a dynamic system can move, without violating any constraint imposed on it, is called number of degrees of freedom. In other words, the number of degrees of freedom can be defined as the minimum number of independent coordinates that can specify the position of the system completely.

      Estimates of statistical parameters can be based upon different amounts of information or data. The number of independent pieces of information that go into the estimate of a parameter is called the degrees of freedom. In general, the degrees of freedom of an estimate of a parameter equal the number of independent scores that go into the estimate minus the number of parameters used as intermediate steps in the estimation of the parameter itself (e.g. the sample variance has N − 1 degrees of freedom, since it is computed from N random scores minus the one parameter estimated as an intermediate step, which is the sample mean).[2]

      Mathematically, degrees of freedom is the number of dimensions of the domain of a random vector, or essentially the number of "free" components (how many components need to be known before the vector is fully determined).

      The term is most often used in the context of linear models (linear regression, analysis of variance), where certain random vectors are constrained to lie in linear subspaces, and the number of degrees of freedom is the dimension of the subspace. The degrees of freedom are also commonly associated with the squared lengths (or "sum of squares" of the coordinates) of such vectors, and the parameters of chi-squared and other distributions that arise in associated statistical testing problems.

      While introductory textbooks may introduce degrees of freedom as distribution parameters or through hypothesis testing, it is the underlying geometry that defines degrees of freedom, and is critical to a proper understanding of the concept. Walker (1940)[3] has stated this succinctly as "the number of observations minus the number of necessary relations among these observations."
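The N − 1 rule for the sample variance can be made concrete: the N deviations from the sample mean are constrained to sum to zero, so only N − 1 of them are free. A minimal sketch in Python (the data here are arbitrary illustrative values):

```python
import numpy as np

# The N deviations from the sample mean satisfy one linear constraint
# (they sum to zero), so they lie in an (N - 1)-dimensional subspace.
rng = np.random.default_rng(0)
x = rng.normal(size=10)            # N = 10 random scores
deviations = x - x.mean()
print(np.isclose(deviations.sum(), 0.0))   # the single constraint

# NumPy exposes this via ddof ("delta degrees of freedom"):
# ddof=1 divides the sum of squares by N - 1 rather than N.
print(np.var(x, ddof=1))
```

The `ddof` parameter name ("delta degrees of freedom") is NumPy's own nod to this counting argument.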


For example, the exact shape of a t distribution is determined by its degrees of freedom. When the t distribution is used to compute a confidence interval for a mean score, one population parameter (the mean) is estimated from sample data. Therefore, the number of degrees of freedom is equal to the sample size minus one.
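A short sketch of that confidence-interval calculation, assuming SciPy is available (the sample values are hypothetical):

```python
import numpy as np
from scipy import stats

# Hypothetical sample of n scores; the mean is estimated from the data,
# so the t distribution used for the interval has n - 1 degrees of freedom.
x = np.array([4.2, 5.1, 3.8, 4.9, 5.5, 4.4, 5.0, 4.7])
n = len(x)
df = n - 1                                   # sample size minus one
t_crit = stats.t.ppf(0.975, df)              # two-sided 95% critical value
half_width = t_crit * x.std(ddof=1) / np.sqrt(n)
ci = (x.mean() - half_width, x.mean() + half_width)
print(df, ci)
```

With n = 8 observations, the interval uses the t distribution with 7 degrees of freedom; a larger sample would use a t distribution closer in shape to the standard normal.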


  • (Britannica, 2017) ⇒ Retrieved on 2017-03-07.
    • Degree of freedom, in mathematics, any of the number of independent quantities necessary to express the values of all the variable properties of a system. A system composed of a point moving without constraints in space, for example, has three degrees of freedom because three coordinates are needed to determine the position of the point.

      The number of degrees of freedom is reduced by constraints such as the requirement that a point move along a particular path. Thus, a simple pendulum has only one degree of freedom because its angle of inclination is specified by a single number. In a chemical system, the condition of equilibrium imposes constraints: properties such as temperature and composition of coexisting phases cannot all vary independently (see phase rule).

      If, in a statistical sample distribution, there are n variables and m constraints on the distribution, there are n − m degrees of freedom.
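The n − m rule above can be illustrated with a least-squares fit, a sketch assuming synthetic data: fitting a line estimates m = 2 parameters (intercept and slope), which imposes two linear constraints on the n residuals, leaving n − 2 degrees of freedom.

```python
import numpy as np

# Sketch of the n - m rule: a straight-line fit imposes m = 2 constraints,
# so the n residuals have n - 2 degrees of freedom.
rng = np.random.default_rng(1)
n = 12
x = np.arange(n, dtype=float)
y = 2.0 + 0.5 * x + rng.normal(scale=0.3, size=n)   # synthetic data

X = np.column_stack([np.ones(n), x])     # design matrix with m = 2 columns
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ beta
df = n - X.shape[1]                      # n - m = 10

# The residuals are orthogonal to each column of X: two linear constraints.
print(df, np.allclose(X.T @ residuals, 0.0))
```

This is the same geometric picture described in the Wikipedia quote above: the residual vector is constrained to an (n − m)-dimensional subspace.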

  1. "Degrees of Freedom". Glossary of Statistical Terms. Animated Software. Retrieved 2008-08-21.
  2. Lane, David M. "Degrees of Freedom". HyperStat Online. Statistics Solutions. Retrieved 2008-08-21.
  3. Walker, H. M. (April 1940). "Degrees of Freedom". Journal of Educational Psychology 31 (4): 253–269. doi:10.1037/h0054588.