Wald Test


A Wald Test is a parametric statistical test that uses a sample-based parameter estimate and an estimate of its variability to test hypotheses about the parameter's true value.



References

2015

  • (Wikipedia, 2015) ⇒ http://en.wikipedia.org/wiki/Wald_test Retrieved:2015-11-22.
    • The Wald test is a parametric statistical test named after the Hungarian statistician Abraham Wald. Whenever a relationship within or between data items can be expressed as a statistical model with parameters to be estimated from a sample, the Wald test can be used to test the true value of the parameter based on the sample estimate.

      Suppose an economist, who has data on social class and shoe size, wonders whether social class is associated with shoe size. Say [math]\displaystyle{ \theta }[/math] is the average increase in shoe size for upper-class people compared to middle-class people: then the Wald test can be used to test whether [math]\displaystyle{ \theta }[/math] is 0 (in which case social class has no association with shoe size) or non-zero (shoe size varies between social classes). Here, [math]\displaystyle{ \theta }[/math], the hypothetical difference in shoe sizes between upper and middle-class people in the whole population, is a parameter. An estimate of [math]\displaystyle{ \theta }[/math] might be the difference in shoe size between upper and middle-class people in the sample. In the Wald test, the economist uses the estimate and an estimate of variability (see below) to draw conclusions about the unobserved true [math]\displaystyle{ \theta }[/math] . Or, for a medical example, suppose smoking multiplies the risk of lung cancer by some number R: then the Wald test can be used to test whether R = 1 (i.e. there is no effect of smoking) or is greater (or less) than 1 (i.e. smoking alters risk).

      A Wald test can be used in a great variety of different models including models for dichotomous variables and models for continuous variables.
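A minimal Python sketch of the scalar case in the shoe-size example above: the values of theta_hat and se are hypothetical stand-ins for the estimated class difference and its estimated standard error, and SciPy is assumed for the chi-squared tail probability.

    from scipy import stats

    theta_hat = 0.8   # hypothetical estimate of the class difference in shoe size
    se = 0.3          # hypothetical estimated standard error of theta_hat
    theta_0 = 0.0     # null-hypothesis value: no association between class and shoe size

    # Wald statistic: squared standardized distance of the estimate from theta_0
    W = ((theta_hat - theta_0) / se) ** 2

    # Under the null hypothesis, W is approximately chi-squared with 1 degree of freedom
    p_value = stats.chi2.sf(W, df=1)
    print(f"W = {W:.3f}, p-value = {p_value:.4f}")

Equivalently, the signed square root (theta_hat - theta_0) / se can be compared with a standard normal distribution.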

2008

  • (Upton & Cook, 2008).
    • QUOTE: ... Wald statistic: Any *statistic of the form [math]\displaystyle{ W = \{g(T)\}' D^{-1} g(T) }[/math], where T is an *estimator of a *vector parameter [math]\displaystyle{ \theta }[/math], g is some vector-valued function, and D is an estimator of the variance-covariance matrix of the vector [math]\displaystyle{ g(T) - g(\theta) }[/math]. The statistics are used to test the null hypothesis that [math]\displaystyle{ g(\theta) = \mathbf{0} }[/math], where [math]\displaystyle{ \mathbf{0} }[/math] is a vector with all entries equal to 0. If T is the maximum likelihood estimator then W has an approximate *chi-squared distribution with p *degrees of freedom (where p is the number of elements in [math]\displaystyle{ \theta }[/math]). ...
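
A minimal Python sketch of the quadratic form in this definition, assuming hypothetical values for g(T) and for the estimated variance-covariance matrix D (NumPy and SciPy are assumed):

    import numpy as np
    from scipy import stats

    g_T = np.array([0.9, -0.4])      # g(T): hypothetical estimated vector
    D = np.array([[0.25, 0.05],      # D: hypothetical estimate of the
                  [0.05, 0.16]])     #    variance-covariance matrix of g(T) - g(theta)

    # W = {g(T)}' D^{-1} g(T), computed via a linear solve rather than an explicit inverse
    W = g_T @ np.linalg.solve(D, g_T)

    # Under the null hypothesis g(theta) = 0, W is approximately chi-squared
    # distributed; here the degrees of freedom equal the length of g(T)
    df = g_T.size
    p_value = stats.chi2.sf(W, df=df)
    print(f"W = {W:.3f}, df = {df}, p-value = {p_value:.4f}")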