Comparisons involving biased estimators are often based on the <B>mean squared error (MSE)</B> defined to be
:<math>E[(T-\theta)^2] = Var(T) + \{E(T) - \theta\}^2 = Var(T) + b^2,</math>
where <math>E(T)</math> and <math>Var(T)</math> are, respectively, the expectation and variance of <math>T</math>. The <B>root mean square error (RMSE)</B> is the square root of the [[mean squared error]] and has the same units as the original data.
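The decomposition above can be verified numerically. The following sketch uses a hypothetical biased estimator (the sample mean of a normal sample plus a constant offset of 0.5, chosen purely for illustration) and checks that the Monte Carlo MSE equals the variance plus the squared bias:

```python
# Numerical check of the decomposition MSE = Var(T) + b^2 for a
# deliberately biased estimator of a normal mean. The estimator
# T = (sample mean) + 0.5 is a hypothetical choice for this example.
import random

random.seed(0)
theta = 2.0            # true parameter
n, reps = 50, 20000    # sample size and Monte Carlo replications

estimates = []
for _ in range(reps):
    sample = [random.gauss(theta, 1.0) for _ in range(n)]
    t = sum(sample) / n + 0.5      # biased estimator with b = 0.5
    estimates.append(t)

mean_t = sum(estimates) / reps
var_t = sum((t - mean_t) ** 2 for t in estimates) / reps
bias = mean_t - theta
mse = sum((t - theta) ** 2 for t in estimates) / reps

# The identity MSE = Var + bias^2 holds exactly for these sample
# moments (up to floating-point error), mirroring the formula above.
print(mse, var_t + bias ** 2)
```

Because the identity is algebraic, the two printed quantities agree to floating-point precision, with the estimated bias close to the offset 0.5.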
An estimator is said to be a <B>consistent estimator</B> if, for all positive <math>c</math>,