# Root Mean Squared Error (RMSE) Statistic

A Root Mean Squared Error (RMSE) Statistic is a scale-dependent dispersion statistic equal to the square root of the average of an estimator's squared errors (i.e., the square root of its mean squared error).

**AKA:** RMSD, Root Mean Square Deviation.

**Context:**
- It can be a Numeric Prediction Performance Measure.
- It can be derived as the square root of the Mean Squared Error Metric.
- …
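The second context point above can be made concrete with a minimal sketch (an illustration added here, not from the source) that computes RMSE as the square root of the mean of the squared errors, using NumPy:

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean squared error: the square root of the mean squared error."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))
```

For example, `rmse([1, 2, 3], [1, 2, 5])` yields the square root of 4/3, roughly 1.155.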

**Example(s):**

**Counter-Example(s):**
- a Coefficient of Variation (CV).
- an R-Squared Metric.
- a Scale-Independent one, such as: Mean Absolute Percentage Error (MAPE).
- a Classification Error Metric, such as AUC or F1.

**See:** Forecasting Measure.

## References

### 2013

- (Wikipedia, 2013) ⇒ http://en.wikipedia.org/wiki/Root_mean_square_deviation
  - The **root-mean-square deviation (RMSD)** or root-mean-square error (RMSE) is a frequently used measure of the differences between values predicted by a model or an estimator and the values actually observed. These individual differences are called residuals when the calculations are performed over the data sample that was used for estimation, and are called *prediction errors* when computed out-of-sample. The RMSD serves to aggregate the magnitudes of the errors in predictions for various times into a single measure of predictive power. RMSD is a good measure of accuracy, but only to compare forecasting errors of different models for a particular variable and not between variables, as it is scale-dependent.^{[1]}
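The scale dependence noted in the passage above can be demonstrated with a small sketch (an illustration added here, not part of the quoted source): rescaling a variable rescales its RMSE by the same factor, so RMSE values are not comparable across differently scaled variables.

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean squared error of predictions against observations."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

obs = [2.0, 3.0, 5.0]
pred = [2.5, 2.5, 5.5]

# Multiplying both series by 10 multiplies the RMSE by exactly 10,
# even though the quality of the predictions is unchanged.
rmse_original = rmse(obs, pred)
rmse_scaled = rmse([10 * v for v in obs], [10 * v for v in pred])
```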

- ↑ Hyndman, Rob J.; Koehler, Anne B. (2006). "Another look at measures of forecast accuracy". *International Journal of Forecasting*: 679–688. http://dx.doi.org/10.1016/j.ijforecast.2006.03.001.

### 2013

- http://en.wikipedia.org/wiki/Root-mean-square_deviation#Formula
- The RMSD of an estimator [math]\displaystyle{ \hat{\theta} }[/math] with respect to an estimated parameter [math]\displaystyle{ \theta }[/math] is defined as the square root of the mean square error:
:[math]\displaystyle{ \operatorname{RMSD}(\hat{\theta}) = \sqrt{\operatorname{MSE}(\hat{\theta})} = \sqrt{\operatorname{E}((\hat{\theta}-\theta)^2)}. }[/math]
:For an unbiased estimator, the RMSD is the square root of the variance, known as the standard error.
:The RMSD of predicted values [math]\displaystyle{ \hat y_t }[/math] for times *t* of a regression's dependent variable [math]\displaystyle{ y }[/math] is computed for *n* different predictions as the square root of the mean of the squares of the deviations:
:[math]\displaystyle{ \operatorname{RMSD}=\sqrt{\frac{\sum_{t=1}^n (y_t - \hat y_t)^2}{n}}. }[/math]
:In some disciplines, the RMSD is used to compare differences between two things that may vary, neither of which is accepted as the "standard". For example, when measuring the average difference between two time series [math]\displaystyle{ x_{1,t} }[/math] and [math]\displaystyle{ x_{2,t} }[/math], the formula becomes
:[math]\displaystyle{ \operatorname{RMSD}= \sqrt{\frac{\sum_{t=1}^n (x_{1,t} - x_{2,t})^2}{n}}. }[/math]
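The two-time-series formula above translates directly into code. Here is a minimal sketch (an illustration added here, not from the source), assuming two equal-length numeric sequences:

```python
import math

def rmsd_between_series(x1, x2):
    """RMSD between two equal-length time series:
    sqrt of the mean of the squared pointwise differences."""
    if len(x1) != len(x2):
        raise ValueError("series must have the same length")
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x1, x2)) / len(x1))
```

Note that the formula is symmetric in the two series, which matches its use for comparing two quantities when neither is treated as the "standard".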