# Overfitted Predictive Model

An Overfitted Predictive Model is a predictive function that performs significantly better on training records than on test cases (in the same sampling space).
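
The training/test performance gap can be illustrated with a minimal sketch (assuming NumPy is available; the data, degrees, and names below are illustrative, not from the source): a high-degree polynomial fitted to a small noisy sample achieves far lower error on its training records than on test cases drawn from the same sampling space, while a lower-degree model does not.

```python
import numpy as np

rng = np.random.default_rng(0)

# 12 noisy training records and 200 test cases from the same sampling space.
x_train = np.sort(rng.uniform(0.0, 1.0, 12))
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0.0, 0.2, 12)
x_test = np.sort(rng.uniform(0.0, 1.0, 200))
y_test = np.sin(2 * np.pi * x_test) + rng.normal(0.0, 0.2, 200)

def mse(coeffs, x, y):
    # Mean squared error of a polynomial model on a data set.
    return float(np.mean((np.polyval(coeffs, x) - y) ** 2))

# 11 parameters for 12 observations: the model can nearly memorize the sample.
overfit = np.polyfit(x_train, y_train, 10)
# 4 parameters: a more parsimonious model of the same data.
simple = np.polyfit(x_train, y_train, 3)

print(mse(overfit, x_train, y_train))  # near zero: the noise has been fit
print(mse(overfit, x_test, y_test))    # much larger: poor generalization
print(mse(simple, x_test, y_test))     # typically much closer to the noise level
```

The degree-10 model is overfitted in exactly the sense defined above: its training error is far below its test error on data from the same distribution.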

**Counter-Example(s):**

**See:** Bias and Variance, Minimum Description Length, Minimum Message Length, Pruning, Model Regularization.

## References

### 2012

- http://en.wikipedia.org/wiki/Overfitting
- QUOTE: In statistics, **overfitting** occurs when a statistical model describes random error or noise instead of the underlying relationship. Overfitting generally occurs when a model is excessively complex, such as having too many parameters relative to the number of observations. A model which has been overfit will generally have poor predictive performance, as it can exaggerate minor fluctuations in the data.

The possibility of overfitting exists because the criterion used for training the model is not the same as the criterion used to judge the efficacy of a model. In particular, a model is typically trained by maximizing its performance on some set of training data. However, its efficacy is determined not by its performance on the training data but by its ability to perform well on unseen data. Overfitting occurs when a model begins to memorize training data rather than learning to generalize from a trend. As an extreme example, if the number of parameters is the same as or greater than the number of observations, a simple model can learn to perfectly predict the training data simply by memorizing it in its entirety. Such a model will typically fail drastically on unseen data, as it has not learned to generalize at all.

The potential for overfitting depends not only on the number of parameters and data but also the conformability of the model structure with the data shape, and the magnitude of model error compared to the expected level of noise or error in the data.

Even when the fitted model does not have an excessive number of parameters, it is to be expected that the fitted relationship will appear to perform less well on a new data set than on the data set used for fitting.^{[1]} In particular, the value of the coefficient of determination will shrink relative to the original training data.

In order to avoid overfitting, it is necessary to use additional techniques (e.g. cross-validation, regularization, early stopping, pruning, Bayesian priors on parameters, or model comparison) that can indicate when further training is not resulting in better generalization. The basis of some techniques is either (1) to explicitly penalize overly complex models, or (2) to test the model's ability to generalize by evaluating its performance on a set of data not used for training, which is assumed to approximate the typical unseen data that a model will encounter.
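
Technique (1), explicit penalization, can be sketched with closed-form ridge regression (a sketch assuming NumPy; the penalty value `lam=1e-3`, the polynomial degree, and the helper names are illustrative assumptions, not from the source):

```python
import numpy as np

rng = np.random.default_rng(1)

x_train = rng.uniform(0.0, 1.0, 12)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0.0, 0.2, 12)
x_test = rng.uniform(0.0, 1.0, 200)
y_test = np.sin(2 * np.pi * x_test) + rng.normal(0.0, 0.2, 200)

DEGREE = 10  # 11 parameters for 12 observations: prone to overfitting

def design(x):
    # Polynomial (Vandermonde) feature matrix.
    return np.vander(x, DEGREE + 1)

def fit(x, y, lam=0.0):
    X = design(x)
    if lam == 0.0:
        # Ordinary least squares: minimizes training error only.
        return np.linalg.lstsq(X, y, rcond=None)[0]
    # Closed-form ridge regression: explicitly penalizes large coefficients.
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

def mse(w, x, y):
    return float(np.mean((design(x) @ w - y) ** 2))

unpenalized = fit(x_train, y_train)
penalized = fit(x_train, y_train, lam=1e-3)

# The penalty trades a small amount of training error for coefficients
# that tend to generalize better to unseen data.
for label, w in [("lam=0", unpenalized), ("lam=1e-3", penalized)]:
    print(label, mse(w, x_train, y_train), mse(w, x_test, y_test))
```

By construction the unpenalized fit has the lowest possible training error; judging the two models on the held-out test set instead, as in technique (2), is what reveals the overfitting.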

- ↑ Everitt, B.S. (2002). Cambridge Dictionary of Statistics, CUP. ISBN 0-521-81099-X (entry for "Shrinkage").

### 2011

- (Webb, 2011i) ⇒ Geoffrey I. Webb. (2011). “Overfitting.” In: (Sammut & Webb, 2011) p.744