# Autoregressive Model

An Autoregressive Model is a representation of a random process (e.g. a time series) whose output variable is a linear function of its own previous values.

**AKA:** AR. **See:** Autoregression Algorithm, Random Process, ARMA, ARIMA, Stationary Process, Timeseries Modeling, Vector Autoregression.

## References

### 2016

- (The Pennsylvania State University, 2016) ⇒ The Pennsylvania State University (2016). “Online Course Statistics 501.” https://onlinecourses.science.psu.edu/stat501/node/358
- A time series is a sequence of measurements of the same variable(s) made over time. Usually the measurements are made at evenly spaced times - for example, monthly or yearly. Let us first consider the problem in which we have a y-variable measured as a time series. As an example, we might have [math]y[/math] be a measure of global temperature, with measurements observed each year. To emphasize that we have measured values over time, we use "t" as a subscript rather than the usual "i," i.e., [math]y_t[/math] means [math]y[/math] measured in time period t. An **autoregressive model** is when a value from a time series is regressed on previous values from that same time series; for example, [math]y_t[/math] on [math]y_{t−1}[/math]:

[math]y_t=\beta_0+\beta_1y_{t-1}+\varepsilon_t [/math]

- In this regression model, the response variable in the previous time period has become the predictor and the errors have our usual assumptions about errors in a simple linear regression model. The order of an autoregression is the number of immediately preceding values in the series that are used to predict the value at the present time. So, the preceding model is a first-order autoregression, written as AR(1).
If we want to predict [math]y[/math] this year ([math]y_t[/math]) using measurements of global temperature in the previous two years ([math]y_{t−1},y_{t−2}[/math]), then the autoregressive model for doing so would be:

- [math]y_t=\beta_0+\beta_1y_{t−1}+\beta_2y_{t−2}+\varepsilon_t.[/math]
- This model is a second-order autoregression, written as AR(2), since the value at time [math]t[/math] is predicted from the values at times [math]t−1[/math] and [math]t−2[/math]. More generally, a kth-order autoregression, written as AR(k), is a multiple linear regression in which the value of the series at any time [math]t[/math] is a (linear) function of the values at times [math]t−1,t−2,…,t−k[/math].
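As an illustration of the point above (not part of the course text), an AR(2) fit is just a multiple linear regression on lagged copies of the series. The coefficient values below are invented for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(2) series y_t = b0 + b1*y_{t-1} + b2*y_{t-2} + e_t.
# (The coefficients are illustrative assumptions, not from the text.)
b0, b1, b2 = 0.5, 0.6, 0.2
n = 5000
y = np.zeros(n)
for t in range(2, n):
    y[t] = b0 + b1 * y[t - 1] + b2 * y[t - 2] + rng.normal()

# Regress y_t on y_{t-1} and y_{t-2}: design matrix [1, y_{t-1}, y_{t-2}].
X = np.column_stack([np.ones(n - 2), y[1:-1], y[:-2]])
target = y[2:]
coef, *_ = np.linalg.lstsq(X, target, rcond=None)
print(coef)  # ordinary-least-squares estimates of (b0, b1, b2)
```

With a long enough series the least-squares estimates recover the generating coefficients closely, which is exactly the "autoregression as regression" view in the excerpt.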

- (Wikipedia, 2016) ⇒ https://www.wikiwand.com/en/Autoregressive_model Retrieved 2016-07-10
- In statistics and signal processing, an **autoregressive** (**AR**) **model** is a representation of a type of random process; as such, it describes certain time-varying processes in nature, economics, etc. The autoregressive model specifies that the output variable depends linearly on its own previous values and on a stochastic term (an imperfectly predictable term); thus the model is in the form of a stochastic difference equation. Together with the Moving-Average (MA) model, it is a special case and key component of the more general ARMA and ARIMA models of time series, which have a more complicated stochastic structure; it is also a special case of the vector autoregressive model (VAR), which consists of a system of more than one stochastic difference equation.

Contrary to the MA model, the AR model is not always stationary as it may contain a unit root.
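A quick numerical sketch of the unit-root remark (my own example, not from the Wikipedia text): an AR(1) with [math]\varphi_1 = 1[/math] is a random walk, whose variance grows with time, while [math]|\varphi_1| \lt 1[/math] gives a variance that settles to a constant:

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 200, 2000

def ar1_paths(phi):
    # Simulate many AR(1) paths x_t = phi * x_{t-1} + e_t, started at 0.
    e = rng.normal(size=(reps, n))
    x = np.zeros((reps, n))
    for t in range(1, n):
        x[:, t] = phi * x[:, t - 1] + e[:, t]
    return x

stationary = ar1_paths(0.5)  # |phi| < 1: variance approaches 1/(1 - phi^2)
unit_root = ar1_paths(1.0)   # phi = 1 (unit root): variance grows like t

print(stationary[:, -1].var(), unit_root[:, -1].var())
```

The final-time variance of the stationary process stays near [math]1/(1-\varphi_1^2) \approx 1.33[/math], while the random walk's is close to the number of elapsed steps.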

**Definition**

- The notation [math]AR(p)[/math] indicates an autoregressive model of order *p*. The AR(*p*) model is defined as

- [math] X_t = c + \sum_{i=1}^p \varphi_i X_{t-i}+ \varepsilon_t \,[/math]
- where [math]\varphi_1, \ldots, \varphi_p[/math] are the *parameters* of the model, [math]c[/math] is a constant, and [math]\varepsilon_t[/math] is white noise. This can be equivalently written using the backshift operator *B* as

- [math] X_t = c + \sum_{i=1}^p \varphi_i B^i X_t + \varepsilon_t [/math]
- so that, moving the summation term to the left side and using polynomial notation, we have

- [math]\phi (B)X_t= c + \varepsilon_t \, .[/math]
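To make the polynomial notation concrete, here is the [math]p=2[/math] case worked out step by step (my own expansion of the definitions above):

```latex
X_t &= c + \varphi_1 B X_t + \varphi_2 B^2 X_t + \varepsilon_t \\
X_t - \varphi_1 B X_t - \varphi_2 B^2 X_t &= c + \varepsilon_t \\
\underbrace{(1 - \varphi_1 B - \varphi_2 B^2)}_{\phi(B)} X_t &= c + \varepsilon_t
```

so for general *p*, [math]\phi(B) = 1 - \sum_{i=1}^p \varphi_i B^i[/math].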
- An autoregressive model can thus be viewed as the output of an all-pole infinite impulse response filter whose input is white noise.
Some parameter constraints are necessary for the model to remain wide-sense stationary. For example, processes in the AR(1) model with [math]|\varphi_1| \geq 1[/math] are not stationary. More generally, for an AR(*p*) model to be wide-sense stationary, the roots of the polynomial [math]\textstyle z^p - \sum_{i=1}^p \varphi_i z^{p-i}[/math] must lie within the unit circle, i.e., each root [math]z_i[/math] must satisfy [math]|z_i| \lt 1[/math].
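The root condition stated above is easy to check numerically. A minimal sketch (the helper name is mine, not a standard API):

```python
import numpy as np

def is_wide_sense_stationary(phis):
    """Stationarity check from the text: all roots of
    z^p - phi_1 z^{p-1} - ... - phi_p must satisfy |z| < 1."""
    poly = np.concatenate(([1.0], -np.asarray(phis, dtype=float)))
    return bool(np.all(np.abs(np.roots(poly)) < 1))

print(is_wide_sense_stationary([0.5]))       # AR(1), |phi_1| < 1 -> True
print(is_wide_sense_stationary([1.0]))       # unit root (random walk) -> False
print(is_wide_sense_stationary([0.6, 0.2]))  # an AR(2) with roots inside the circle -> True
```

For the AR(1) case this reduces to the single root [math]z_1 = \varphi_1[/math], recovering the [math]|\varphi_1| \lt 1[/math] condition quoted above.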
