# Linear Model Regression System

A Linear Model Regression System is a supervised model-based numeric-value prediction system that implements a linear regression algorithm to solve a linear regression task.

**AKA:** Linear Regression Software.

**Context:**
- It can range from being a Simple Linear Regression System to being a Multiple Linear Regression System.
- It can range from being a Least-Squares Linear Regression System to being a Linear Ridge Regression System.

**Example(s):**
- `sklearn.linear_model`, a Generalized Linear Model Regression System within Scikit Learn.
- `statsmodels.api`, a Linear Model Regression System within Statsmodels.
- `lm`, an Ordinary Linear Model Regression System within R.
- A one-predictor-variable linear regression online calculator, such as: http://onlineregression.sdsu.edu/onlineregression11.php
- A multiple linear regression online calculator, such as: http://onlineregression.sdsu.edu/onlineregression13.php
- A Generalized Linear Model Regression System.
- An Ordinary Linear Model Regression System.
- A Weighted Linear Model Regression System.
- A Regularized Linear Model Regression System.
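The examples above can be made concrete with a minimal sketch of a least-squares Linear Model Regression System in Scikit Learn (the data below is hypothetical toy data, chosen so the fit is exact):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical toy data generated from y = 1 + 2*x1 + 3*x2 (no noise),
# so ordinary least squares recovers the coefficients exactly.
X = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 3.0], [4.0, 5.0]])
y = 1.0 + 2.0 * X[:, 0] + 3.0 * X[:, 1]

model = LinearRegression()
model.fit(X, y)

print(model.coef_)       # ~ [2., 3.]
print(model.intercept_)  # ~ 1.0
```

With two predictor columns this is a Multiple Linear Regression System; restricting `X` to one column would make it a Simple Linear Regression System.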

**Counter-Example(s):** a Supervised Classification System.

**See:** Linear Least-Squares Regression System, Linear Ridge Regression System, Linear Bayesian Regression System.

## References

### 2017a

- (Scikit Learn, 2017) ⇒ http://scikit-learn.org/stable/modules/classes.html#module-sklearn.linear_model Retrieved: 2017-07-30.
- QUOTE: The `sklearn.linear_model` module implements generalized linear models. It includes Ridge regression, Bayesian Regression, Lasso and Elastic Net estimators computed with Least Angle Regression and coordinate descent. It also implements Stochastic Gradient Descent related algorithms.

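The regularized estimators the quote lists can be exercised in a few lines; the following is a sketch on hypothetical synthetic data (the third coefficient is truly zero, which Lasso tends to detect):

```python
import numpy as np
from sklearn.linear_model import Ridge, Lasso

# Hypothetical synthetic data: y depends on the first two features only.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.5, -2.0, 0.0]) + 0.1 * rng.normal(size=50)

ridge = Ridge(alpha=1.0).fit(X, y)  # L2 penalty shrinks all coefficients
lasso = Lasso(alpha=0.1).fit(X, y)  # L1 penalty can zero out coefficients

# Lasso tends to drive the irrelevant third coefficient toward zero.
print(ridge.coef_)
print(lasso.coef_)
```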

### 2017b

- (Scikit Learn, 2017) ⇒ http://scikit-learn.org/stable/modules/linear_model.html Retrieved: 2017-07-30.
- QUOTE: The following are a set of methods intended for regression in which the target value is expected to be a linear combination of the input variables. In mathematical notation, if [math]\hat{y}[/math] is the predicted value:
[math]\hat{y}(w, x) = w_0 + w_1 x_1 + \cdots + w_p x_p[/math]
Across the module, we designate the vector [math]w = (w_1, \cdots, w_p)[/math] as `coef_` and [math]w_0[/math] as `intercept_`.

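The relation between the formula and the fitted attributes can be checked directly: `predict` computes exactly [math]w_0 + w_1 x_1 + \cdots + w_p x_p[/math] from `intercept_` and `coef_` (toy data below is hypothetical):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical one-predictor data: y = 1 + 2x.
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])

reg = LinearRegression().fit(X, y)

# Recompute predictions by hand from the formula in the quote:
manual = reg.intercept_ + X @ reg.coef_   # w_0 + w_1 * x_1
print(np.allclose(manual, reg.predict(X)))  # the two agree
```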

### 2017c

- (Rodríguez, 2017) ⇒ Germán Rodríguez, Princeton University (2017). "Introducing R - 4 Linear Models." http://data.princeton.edu/R/linearModels.html
- QUOTE: To fit an ordinary linear model with fertility change as the response and setting and effort as predictors, try
`lmfit = lm( change ~ setting + effort )`
Note first that `lm` is a function, and we assign the result to an object that we choose to call `lmfit` (for linear model fit). This stores the results of the fit for later examination. The argument to `lm` is a model formula, which has the response on the left of the tilde ~ (read "is modeled as") and a Wilkinson-Rogers model specification formula on the right.

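A close Python analogue of the R call above uses the R-style formula interface in `statsmodels.formula.api`; the data frame below is hypothetical stand-in data (not the tutorial's actual fertility dataset), with the same variable names:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical stand-in data with the variable names from the quote.
data = pd.DataFrame({
    "change":  [1.0, 25.0, 29.0, 5.0, 22.0, 29.0],
    "setting": [46, 74, 89, 43, 72, 90],
    "effort":  [6, 16, 21, 7, 14, 18],
})

# Same formula syntax: response ~ predictors.
lmfit = smf.ols("change ~ setting + effort", data=data).fit()
print(lmfit.params)  # Intercept, setting, effort coefficients
```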

### 2014

- (Perktold et al., 2014) ⇒ Josef Perktold, Skipper Seabold and Jonathan Taylor (statsmodels-developers, 2009-2017). "Linear Regression." http://statsmodels.sourceforge.net/stable/regression.html
- QUOTE: Linear models with independently and identically distributed errors, and for errors with heteroscedasticity or autocorrelation. This module allows estimation by ordinary least squares (OLS), weighted least squares (WLS), generalized least squares (GLS), and feasible generalized least squares with autocorrelated AR(p) errors.