# Linear Least-Squares Regression Task

A Linear Least-Squares Regression Task is a linear regression task whose objective function is a sum-of-squares function.

**Context:**
- Task Input: an n-observation numerically-labeled training dataset [math]D=\{(x_1,y_1),(x_2,y_2),\cdots,(x_n,y_n)\}[/math] that can be represented by:
  - [math]\mathbf{Y}[/math], the continuous response variable dataset;
  - [math]\mathbf{X}[/math], the continuous predictor variables dataset.

- Task Output:
  - [math]\boldsymbol{\beta}=\{\beta_0,\beta_1,...,\beta_p\}[/math], the estimated linear model parameter vector, a continuous dataset;
  - [math]\mathbf{\hat{Y}}=f(x_i,\hat{\beta}_j)[/math], the predicted values (the fitted linear function), a continuous dataset;
  - [math]\sigma_x,\sigma_y,\rho_{X,Y},\ldots[/math], the standard deviations, correlation coefficient, standard error of the estimate, and other statistical information about the fitting parameters.

**Task Requirements:**
- It requires minimizing the objective function [math]E(f)=\sum_{i=1}^{n}|y_{i}-f(x_i,\boldsymbol\beta)|^2[/math], where [math]f(x_i,\boldsymbol\beta)[/math] is a linear regression function.
- It requires a regression diagnostic test to determine the goodness of fit of the regression model and the statistical significance of the estimated parameters.

- It can be solved by a Linear Least-Squares Regression System that implements a Linear Least-Squares Regression Algorithm.
- It can range from being an Ordinary Least Squares Regression Task to being a Regularized Linear Least-Squares Regression Task.
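The minimization and diagnostic steps above can be sketched with numpy. This is a minimal illustration, not part of the original task definition; the data points are made up, and the diagnostic shown is the coefficient of determination [math]R^2[/math]:

```python
import numpy as np

# Hypothetical 1-D data for y_i = beta_0 + beta_1 * x_i + noise.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# Design matrix with an intercept column.
X = np.column_stack([np.ones_like(x), x])

# Minimizes E(f) = sum_i |y_i - f(x_i, beta)|^2.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

residuals = y - X @ beta
E = np.sum(residuals ** 2)                          # objective function value
r_squared = 1.0 - E / np.sum((y - y.mean()) ** 2)   # basic goodness-of-fit diagnostic
```

For these data the fitted parameters are close to [math]\beta_0\approx 1.04,\ \beta_1\approx 1.99[/math], with [math]R^2[/math] near 1, indicating a good linear fit.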

**Example(s):**
- For the linear regression task represented by the equation [math]y_i=\beta_0+\beta_1x_{i1}+\beta_2x_{i2}+\cdots+\beta_px_{ip}+\varepsilon_i[/math], the linear least-squares regression task can be solved by
[math]\underset{\boldsymbol{\beta}}{\text{minimize}} \sum_{i=1}^n \Big\| y_i - \sum_{j=0}^p x_{ij}\beta_j\Big\|^2[/math] with [math]x_{i0}=1[/math] for all [math]i[/math].

- For the linear regression task represented in the matrix form: [math]\mathbf{Y}=\mathbf{X}\boldsymbol{\beta} + \mathbf{U}[/math], the linear least-squares regression task can be solved by
[math]\underset{\boldsymbol{\beta}}{\text{minimize}} \parallel \mathbf{X}\boldsymbol{\beta} - \mathbf{Y} \parallel^2 [/math]
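The matrix-form minimization above can be carried out directly with numpy's least-squares solver. This is an illustrative sketch; the design matrix and response values are made up:

```python
import numpy as np

# Hypothetical design matrix X (n = 4 observations: intercept + 1 predictor).
X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
Y = np.array([0.9, 3.1, 5.0, 7.2])

# Solves  minimize_beta || X beta - Y ||^2.
beta, residual_ss, rank, _ = np.linalg.lstsq(X, Y, rcond=None)
Y_hat = X @ beta   # fitted values
```

Here `rank` reports the numerical rank of [math]\mathbf{X}[/math]; a full-rank design matrix guarantees the minimizer is unique.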

**Counter-Example(s):**
- a Non-Linear Least-Squares Regression Task.

**See:** Curve Fitting, System of Linear Equations.

## References

### 2017a

- (Wikipedia, 2017) ⇒ https://en.wikipedia.org/wiki/Linear_least_squares_(mathematics) Retrieved:2017-8-27.
- QUOTE: In statistics and mathematics, **linear least squares** is an approach fitting a mathematical or statistical model to data in cases where the idealized value provided by the model for any data point is expressed linearly in terms of the unknown parameters of the model. The resulting fitted model can be used to summarize the data, to predict unobserved values from the same system, and to understand the mechanisms that may underlie the system.

 Mathematically, linear least squares is the problem of approximately solving an overdetermined system of linear equations, where the best approximation is defined as that which minimizes the sum of squared differences between the data values and their corresponding modeled values. The approach is called *linear* least squares since the assumed function is linear in the parameters to be estimated. Linear least squares problems are convex and have a closed-form solution that is unique, provided that the number of data points used for fitting equals or exceeds the number of unknown parameters, except in special degenerate situations. In contrast, non-linear least squares problems generally must be solved by an iterative procedure, and the problems can be non-convex with multiple optima for the objective function. If prior distributions are available, then even an underdetermined system can be solved using the Bayesian MMSE estimator.

 In statistics, linear least squares problems correspond to a particularly important type of statistical model called linear regression, which arises as a particular form of regression analysis. One basic form of such a model is an ordinary least squares model. The present article concentrates on the mathematical aspects of linear least squares problems, with discussion of the formulation and interpretation of statistical regression models and statistical inferences related to these being dealt with in the articles just mentioned. See outline of regression analysis for an outline of the topic.
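The closed-form solution mentioned in the quote can be checked numerically: the normal-equations solution [math]\boldsymbol{\beta}=(\mathbf{X}^T\mathbf{X})^{-1}\mathbf{X}^T\mathbf{Y}[/math] agrees with the output of a QR/SVD-based solver. The data below are synthetic and purely illustrative:

```python
import numpy as np

# Synthetic data: n = 50 observations, intercept + 2 predictors.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.normal(size=(50, 2))])
y = X @ np.array([1.0, 2.0, -3.0]) + 0.1 * rng.normal(size=50)

# Closed form via the normal equations: (X^T X) beta = X^T y.
beta_normal = np.linalg.solve(X.T @ X, X.T @ y)

# Numerically preferred SVD-based solver.
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
```

Both routes give the same minimizer when [math]\mathbf{X}[/math] has full column rank; for ill-conditioned [math]\mathbf{X}[/math] the SVD-based route is the numerically safer choice.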


### 2017b

- (Fong & Saunders, ) ⇒ David Fong & Michael Saunders. "LSMR: Sparse Equations and Least Squares" http://web.stanford.edu/group/SOL/software/lsmr/ Retrieved: 2017-08-27
- QUOTE: Implementation of a conjugate-gradient type method for solving sparse linear equations and sparse least-squares problems: [math]\begin{align*} \text{Solve } & Ax=b \\ \text{or minimize } & \|Ax-b\|^2 \\ \text{or minimize } & \|Ax-b\|^2 + \lambda^2 \|x\|^2 \end{align*}[/math]
where the matrix [math]A[/math] may be square or rectangular (over-determined or under-determined), and may have any rank. It is represented by a routine for computing [math]Av[/math] and [math]A^T u[/math] for given vectors [math]v[/math] and [math]u[/math].

The scalar [math]\lambda[/math] is a damping parameter. If [math]\lambda \gt 0[/math], the solution is “regularized” in the sense that a unique solution always exists, and [math]\|x\|[/math] is bounded.

The method is based on the Golub-Kahan bidiagonalization process. It is algebraically equivalent to applying MINRES to the normal equation [math] (A^T A + \lambda^2 I) x = A^T b, [/math] but has better numerical properties, especially if [math]A[/math] is ill-conditioned.
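The damped least-squares problem described in this quote is available in SciPy as `scipy.sparse.linalg.lsmr`. The sketch below (with made-up data) shows the undamped and damped variants; as the quote notes, [math]\lambda \gt 0[/math] regularizes the problem and bounds [math]\|x\|[/math]:

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import lsmr

# Small over-determined sparse system (illustrative data).
A = csr_matrix(np.array([[1.0, 0.0],
                         [1.0, 1.0],
                         [1.0, 2.0]]))
b = np.array([1.0, 2.1, 2.9])

# Plain least squares: minimize ||Ax - b||^2.
x_plain = lsmr(A, b)[0]

# Damped: minimize ||Ax - b||^2 + lambda^2 ||x||^2, here lambda = 0.5.
x_damped = lsmr(A, b, damp=0.5)[0]
```

Note that `lsmr` never forms [math]A^T A[/math] explicitly; it only needs products [math]Av[/math] and [math]A^T u[/math], which is what makes it suitable for large sparse problems.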
