Simple Linear Regression Algorithm


A Simple Linear Regression Algorithm is a linear model-based estimation algorithm in which there is only one covariate (predictor variable).



References

2014

  • (Wikipedia, 2014) ⇒ http://en.wikipedia.org/wiki/simple_linear_regression Retrieved: 2014-08-03.
    • In statistics, simple linear regression is the least squares estimator of a linear regression model with a single explanatory variable. In other words, simple linear regression fits a straight line through the set of points in such a way that makes the sum of squared residuals of the model (that is, vertical distances between the points of the data set and the fitted line) as small as possible.

      The adjective simple refers to the fact that this regression is one of the simplest in statistics. The slope of the fitted line is equal to the correlation between y and x, corrected by the ratio of the standard deviations of these variables. The intercept of the fitted line is such that the line passes through the center of mass of the data points.
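
      In symbols, and stated here only for completeness (a standard result, not part of the quoted passage): if [math]\displaystyle{ r_{xy} }[/math] is the sample correlation of [math]\displaystyle{ x }[/math] and [math]\displaystyle{ y }[/math], and [math]\displaystyle{ s_x }[/math], [math]\displaystyle{ s_y }[/math] are their sample standard deviations, then the fitted slope and intercept are [math]\displaystyle{ \hat{b} = r_{xy}\, s_y / s_x }[/math] and [math]\displaystyle{ \hat{a} = \bar{y} - \hat{b}\,\bar{x} }[/math], so the fitted line passes through the mean point [math]\displaystyle{ (\bar{x}, \bar{y}) }[/math].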

      Other regression methods besides ordinary least squares (OLS) also exist (see linear regression model). In particular, when fitting a regression line by eye, people tend to draw a slightly steeper line, closer to the one produced by the total least squares method. This occurs because it is more natural for the mind to consider the orthogonal distances from the observations to the regression line, rather than the vertical distances that the OLS method minimizes.
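
      As a concrete illustration of the closed-form fit described above, here is a minimal Python sketch (the function name fit_simple_ols and the sample data are illustrative, not from the source):

        def fit_simple_ols(xs, ys):
            """Fit y = a + b*x by ordinary least squares, using the closed-form estimates."""
            n = len(xs)
            mean_x = sum(xs) / n
            mean_y = sum(ys) / n
            # Slope: sum of centered cross-products over sum of squared deviations in x.
            sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
            sxx = sum((x - mean_x) ** 2 for x in xs)
            b = sxy / sxx
            # Intercept: the fitted line passes through the center of mass (mean_x, mean_y).
            a = mean_y - b * mean_x
            return a, b

        xs = [1.0, 2.0, 3.0, 4.0, 5.0]  # illustrative data
        ys = [2.1, 3.9, 6.2, 8.1, 9.8]
        a, b = fit_simple_ols(xs, ys)
        print(f"intercept a = {a:.3f}, slope b = {b:.3f}")  # a = 0.140, b = 1.960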

2009

  • (Wikipedia, 2009) ⇒ http://en.wikipedia.org/wiki/Simple_linear_regression
    • A simple linear regression is a Linear Regression in which there is only one Covariate (predictor variable).
    • Simple linear regression is used to evaluate the linear relationship between two variables; one example is the relationship between muscle strength and lean body mass. Equivalently, simple linear regression is used to develop an equation with which a dependent variable can be predicted or estimated from an independent variable.
    • Given a sample [math]\displaystyle{ (Y_i, X_i), \, i = 1, \ldots, n }[/math], the regression model is given by
      • [math]\displaystyle{ Y_i = a + bX_i + \varepsilon_i }[/math]
    • Where [math]\displaystyle{ Y_i }[/math] is the dependent variable, [math]\displaystyle{ a }[/math] is the [math]\displaystyle{ y }[/math]-intercept, [math]\displaystyle{ b }[/math] is the gradient or slope of the line, [math]\displaystyle{ X_i }[/math] is the independent variable, and [math]\displaystyle{ \varepsilon_i }[/math] is a random error term associated with each observation.
    • The linear relationship between the two variables (i.e., dependent and independent) can be measured using a correlation coefficient, e.g. the Pearson Product Moment Correlation Coefficient (see the sketch below).
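
      To connect this to the slope formula quoted in the 2014 entry, here is a minimal Python sketch (the function name pearson_r and the data are illustrative, not from the source) that computes the Pearson correlation coefficient and recovers the OLS slope from it:

        import statistics

        def pearson_r(xs, ys):
            """Sample Pearson product-moment correlation coefficient."""
            mean_x = statistics.mean(xs)
            mean_y = statistics.mean(ys)
            sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
            sxx = sum((x - mean_x) ** 2 for x in xs)
            syy = sum((y - mean_y) ** 2 for y in ys)
            return sxy / (sxx * syy) ** 0.5

        xs = [1.0, 2.0, 3.0, 4.0, 5.0]  # illustrative data
        ys = [2.1, 3.9, 6.2, 8.1, 9.8]
        r = pearson_r(xs, ys)
        # The OLS slope equals r scaled by the ratio of sample standard deviations.
        b = r * statistics.stdev(ys) / statistics.stdev(xs)
        print(f"r = {r:.3f}, implied slope b = {b:.3f}")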