Supervised Model-based Estimation Algorithm


A Supervised Model-based Estimation Algorithm is a supervised estimation algorithm that is a model-based learning algorithm.
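For illustration, a minimal Python sketch of one such algorithm, under the assumption of a parametric linear model whose parameters are estimated from labeled training data and then used to produce estimates for new inputs; the function names <code>fit_linear_model</code> and <code>estimate</code> and the simulated data are illustrative choices, not part of this entry.

<pre>
import numpy as np

# A model-based supervised estimator: assume a parametric model
# y = beta_0 + x^T beta + noise, estimate its parameters from
# labeled training pairs, then apply the fitted model to new inputs.

def fit_linear_model(X, y):
    """Estimate intercept and coefficients by least squares."""
    X1 = np.column_stack([np.ones(len(X)), X])   # prepend intercept column
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return beta

def estimate(beta, X):
    """Use the fitted model to estimate responses for new regressors."""
    X1 = np.column_stack([np.ones(len(X)), X])
    return X1 @ beta

# Illustrative labeled data (x_i, y_i) from a known linear relation.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(100, 3))
y_train = 0.3 + X_train @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.1, size=100)

beta_hat = fit_linear_model(X_train, y_train)
print(estimate(beta_hat, X_train[:3]))           # model-based estimates
</pre>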



== References ==

=== 1996 ===
* ([[1996_RegressionShrinkageAndSelViaLasso|Tibshirani, 1996]]) ⇒ [[Robert Tibshirani]]. ([[1996]]). “[http://people.ee.duke.edu/~lcarin/Lasso.pdf Regression Shrinkage and Selection via the Lasso].” In: [[Journal of the Royal Statistical Society]], Series B, 58(1).
** QUOTE: Consider the usual [[Supervised Regression Task|regression situation]]: we have [[data]] <math>(\mathbf{x}^i, y_i),\ i=1,2,\ldots,N,</math> where <math>\mathbf{x}^i=(x_{i1},\ldots, x_{ip})^T</math> and <math>y_i</math> are the [[Regressor Variable|regressors]] and [[Response Variable|response]] for the ''i''th [[Observation record|observation]]. The [[Ordinary Least Squares Estimate|ordinary least squares (OLS) estimates]] are obtained by minimizing the [[Residual Squared Error|residual squared error]].
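The following sketch makes the quoted setup concrete: it simulates data <math>(\mathbf{x}^i, y_i)</math>, computes the OLS estimate by minimizing the residual squared error, and contrasts it with a lasso estimate from a simple coordinate-descent solver. The simulated data, the penalty value <code>lam</code>, and the helper <code>lasso_cd</code> are illustrative assumptions, not taken from the referenced paper.

<pre>
import numpy as np

# Data (x_i, y_i), i = 1..N, with x_i a p-vector of regressors and y_i the
# response.  OLS minimizes the residual squared error sum_i (y_i - x_i^T beta)^2;
# the lasso adds an L1 penalty, shrinking some coefficients exactly to zero.

rng = np.random.default_rng(1)
N, p = 200, 8
X = rng.normal(size=(N, p))
true_beta = np.array([3.0, 0.0, 0.0, 1.5, 0.0, 0.0, 0.0, -2.0])
y = X @ true_beta + rng.normal(scale=0.5, size=N)

# OLS: closed-form minimizer of the residual squared error.
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Lasso via cyclic coordinate descent with soft-thresholding,
# a standard solver for the L1-penalized least-squares objective.
def lasso_cd(X, y, lam, n_iter=200):
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r_j = y - X @ beta + X[:, j] * beta[j]   # partial residual
            rho = X[:, j] @ r_j
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return beta

beta_lasso = lasso_cd(X, y, lam=50.0)
print("OLS:  ", np.round(beta_ols, 2))    # all coefficients nonzero
print("lasso:", np.round(beta_lasso, 2))  # small coefficients shrunk to zero
</pre>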