Latest revision as of 07:32, 22 August 2024
A nonparametric model learning algorithm is a model learning algorithm/statistical modeling algorithm that makes few assumptions about underlying probability distributions.
- AKA: Nonparametric Statistical Procedure.
- Context:
- It can (often) be implemented by a Nonparametric Model Learning System to solve a Nonparametric Model Learning Task.
- Example(s):
  - a Nonparametric Regression Algorithm.
  - a Principal Components Analysis Algorithm.
  - a Nonparametric Bayes Algorithm.
  - an FSF Algorithm.
  - an EM Algorithm.
  - …
- Counter-Example(s):
  - a Parametric Model Learning Algorithm/Parametric Statistical Modeling Algorithm.
- See: Gaussian Process Algorithm, Distribution-Free Statistic, Distribution Function.
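The definition above can be illustrated with a minimal sketch: k-nearest-neighbour regression is nonparametric in the sense that no fixed-size parameter vector is estimated and no distributional form is assumed; the "model" is the training sample itself. The function name below is illustrative, not taken from any cited source.

```python
# Minimal sketch of a nonparametric model learning algorithm:
# k-nearest-neighbour regression. No fixed parametric form is assumed;
# predictions are computed directly from the stored training data.

def knn_regress(train_x, train_y, query, k=2):
    """Predict y at `query` as the mean y of the k nearest training points."""
    nearest = sorted(zip(train_x, train_y),
                     key=lambda p: abs(p[0] - query))[:k]
    return sum(y for _, y in nearest) / k

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.1, 0.9, 2.1, 2.9, 4.2]
print(knn_regress(xs, ys, 2.5))  # averages the two nearest points
```

Contrast this with a parametric counterpart such as linear regression, where learning reduces the data to a fixed number of coefficients under stronger distributional assumptions.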
References
2009
- (Lafferty & Wasserman, 2009) ⇒ John D. Lafferty, and Larry Wasserman. (2009). “Statistical Machine Learning - Course: 10-702." Spring 2009, Carnegie Mellon University.
  - Nonparametric methods: Nonparametric Regression and Density Estimation, Nonparametric Classification, Boosting, Clustering and Dimension Reduction, PCA, Manifold Methods, Principal Curves, Spectral Methods, The Bootstrap and Subsampling, Nonparametric Bayes.
- (Ghahramani, 2009) ⇒ Zoubin Ghahramani. (2009). http://learning.eng.cam.ac.uk/zoubin/nonparam.html
- QUOTE: Non-parametric models are very flexible statistical models in which the complexity of the model grows with the amount of observed data. While traditional parametric models make strong assumptions about how the data was generated, non-parametric models try to make weaker assumptions and let the data "speak for itself". Many non-parametric models can be seen as infinite limits of finite parametric models, and an important family of non-parametric models are derived from Dirichlet processes. See also Gaussian Processes.
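Ghahramani's point that model complexity grows with the amount of observed data can be sketched with a Gaussian kernel density estimator, which keeps one kernel per observation rather than a fixed parameter vector. This is a self-contained illustration, with an illustrative function name, not code from the cited page.

```python
import math

# Sketch of a nonparametric density estimator: the model stores one
# Gaussian kernel per observation, so its complexity grows with the
# data instead of being fixed in advance by a parametric form.

def kde(data, x, bandwidth=0.5):
    """Estimate the density at x by averaging Gaussian kernels
    centred on each observation in `data`."""
    norm = 1.0 / (len(data) * bandwidth * math.sqrt(2 * math.pi))
    return norm * sum(math.exp(-0.5 * ((x - xi) / bandwidth) ** 2)
                      for xi in data)

sample = [-1.0, 0.0, 0.2, 1.1]
print(kde(sample, 0.0))  # density estimate at x = 0
```

Each new observation adds a kernel to the mixture, which is the sense in which the estimator is an "infinite limit" of finite mixture models.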
2004
- (Zhao & Yu, 2004) ⇒ Peng Zhao, and Bin Yu. (2004). “Boosted Lasso." Tech Report, Statistics Department, U. C. Berkeley.
- QUOTE: … FSF exists as a compromise since, like Boosting, it is a nonparametric learning algorithm that works with different loss functions and large numbers of base ...
1999
- (Hollander & Wolfe, 1999) ⇒ Myles Hollander, and Douglas A. Wolfe. (1999). “Nonparametric Statistical Methods, 2nd Edition." Wiley. ISBN:0471190454
- QUOTE: Roughly speaking, a nonparametric procedure is a statistical procedure that has certain desirable properties that hold under relatively mild assumptions regarding the underlying populations from which the data are obtained. The rapid and continuous development of nonparametric statistical procedures over the past six decades is due to the following advantages enjoyed by nonparametric techniques: ...
- The term nonparametric, introduced in Section 1.1, is imprecise. The related term distribution-free has a precise meaning. …
1998
- (Vijayakumar & Schaal, 1998) ⇒ Sethu Vijayakumar, and Stefan Schaal. (1998). “Local Adaptive Subspace Regression.” In: Neural Processing Letters, 7(3). doi:10.1023/A:1009696221209
- QUOTE: … Based on this, we developed a nonparametric learning algorithm which is targeted to make use of such locally low dimensional distributions. ...