Nonparametric Learning Algorithm

A [[Nonparametric Learning Algorithm|nonparametric learning algorithm]] is a [[learning algorithm]]/[[statistical modeling algorithm]] that makes few assumptions about the underlying [[probability distribution]]s (and can solve a [[nonparametric statistical modeling task]]).
* <B>AKA:</B> [[Nonparametric Statistical Procedure]].
* <B>Context:</B>
** [[Output]]: a [[Nonparametric Statistical Model]].
* <B>Example(s):</B>
** [[Nonparametric Regression Algorithm]].
** [[Nonparametric Classification Algorithm]].
** [[Boosting Algorithm]].
** [[Principal Components Analysis Algorithm]].
** [[Nonparametric Bayes Algorithm]].
** [[FSF Algorithm]]
** [[EM Algorithm]].
* <B>Counter-Example(s):</B>
** a [[Parametric Statistical Modeling Algorithm]].
* <B>See:</B> [[Gaussian Process Algorithm]], [[Distribution-Free Statistic]], [[Distribution Function]].
----
==References==
* http://en.wikipedia.org/wiki/Non-parametric_statistics
 
===2009===
* ([[2009_StatisticalMachineLearningCourse10-702|Lafferty & Wasserman, 2009]]) &rArr; [[John D. Lafferty]], and [[Larry Wasserman]]. ([[2009]]). “[http://www.cs.cmu.edu/~10702/ Statistical Machine Learning - Course: 10-702].” Spring 2009, Carnegie Mellon University.
** <B>[[Nonparametric Statistical Modeling Algorithm|Nonparametric methods]]</B>: [[Nonparametric Regression]] and [[Density Estimation]], [[Nonparametric Classification]], [[Boosting]], [[Clustering]] and [[Dimension Reduction]], [[PCA]], [[Manifold Method]]s, [[Principal Curves]], [[Spectral Method]]s, [[The Bootstrap]] and [[Subsampling]], [[Nonparametric Bayes]].
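To illustrate one of the nonparametric methods listed above, [[Density Estimation]], here is a minimal Gaussian [[kernel density estimation]] sketch in plain Python (not from this source; the fixed bandwidth value is an arbitrary assumption for illustration):

```python
import math

def gaussian_kde(samples, bandwidth=0.5):
    """Return a density estimate built directly from the samples.

    Nonparametric: the "model" is the sample set itself, so its
    complexity grows with the number of observations.
    """
    n = len(samples)

    def density(x):
        total = 0.0
        for s in samples:
            u = (x - s) / bandwidth
            # Gaussian kernel centered on each observed sample.
            total += math.exp(-0.5 * u * u) / math.sqrt(2 * math.pi)
        return total / (n * bandwidth)

    return density

# Usage: estimate a density from a handful of observations.
f = gaussian_kde([0.0, 0.2, -0.1, 3.0, 3.1])
# The estimate is bimodal: mass near 0 and near 3, little in between.
assert f(0.0) > f(1.5) and f(3.0) > f(1.5)
```

No parametric family (e.g. a single Gaussian) is assumed; every observation contributes its own kernel to the estimate.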
<BR>
* (Ghahramani, 2009) &rArr; [[Zoubin Ghahramani]]. ([[2009]]). http://learning.eng.cam.ac.uk/zoubin/nonparam.html
** [[Non-parametric models]] are very flexible [[statistical models]] in which the complexity of the model grows with the amount of [[observed data]]. While traditional [[parametric models]] make strong assumptions about how the data was generated, [[non-parametric models]] try to make weaker assumptions and let the data "speak for itself". Many [[non-parametric models]] can be seen as infinite limits of [[finite parametric models]], and an important family of [[non-parametric models]] are derived from [[Dirichlet process]]es. See also [[Gaussian Process]]es.
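** Ghahramani's point that "the complexity of the model grows with the amount of observed data" can be illustrated with a k-nearest-neighbour regressor, a standard nonparametric method (chosen here for illustration, not taken from this source): the fitted "model" is just the stored training data, whereas a parametric model would compress it into a fixed-size parameter vector. A minimal sketch:

```python
def knn_regress(train_x, train_y, x, k=3):
    """Predict y at x by averaging the k nearest training targets.

    Nonparametric: prediction consults the raw training set directly;
    no fixed-size parameter vector summarizes the data.
    """
    by_distance = sorted(zip(train_x, train_y), key=lambda p: abs(p[0] - x))
    nearest = by_distance[:k]
    return sum(y for _, y in nearest) / k

# Usage: the training data *is* the model.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.0, 1.0, 4.0, 9.0, 16.0]  # noiseless y = x^2
assert knn_regress(xs, ys, 2.0, k=1) == 4.0
assert knn_regress(xs, ys, 2.0, k=3) == (1.0 + 4.0 + 9.0) / 3
```

Adding more training points refines the predictor everywhere without refitting any parameters, which is the sense in which the model's complexity grows with the observed data.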
 
===2004===
* ([[2004_BoostedLasso|Zhao & Yu, 2004]]) &rArr; Peng Zhao, and Bin Yu. (2004). "[http://www.stat.berkeley.edu/users/binyu/ps/blasso.sub.pdf Boosted Lasso]." Tech Report, Statistics Department, U. C. Berkeley.
** ... FSF exists as a compromise since, like Boosting, it is a <B>[[Nonparametric Learning Algorithm|nonparametric learning algorithm]]</B> that works with different loss functions and large numbers of base ...
 
===1999===
* ([[1999_NonparametricStatisticalMethods|Hollander & Wolfe, 1999]]) &rArr; Myles Hollander, and Douglas A. Wolfe. (1999). "[http://books.google.com/books?id=RJAQAQAAIAAJ Nonparametric Statistical Methods, 2nd Edition]." Wiley. ISBN:0471190454
** Roughly speaking, a <B>[[Nonparametric Statistical Modeling Algorithm|nonparametric procedure]]</B> is a [[Statistical Modeling Algorithm|statistical procedure]] that has certain desirable properties that hold under relatively mild assumptions regarding the [[underlying populations]] from which the data are obtained. The rapid and continuous development of <B>[[Nonparametric Statistical Modeling Algorithm|nonparametric statistical procedures]]</B> over the past six decades is due to the following advantages enjoyed by nonparametric techniques: ...
** The term ''nonparametric'', introduced in Section 1.1, is imprecise. The related term ''distribution-free'' has a precise meaning. ...
 
===1998===
* ([[1998_LocalAdaptiveSubspaceRegression|Vijayakumar & Schaal, 1998]]) &rArr; Sethu Vijayakumar, and Stefan Schaal. (1998). "[http://www-clmc.usc.edu/publications/V/vijayakumar-NPL1998.pdf Local Adaptive Subspace Regression]." In: Neural Processing Letters, 7(3). [http://dx.doi.org/10.1023/A:1009696221209 doi:10.1023/A:1009696221209]
** ... Based on this, [[we]] developed a <B>[[Nonparametric Learning Algorithm|nonparametric learning algorithm]]</B> which is targeted to make use of such locally low dimensional distributions. ...
 
----
 
__NOTOC__
[[Category:Concept]]
