Generative Classification Algorithm

From GM-RKB
A [[Generative Classification Algorithm]] is a [[generative learning algorithm]] that can solve a [[supervised classification task]].
* <B><U>AKA</U>:</B> [[Generative Classifier]].
* <B><U>Context</U>:</B>
** It can range from being a [[Fully-Supervised Generative Classification Algorithm]] to being a [[Semi-Supervised Generative Classification Algorithm]].
** It can be applied to a [[Classification Task]].
** It can estimate a [[Class-Conditional Density]].
** It can use a [[Parametric Model]].
* <B><U>Example(s)</U>:</B>
** a [[Naive-Bayes Classification Algorithm]].
** …
* <B><U>Counter-Example(s)</U>:</B>
** a [[Discriminative Learning Algorithm]].
* <B><U>See</U>:</B> [[Generative Classification Function]], [[Generative Adversarial Network]].
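As a hypothetical illustration (not from the source), the approach described in the context above — estimate per-class [[prior]]s and [[Class-Conditional Density|class-conditional densities]] from labeled data, then assign each point to the [[class]] with the highest [[posterior probability]] — can be sketched in Python. The function names and the Gaussian naive-Bayes-style density model are assumptions for the sketch:

```python
import numpy as np

def fit_generative(X, y):
    """Estimate class priors and per-feature Gaussian class-conditional
    densities p(x | y) from labeled data (a naive-Bayes-style sketch)."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        params[c] = {
            "prior": len(Xc) / len(X),
            "mean": Xc.mean(axis=0),
            "var": Xc.var(axis=0) + 1e-9,  # smoothing to avoid zero variance
        }
    return params

def predict(params, x):
    """Classify x to the class maximizing p(y | x), i.e. p(x | y) p(y)."""
    def log_joint(p):
        # log of prior times a diagonal Gaussian likelihood
        log_lik = -0.5 * np.sum(np.log(2 * np.pi * p["var"])
                                + (x - p["mean"]) ** 2 / p["var"])
        return np.log(p["prior"]) + log_lik
    return max(params, key=lambda c: log_joint(params[c]))

# Two well-separated 2-D classes as toy data.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
params = fit_generative(X, y)
print(predict(params, np.array([0.1, -0.2])))  # expected: 0
print(predict(params, np.array([4.8, 5.3])))   # expected: 1
```

Because the model is fit to the joint distribution, the same fitted <code>params</code> could also be sampled from to generate synthetic <math>(x, y)</math> pairs, which is what distinguishes this from a discriminative classifier.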
 
----
== References ==

== 2004 ==
* ([[2004_TheTradeOffBetweenGenAndDiscrClassifiers|Bouchard & Triggs, 2004]]) ⇒ [[Guillaume Bouchard]], and Bill Triggs. ([[2004]]). "[http://lear.inrialpes.fr/pubs/2004/BT04/Bouchard-compstat04.pdf The Trade-off Between Generative and Discriminative Classifiers]." In: Proceedings of COMPSTAT 2004.
** <U>QUOTE</U>: In [[supervised classification]], inputs <math>x</math> and their labels <math>y</math> arise from an unknown [[joint probability]] <math>p(x,y)</math>. If we can approximate <math>p(x,y)</math> using a [[parametric family of models]] <math>G = \{p_\theta(x,y), \theta \in \Theta\}</math>, then a natural [[classifier]] is obtained by first estimating the [[Class-Conditional Density|class-conditional densities]], then classifying each new [[data point]] to the [[class]] with highest [[posterior probability]]. This approach is called ''generative'' classification. However, if the overall goal is to find the classification rule with the smallest error rate, this depends only on the conditional density <math>p(y \vert x)</math>. ''Discriminative'' methods directly model the conditional distribution, without assuming anything about the input distribution <math>p(x)</math>. <P> Well known [[generative-discriminative pair]]s include [[Linear Discriminant Analysis (LDA)]] vs. [[linear logistic regression]] and [[naive Bayes]] vs. [[Generalized Additive Models (GAM)]]. Many authors have already studied these models, e.g. [5,6]. Under the assumption that the underlying [[distribution]]s are [[Gaussian]] with equal [[covariance]]s, it is known that [[LDA]] requires less data than its discriminative counterpart, [[linear logistic regression]] [3]. More generally, it is known that generative classifiers have a smaller variance than discriminative ones. Conversely, the generative approach converges to the best model for the joint distribution <math>p(x,y)</math>, but the resulting conditional density is usually a biased classifier unless its <math>p_\theta(x)</math> part is an accurate model for <math>p(x)</math>. In real-world problems the assumed generative model is rarely exact, and asymptotically, a [[discriminative classifier]] should typically be preferred [9, 5]. The key argument is that the discriminative estimator converges to the conditional density that minimizes the negative log-likelihood classification loss against the true density <math>p(x,y)</math> [2]. For finite sample sizes, there is a bias-variance tradeoff and it is less obvious how to choose between generative and [[discriminative classifier]]s.
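The decision rule described in the quoted passage follows from [[Bayes' Rule|Bayes' rule]] applied to the fitted joint model:

<math>\hat{y}(x) = \arg\max_{y}\, p_\theta(y \mid x) = \arg\max_{y}\, \frac{p_\theta(x \mid y)\, p_\theta(y)}{\sum_{y'} p_\theta(x \mid y')\, p_\theta(y')} = \arg\max_{y}\, p_\theta(x, y),</math>

since the denominator does not depend on <math>y</math>; maximizing the [[posterior probability]] is therefore equivalent to maximizing the fitted joint density <math>p_\theta(x,y)</math>.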


Latest revision as of 17:51, 4 October 2023