# Probabilistic Classification Task

A Probabilistic Classification Task is a supervised classification task that is also a probabilistic prediction task: the classifier must output a probability value for each predicted class, not just a class label.

**AKA:** Class Prediction with a Probability Value.

**Context:**
- It can (typically) require a Probabilistic Classification Model.
- It can range from being a Fully-Supervised Probabilistic Classification Task to being a Semi-Supervised Probabilistic Classification Task.
- It can range from being a Probabilistic Binary Classification Task to being a Probabilistic Multiclass Classification Task.
- It can be solved by a Probabilistic Classification System (that implements a probabilistic classification algorithm).

**Counter-Example(s):**

**See:** Machine Learning, Statistical Classification, Probability Distribution, Ensemble Classifier, Conditional Probability, Bayes Estimator, Binomial Regression, Statistics, Econometrics, Discrete Choice, Naive Bayes Classifier, Logistic Regression, Multilayer Perceptron.

## References

### 2014

- (Wikipedia, 2014) ⇒ http://en.wikipedia.org/wiki/probabilistic_classification Retrieved: 2014-11-10.
- In machine learning, a **probabilistic classifier** is a classifier that is able to predict, given a sample input, a probability distribution over a set of classes, rather than only predicting a class for the sample. Probabilistic classifiers provide classification with a degree of certainty, which can be useful in its own right, or when combining classifiers into ensembles. Formally, a probabilistic classifier is a conditional distribution [math]\operatorname{P}(Y \vert X)[/math] over a finite set of classes [math]Y[/math], given inputs [math]X[/math]. Deciding on the best class label [math]\hat{y}[/math] for [math]X[/math] can then be done using the optimal decision rule (Bishop, 2006): [math]\hat{y} = \operatorname{\arg\max}_{y} \operatorname{P}(Y=y \vert X)[/math]. Binary probabilistic classifiers are also called binomial regression models in statistics. In econometrics, probabilistic classification in general is called discrete choice.
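The optimal decision rule above can be sketched in a few lines of Python. The posterior values here are hypothetical, chosen only to illustrate how a probabilistic classifier's output is reduced to a single class label:

```python
# Hypothetical posterior distribution P(Y = y | X = x) for one input x,
# over three illustrative class labels (values are made up; they sum to 1).
posterior = {"cat": 0.2, "dog": 0.7, "bird": 0.1}

def decide(posterior):
    """Optimal decision rule: y_hat = argmax_y P(Y = y | X = x)."""
    return max(posterior, key=posterior.get)

y_hat = decide(posterior)  # -> "dog"
```

Note that the full distribution is still available alongside `y_hat`, which is what distinguishes a probabilistic classifier from one that only emits a hard label.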

Some classification models, such as naive Bayes, logistic regression and multilayer perceptrons (when trained under an appropriate loss function) are naturally probabilistic. Other models such as support vector machines are not, but methods exist to turn them into probabilistic classifiers.
