Decision Threshold


A Decision Threshold is a cutoff value applied to a classifier's score or predicted probability: instances scoring above the threshold are assigned to the positive class, and the rest to the negative class.

See: Threshold, Decision Function, ROC Curve, Classification Algorithm, Classification Tree, Probability Threshold.
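As a minimal sketch of the definition above, the following applies a decision threshold to a vector of predicted probabilities (the scores and the `apply_threshold` helper are illustrative, not from any particular library):

```python
import numpy as np

# Hypothetical predicted probabilities for the positive class
# (e.g., the output of a probabilistic classifier).
scores = np.array([0.10, 0.40, 0.55, 0.80, 0.95])

def apply_threshold(scores, threshold=0.5):
    """Assign the positive class (1) to instances whose score
    exceeds the decision threshold, else the negative class (0)."""
    return (scores > threshold).astype(int)

print(apply_threshold(scores))       # default threshold 0.5 -> [0 0 1 1 1]
print(apply_threshold(scores, 0.9))  # stricter threshold    -> [0 0 0 0 1]
```

Varying the threshold trades precision against recall; sweeping it over all values traces out the ROC Curve.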

  • (Chawla et al., 2004) ⇒ Nitesh Chawla, Nathalie Japkowicz, and Aleksander Kolcz. (2004). “Editorial: Special issue on learning from imbalanced data sets.” In: ACM SIGKDD Explorations Newsletter, 6(1). doi:10.1145/1007730.1007733 PDF
    • QUOTE: At the data level, these solutions include many different forms of re-sampling such as random oversampling with replacement, random undersampling, directed oversampling (in which no new examples are created, but the choice of samples to replace is informed rather than random), directed undersampling (where, again, the choice of examples to eliminate is informed), oversampling with informed generation of new samples, and combinations of the above techniques. At the algorithmic level, solutions include adjusting the costs of the various classes so as to counter the class imbalance, adjusting the probabilistic estimate at the tree leaf (when working with decision trees), adjusting the decision threshold, and recognition-based (i.e., learning from one class) rather than discrimination-based (two class) learning.
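The quoted passage mentions adjusting the decision threshold as an algorithmic-level remedy for class imbalance. A small illustrative sketch (the scores, labels, and `recall_at` helper are hypothetical): lowering the threshold below the default 0.5 can recover minority-class positives that the default cutoff misses.

```python
import numpy as np

# Hypothetical scores and true labels for an imbalanced problem:
# positives are the rare class, and most score below 0.5.
scores = np.array([0.15, 0.20, 0.35, 0.40, 0.45, 0.60])
labels = np.array([0,    0,    1,    0,    1,    1])

def recall_at(threshold):
    """Recall on the positive (minority) class at a given threshold."""
    preds = (scores > threshold).astype(int)
    tp = np.sum((preds == 1) & (labels == 1))
    return tp / np.sum(labels == 1)

print(recall_at(0.5))  # default threshold: only 1 of 3 positives found
print(recall_at(0.3))  # lowered threshold: all 3 positives found
```

The cost of lowering the threshold is more false positives, so in practice the threshold is tuned against a cost function or a validation-set metric.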