Boosted Decision Tree Algorithm
Latest revision as of 02:43, 27 March 2024
A Boosted Decision Tree Algorithm is a boosting algorithm that uses a decision tree learning algorithm to produce its base learners.
- Context:
- It can be implemented in a Boosted Decision Tree System (that can solve a Boosted decision tree task).
- Example(s):
- AdaBoost.
- an XGBoost Algorithm (for XGBoost).
- a MART Algorithm (for MART).
- Counter-Example(s):
- LogitBoost Algorithm.
- Gradient Boosted Trees.
- See: Boosted Lasso.
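To make the definition above concrete, here is a minimal from-scratch sketch of boosting with decision trees in the AdaBoost style, using depth-1 trees (decision stumps) as the base learners. The toy dataset, helper names, and the round count are illustrative assumptions, not taken from any of the cited works; a production system would use a library such as XGBoost instead.

```python
import math

# Hypothetical toy 1-D dataset: points with +/-1 labels (illustrative only).
X = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
y = [1, 1, 1, -1, -1, -1, 1, 1]

def stump_predict(threshold, polarity, x):
    # A depth-1 decision tree: predict `polarity` if x <= threshold, else -polarity.
    return polarity if x <= threshold else -polarity

def best_stump(X, y, w):
    # Exhaustively pick the stump minimizing the weighted classification error.
    best = None
    for t in sorted(set(X)):
        for p in (1, -1):
            err = sum(wi for xi, yi, wi in zip(X, y, w)
                      if stump_predict(t, p, xi) != yi)
            if best is None or err < best[0]:
                best = (err, t, p)
    return best

def adaboost(X, y, rounds=3):
    n = len(X)
    w = [1.0 / n] * n          # uniform initial example weights
    ensemble = []
    for _ in range(rounds):
        err, t, p = best_stump(X, y, w)
        err = max(err, 1e-10)  # guard against log(0) for a perfect stump
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, t, p))
        # Re-weight: misclassified points gain weight, so the next
        # stump focuses on the examples the ensemble still gets wrong.
        w = [wi * math.exp(-alpha * yi * stump_predict(t, p, xi))
             for xi, yi, wi in zip(X, y, w)]
        z = sum(w)
        w = [wi / z for wi in w]
    return ensemble

def predict(ensemble, x):
    # Weighted vote of all stumps.
    score = sum(a * stump_predict(t, p, x) for a, t, p in ensemble)
    return 1 if score >= 0 else -1

ensemble = adaboost(X, y)
print([predict(ensemble, xi) for xi in X])  # → [1, 1, 1, -1, -1, -1, 1, 1]
```

No single stump can separate this interleaved labeling, but the three-stump ensemble classifies every point correctly, which is the core idea behind the boosted-tree methods listed above.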
References
2016
- (Nielsen, 2016) ⇒ Didrik Nielsen. (2016). “Tree Boosting With XGBoost - Why Does XGBoost Win 'Every' Machine Learning Competition?”
- QUOTE: Tree boosting has empirically proven to be a highly effective approach to predictive modeling. It has shown remarkable results for a vast array of problems.
2006
- (Caruana & Niculescu-Mizil, 2006) ⇒ Rich Caruana, and Alexandru Niculescu-Mizil. (2006). “An Empirical Comparison of Supervised Learning Algorithms.” In: Proceedings of the 23rd International Conference on Machine Learning. ISBN:1-59593-383-2. doi:10.1145/1143844.1143865.
- QUOTE: A number of supervised learning methods have been introduced in the last decade. Unfortunately, the last comprehensive empirical evaluation of supervised learning was the Statlog Project in the early 90's. We present a large-scale empirical comparison between ten supervised learning methods: SVMs, neural nets, logistic regression, naive bayes, memory-based learning, random forests, decision trees, bagged trees, boosted trees, and boosted stumps. We also examine the effect that calibrating the models via Platt Scaling and Isotonic Regression has on their performance.