Gradient Descent Boosting-based Decision Tree (GDBT) Learning Algorithm

Revision as of 18:52, 24 January 2024 by Gmelli.

A Gradient Descent Boosting-based Decision Tree (GDBT) Learning Algorithm is a gradient descent boosting-based learning algorithm that combines gradient descent optimization with decision tree models: each new tree is fit to the negative gradient of the loss (the pseudo-residuals), so that it corrects the errors of the trees trained before it.
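The idea can be sketched with a minimal from-scratch implementation for squared-error loss, where the negative gradient is simply the residual y − F(x). The sketch below assumes scikit-learn's DecisionTreeRegressor as the base learner; the function names and hyperparameter values are illustrative choices, not part of any particular GBDT system.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gbdt_fit(X, y, n_trees=50, learning_rate=0.1, max_depth=2):
    """Gradient boosting for squared loss: each tree is fit to the
    negative gradient of 0.5*(y - F(x))^2, i.e. the residual y - F(x)."""
    f0 = float(np.mean(y))                  # initial constant prediction
    pred = np.full_like(y, f0, dtype=float)
    trees = []
    for _ in range(n_trees):
        residuals = y - pred                # pseudo-residuals
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, residuals)
        pred += learning_rate * tree.predict(X)  # shrunken additive update
        trees.append(tree)
    return f0, trees

def gbdt_predict(X, f0, trees, learning_rate=0.1):
    """Sum the initial constant and the shrunken tree corrections."""
    pred = np.full(X.shape[0], f0)
    for tree in trees:
        pred += learning_rate * tree.predict(X)
    return pred

# Toy data: noisy quadratic target.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = X[:, 0] ** 2 + rng.normal(0, 0.1, size=200)

f0, trees = gbdt_fit(X, y)
yhat = gbdt_predict(X, f0, trees)
print("training MSE:", np.mean((y - yhat) ** 2))  # shrinks as trees are added
```

Each tree in the ensemble is shallow and weak on its own; the learning rate shrinks each correction, trading more trees for better generalization.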


References

2018




  • (Prokhorenkova et al., 2018) ⇒ Liudmila Prokhorenkova, Gleb Gusev, Aleksandr Vorobev, Anna Veronika Dorogush, and Andrey Gulin. (2018). “CatBoost: Unbiased Boosting with Categorical Features.” In: Proceedings of Advances in Neural Information Processing Systems 31.

2009

  • http://www.dtreg.com/treeboost.htm
    • TreeBoost - Stochastic Gradient Boosting
    • "Boosting" is a technique for improving the accuracy of a predictive function by applying the function repeatedly in a series and combining the output of each function with weighting so that the total error of the prediction is minimized. In many cases, the predictive accuracy of such a series greatly exceeds the accuracy of the base function used alone.
    • The TreeBoost algorithm used by DTREG is optimized for improving the accuracy of models built on decision trees. Research has shown that models built using TreeBoost are among the most accurate of any known modeling technique.
    • The TreeBoost algorithm is functionally similar to Decision Tree Forests in that it creates a tree ensemble and uses randomization during tree creation. However, a random forest builds its trees in parallel and has them "vote" on the prediction, whereas TreeBoost builds a series of trees in which each tree incrementally improves the prediction of the series so far.
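The parallel-versus-series distinction can be made concrete with a small sketch (assumptions: scikit-learn regression trees, a toy sine target, and illustrative hyperparameters). The forest averages trees trained independently on bootstrap samples of the same target, while the boosted series fits each tree to the residual left by its predecessors.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(300, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.1, size=300)
n_trees = 25

# Forest-style ensemble: each tree is fit independently to y on a
# bootstrap sample, and the trees "vote" by averaging their predictions.
forest_pred = np.zeros_like(y)
for _ in range(n_trees):
    idx = rng.integers(0, len(y), size=len(y))   # bootstrap resample
    tree = DecisionTreeRegressor(max_depth=3).fit(X[idx], y[idx])
    forest_pred += tree.predict(X) / n_trees     # equal-weight average

# Boosted series: each tree is fit to the current residual, so every
# tree in the series incrementally improves the running prediction.
boost_pred = np.full_like(y, y.mean())
for _ in range(n_trees):
    tree = DecisionTreeRegressor(max_depth=3).fit(X, y - boost_pred)
    boost_pred += 0.1 * tree.predict(X)          # shrunken additive update

print("forest MSE:", np.mean((y - forest_pred) ** 2))
print("boost  MSE:", np.mean((y - boost_pred) ** 2))
```

Both ensembles fit the target well here; the structural difference is that the forest's trees are interchangeable, while the boosted trees are order-dependent because each one depends on the residual left by the previous ones.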

2001