AdaBoost Algorithm


An AdaBoost Algorithm is an iterative boosting algorithm in which each iteration assigns to each training record a weight equal to the current error [math]\displaystyle{ E(F_{t-1}(x_i)) }[/math] of the partially built ensemble on that record.
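
Under the common exponential-loss formulation (an assumption here; the definition above does not fix the loss function), this per-record error takes the form [math]\displaystyle{ E(F_{t-1}(x_i)) = e^{-y_i F_{t-1}(x_i)} }[/math], where [math]\displaystyle{ y_i \in \{-1, +1\} }[/math] is the record's label and [math]\displaystyle{ F_{t-1} }[/math] is the ensemble built in the first [math]\displaystyle{ t-1 }[/math] iterations, so misclassified records receive exponentially larger weights.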



References

2013

  • http://en.wikipedia.org/wiki/AdaBoost
    • AdaBoost, short for Adaptive Boosting, is a machine learning algorithm, formulated by Yoav Freund and Robert Schapire.[1] It is a meta-algorithm, and can be used in conjunction with many other learning algorithms to improve their performance. AdaBoost is adaptive in the sense that subsequent classifiers built are tweaked in favor of those instances misclassified by previous classifiers. AdaBoost is sensitive to noisy data and outliers. In some problems, however, it can be less susceptible to the overfitting problem than most learning algorithms. The classifiers it uses can be weak (i.e., display a substantial error rate), but as long as their performance is slightly better than random (i.e., their error rate is smaller than 0.5 for binary classification), they will improve the final model. Even classifiers with an error rate higher than would be expected from a random classifier will be useful, since they will have negative coefficients in the final linear combination of classifiers and hence behave like their inverses.

      AdaBoost generates and calls a new weak classifier in each of a series of rounds [math]\displaystyle{ t = 1,\ldots,T }[/math]. For each call, a distribution of weights [math]\displaystyle{ D_{t} }[/math] is updated that indicates the importance of examples in the data set for the classification. On each round, the weights of each incorrectly classified example are increased, and the weights of each correctly classified example are decreased, so the new classifier focuses on the examples which have so far eluded correct classification.
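
As a concrete illustration of these rounds, the following is a minimal Python sketch of discrete AdaBoost with one-level decision stumps as the weak classifiers (the stump learner, the function names, and the default T=10 are illustrative assumptions, not part of the quoted source):

import numpy as np

def fit_stump(X, y, D):
    """Exhaustively search one-feature threshold stumps for the lowest
    weighted error under the current distribution D."""
    best_stump, best_err = None, np.inf
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (1.0, -1.0):
                pred = np.where(X[:, j] <= thr, -sign, sign)
                err = D[pred != y].sum()
                if err < best_err:
                    best_err, best_stump = err, (j, thr, sign)
    return best_stump

def stump_predict(stump, X):
    j, thr, sign = stump
    return np.where(X[:, j] <= thr, -sign, sign)

def adaboost_fit(X, y, T=10):
    """Discrete AdaBoost over T rounds; labels y must be in {-1, +1}."""
    n = len(y)
    D = np.full(n, 1.0 / n)                   # D_1: uniform weights
    stumps, alphas = [], []
    for t in range(T):
        stump = fit_stump(X, y, D)
        pred = stump_predict(stump, X)
        err = np.clip(D[pred != y].sum(), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1.0 - err) / err)   # weight of this classifier
        stumps.append(stump)
        alphas.append(alpha)
        # Increase the weights of misclassified examples and decrease the
        # rest, then renormalize so D_{t+1} is again a distribution.
        D = D * np.exp(-alpha * y * pred)
        D = D / D.sum()
    return stumps, alphas

def adaboost_predict(stumps, alphas, X):
    """Sign of the weighted vote of all weak classifiers."""
    scores = sum(a * stump_predict(s, X) for s, a in zip(stumps, alphas))
    return np.sign(scores)

# Usage on a toy one-dimensional dataset:
X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([-1.0, -1.0, 1.0, 1.0])
stumps, alphas = adaboost_fit(X, y, T=5)
print(adaboost_predict(stumps, alphas, X))    # [-1. -1.  1.  1.]

Note how the weight update D * exp(-alpha * y * pred) implements the quoted behavior directly: the product y * pred is -1 exactly for misclassified examples, so their weights are multiplied by exp(alpha) > 1 while correctly classified ones shrink.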
