1996 Experiments with a New Boosting Algorithm


Subject Headings: AdaBoost Algorithm.

Notes

Cited By

Quotes

Abstract

In an earlier paper [9], we introduced a new “boosting” algorithm called AdaBoost which, theoretically, can be used to significantly reduce the error of any learning algorithm that consistently generates classifiers whose performance is a little better than random guessing. We also introduced the related notion of a “pseudo-loss”, which is a method for forcing a learning algorithm for multi-label concepts to concentrate on the labels that are hardest to discriminate. In this paper, we describe experiments we carried out to assess how well AdaBoost, with and without pseudo-loss, performs on real learning problems.
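The core of AdaBoost is a reweighting loop: each round fits a weak learner to the current example weights, then increases the weight of misclassified examples so later learners concentrate on them, and the final classifier is a weighted vote. Below is a minimal sketch of the two-class version in the now-common ±1 formulation; the paper's AdaBoost.M1 uses an equivalent parameterization with β_t = ε_t / (1 − ε_t), and the `weak_learn` callable here is a hypothetical placeholder for any weak learner that accepts example weights.

```python
import numpy as np

def adaboost(X, y, weak_learn, T=50):
    """Two-class AdaBoost sketch. Assumes y in {-1, +1} and that
    weak_learn(X, y, w) returns a callable h with h(X) -> {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)                 # start with uniform weights
    hs, alphas = [], []
    for _ in range(T):
        h = weak_learn(X, y, w)
        pred = h(X)
        eps = w[pred != y].sum()            # weighted training error
        if eps >= 0.5:                      # no better than random guessing: stop
            break
        eps = max(eps, 1e-12)               # guard against division by zero
        alpha = 0.5 * np.log((1.0 - eps) / eps)
        w *= np.exp(-alpha * y * pred)      # up-weight mistakes, down-weight hits
        w /= w.sum()                        # renormalize to a distribution
        hs.append(h)
        alphas.append(alpha)

    def classify(Xq):
        # Weighted majority vote over all weak hypotheses.
        votes = sum(a * h(Xq) for a, h in zip(alphas, hs))
        return np.sign(votes)

    return classify
```

The early-exit when the weighted error reaches 1/2 reflects the theory the abstract alludes to: the error-reduction guarantee holds only while each weak hypothesis stays strictly better than random guessing on the current weighting.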

We performed two sets of experiments. The first set compared boosting to [[Breiman’s [1] “bagging” method]] when used to aggregate various classifiers (including decision trees and single attribute-value tests). We compared the performance of the two methods on a collection of machine-learning benchmarks. In the second set of experiments, we studied in more detail the performance of boosting using a nearest-neighbor classifier on an OCR problem.
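For a concrete sense of the bagging-versus-boosting comparison described above, the sketch below runs both ensemble methods over the same weak base classifier on scikit-learn's digits data (an OCR-style benchmark). This is only an illustrative stand-in, not the paper's setup: the experiments there used base learners such as decision trees, single attribute-value tests, and a nearest-neighbor classifier on their own benchmark collection.

```python
from sklearn.datasets import load_digits
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_digits(return_X_y=True)          # 8x8 handwritten-digit images

# A depth-1 tree ("decision stump") stands in for the paper's weak learners.
stump = DecisionTreeClassifier(max_depth=1)

bagging = BaggingClassifier(stump, n_estimators=100, random_state=0)
boosting = AdaBoostClassifier(stump, n_estimators=100, random_state=0)

for name, model in [("bagging", bagging), ("boosting", boosting)]:
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```

Both methods aggregate many runs of the same weak learner; the design difference the paper probes is that bagging resamples the training set uniformly and independently, while boosting adaptively reweights it toward hard examples.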



References


Yoav Freund, and Robert E. Schapire. (1996). “Experiments with a New Boosting Algorithm.” In: Proceedings of the Thirteenth International Conference on Machine Learning (ICML 1996).