1999 MetaCost: A General Method for Making Classifiers Cost-Sensitive

From GM-RKB

Subject Headings: MetaCost

Notes

Cited By

Quotes

Abstract

Research in machine learning, statistics and related fields has produced a wide variety of algorithms for classification. However, most of these algorithms assume that all errors have the same cost, which is seldom the case in KDD problems. Individually making each classification learner cost-sensitive is laborious, and often non-trivial. In this paper we propose a principled method for making an arbitrary classifier cost-sensitive by wrapping a cost-minimizing procedure around it. This procedure, called MetaCost, treats the underlying classifier as a black box, requiring no knowledge of its functioning or change to it. Unlike stratification, MetaCost is applicable to any number of classes and to arbitrary cost matrices. Empirical trials on a large suite of benchmark databases show that MetaCost almost always produces large cost reductions compared to the cost-blind classifier used (C4.5RULES) and to two forms of stratification. Further tests identify the key components of MetaCost and those that can be varied without substantial loss. Experiments on a larger database indicate that MetaCost scales well.
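The abstract describes the wrapper only at a high level: MetaCost estimates class probabilities by bagging the underlying learner, relabels each training example with the class of minimum expected cost, and then retrains the cost-blind learner on the relabeled data. Below is a minimal sketch of that procedure, assuming scikit-learn as the black-box classifier interface; the function name `metacost` and its parameters are illustrative, not from the paper, and the paper's own experiments use C4.5RULES rather than a decision tree.

```python
import numpy as np
from sklearn.base import clone
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

def metacost(base_estimator, X, y, cost_matrix, n_resamples=10, rng=None):
    """Sketch of the MetaCost wrapper (treats base_estimator as a black box).

    cost_matrix[i, j] = cost of predicting class i when the true class is j.
    Assumes labels are 0..K-1.
    """
    rng = np.random.default_rng(rng)
    n, n_classes = len(X), cost_matrix.shape[0]

    # Step 1: estimate P(j|x) by averaging over bootstrap-resampled models.
    probs = np.zeros((n, n_classes))
    for _ in range(n_resamples):
        idx = rng.integers(0, n, size=n)              # bootstrap resample
        m = clone(base_estimator).fit(X[idx], y[idx])
        probs[:, m.classes_] += m.predict_proba(X)    # align to global labels
    probs /= n_resamples

    # Step 2: relabel each example with the minimum-expected-cost class,
    # i.e. argmin_i sum_j P(j|x) * cost_matrix[i, j].
    expected_cost = probs @ cost_matrix.T
    y_relabeled = expected_cost.argmin(axis=1)

    # Step 3: retrain the cost-blind learner on the relabeled training set.
    return clone(base_estimator).fit(X, y_relabeled)

# Illustrative usage: missing class 1 is ten times costlier than the reverse,
# so the wrapped model is pushed toward predicting class 1 more often.
X, y = make_classification(n_samples=300, random_state=0)
C = np.array([[0.0, 10.0],
              [1.0,  0.0]])
model = metacost(DecisionTreeClassifier(random_state=0), X, y, C, rng=0)
```

Because the wrapper only needs `fit` and `predict_proba`, any probabilistic classifier can be dropped in without modification, which is the sense in which MetaCost is classifier-agnostic.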

References

Pedro Domingos. (1999). "MetaCost: A General Method for Making Classifiers Cost-Sensitive." In: Proceedings of the Fifth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD-1999). doi:10.1145/312129.312220