MetaCost
- 1 August 1999
- proceedings article
- Published by Association for Computing Machinery (ACM)
- p. 155-164
- https://doi.org/10.1145/312129.312220
Abstract
Research in machine learning, statistics, and related fields has produced a wide variety of algorithms for classification. However, most of these algorithms assume that all errors have the same cost, which is seldom the case in KDD problems. Individually making each classification learner cost-sensitive is laborious, and often non-trivial. In this paper we propose a principled method for making an arbitrary classifier cost-sensitive by wrapping a cost-minimizing procedure around it. This...
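The abstract only sketches the wrapper idea. Below is a minimal, hypothetical Python sketch of one way such a cost-minimizing wrapper can work: estimate class probabilities with bagged copies of the base learner, relabel each training example with the class of minimum expected cost, then retrain the unchanged learner on the relabeled data. The function name `metacost_relabel`, the scikit-learn-compatible interface (`fit` / `predict_proba`), and the cost-matrix convention are illustrative assumptions, not details taken from this page.

```python
import numpy as np
from sklearn.base import clone
from sklearn.utils import resample


def metacost_relabel(base_estimator, X, y, cost_matrix, n_bags=10, random_state=0):
    """Sketch of a cost-minimizing wrapper around an arbitrary classifier.

    cost_matrix[j, i] is assumed to be the cost of predicting class i when the
    true class is j (an illustrative convention, not taken from the paper).
    """
    rng = np.random.RandomState(random_state)
    classes = np.unique(y)
    proba = np.zeros((len(X), len(classes)))

    # Average class-probability estimates over bootstrap-trained models.
    for _ in range(n_bags):
        Xb, yb = resample(X, y, random_state=rng)
        model = clone(base_estimator).fit(Xb, yb)
        p_b = model.predict_proba(X)
        # Align columns in case a bootstrap sample missed a class.
        for col, c in enumerate(model.classes_):
            proba[:, np.searchsorted(classes, c)] += p_b[:, col]
    proba /= n_bags

    # Expected cost of predicting class i: sum_j P(j|x) * cost_matrix[j, i].
    expected_cost = proba @ cost_matrix
    y_relabel = classes[np.argmin(expected_cost, axis=1)]

    # Final model: the unchanged base learner, retrained on relabeled data.
    return clone(base_estimator).fit(X, y_relabel)
```

Any learner exposing `fit` and `predict_proba` (e.g., a decision tree) could be plugged in, which mirrors the black-box spirit of wrapping rather than modifying the classifier.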