Bias/Variance Decompositions for Likelihood-Based Estimators
- 1 August 1998
- journal article
- Published by MIT Press in Neural Computation
- Vol. 10 (6), 1425-1433
- https://doi.org/10.1162/089976698300017232
Abstract
The bias/variance decomposition of mean-squared error is well understood and relatively straightforward. In this note, a similar simple decomposition is derived, valid for any kind of error measure that, when using the appropriate probability model, can be derived from a Kullback-Leibler divergence or log-likelihood.
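The mean-squared-error decomposition the abstract calls "well understood" can be verified numerically. The sketch below is illustrative only and not taken from the paper: it uses a hypothetical shrunk-sample-mean estimator (so that both bias and variance are nonzero) and checks that the MSE against the true parameter equals squared bias plus variance, an exact algebraic identity over the simulated estimates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: estimate the mean of a Gaussian from small samples.
true_mean = 2.0
noise_sd = 1.0
n_train, n_repeats = 5, 200_000

# Each row is one training set; the estimator is a deliberately shrunk
# sample mean, so it has both bias and variance.
samples = rng.normal(true_mean, noise_sd, size=(n_repeats, n_train))
estimates = 0.8 * samples.mean(axis=1)

# Mean-squared error of the estimator against the true parameter ...
mse = np.mean((estimates - true_mean) ** 2)

# ... decomposes exactly into squared bias plus variance
# (np.var uses ddof=0, which makes the identity hold to rounding error).
bias_sq = (estimates.mean() - true_mean) ** 2
variance = estimates.var()

assert abs(mse - (bias_sq + variance)) < 1e-9
```

The paper's contribution is an analogous decomposition for Kullback-Leibler/log-likelihood error measures rather than this squared-error case.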
This publication has 8 references indexed in Scilit:
- On Bias Plus Variance. Neural Computation, 1997
- Using Neural Networks to Model Conditional Multivariate Densities. Neural Computation, 1996
- Methods For Combining Experts' Probability Assessments. Neural Computation, 1995
- Solving Multiclass Learning Problems via Error-Correcting Output Codes. Journal of Artificial Intelligence Research, 1995
- Neural Networks and the Bias/Variance Dilemma. Neural Computation, 1992
- On Kullback-Leibler Loss and Density Estimation. The Annals of Statistics, 1987
- Combining Probability Distributions: A Critique and an Annotated Bibliography. Statistical Science, 1986
- A Multiplicative Formula for Aggregating Probability Assessments. Management Science, 1982