Regularized Gaussian Discriminant Analysis Through Eigenvalue Decomposition
- 1 December 1996
- journal article
- research article
- Published by JSTOR in Journal of the American Statistical Association
- Vol. 91 (436), 1743
- https://doi.org/10.2307/2291604
Abstract
Friedman proposed a regularization technique (RDA) of discriminant analysis in the Gaussian framework. RDA uses two regularization parameters to design an intermediate classifier between the linear, the quadratic, and the nearest-means classifiers. In this article we propose an alternative approach, called EDDA, that is based on the reparameterization of the covariance matrix Σ_k of a group G_k in terms of its eigenvalue decomposition Σ_k = λ_k D_k A_k D_k′, where λ_k specifies the volume of the density contours of G_k, the diagonal matrix of eigenvalues A_k specifies its shape, and the eigenvectors D_k specify its orientation. Variations on constraints concerning the volumes λ_k, the shapes A_k, and the orientations D_k lead to 14 discrimination models of interest. For each model, we derived the normal theory maximum likelihood parameter estimates. Our approach consists of selecting a model by minimizing the sample-based estimate of future misclassification risk by cross-validation. Numerical experiments on simulated and rea...
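The sketch below (not the authors' code; the function names and NumPy usage are my own illustration) shows the reparameterization described in the abstract: a covariance matrix is split into a volume term λ_k = |Σ_k|^(1/d), a normalized diagonal shape matrix A_k with det(A_k) = 1, and an orthogonal orientation matrix D_k of eigenvectors, and then reassembled as Σ_k = λ_k D_k A_k D_k′.

```python
# Minimal sketch of the eigenvalue reparameterization used by EDDA,
# assuming only the decomposition Sigma_k = lambda_k * D_k * A_k * D_k'
# as stated in the abstract (helper names are hypothetical).
import numpy as np

def decompose_covariance(sigma):
    """Split a covariance matrix into volume, shape, and orientation."""
    d = sigma.shape[0]
    eigvals, eigvecs = np.linalg.eigh(sigma)            # Sigma = D diag(eigvals) D'
    eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]  # sort eigenvalues descending
    volume = np.prod(eigvals) ** (1.0 / d)              # lambda_k = |Sigma_k|^(1/d)
    shape = np.diag(eigvals / volume)                   # A_k, with det(A_k) = 1
    orientation = eigvecs                               # D_k (orthogonal)
    return volume, shape, orientation

def recompose_covariance(volume, shape, orientation):
    """Reconstruct Sigma_k = lambda_k D_k A_k D_k'."""
    return volume * orientation @ shape @ orientation.T

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))
    sigma = np.cov(X, rowvar=False)
    lam, A, D = decompose_covariance(sigma)
    assert np.allclose(recompose_covariance(lam, A, D), sigma)
    print("volume lambda_k:", lam)
    print("shape A_k diagonal:", np.diag(A))
```

Constraining λ_k, A_k, or D_k to be common across groups (or to the identity) yields the family of parsimonious models the article enumerates; the paper itself selects among them by cross-validated misclassification risk.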