On the Discriminant Vector Method of Feature Selection
- 1 June 1977
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Computers
- Vol. C-26 (6), 604-606
- https://doi.org/10.1109/tc.1977.1674885
Abstract
The correspondence discusses the relationship between the discriminant vector method of feature selection [1] and the method of Kittler and Young [5]. Although both methods determine the feature-space coordinate axes by maximizing the generalized Fisher criterion of discriminatory power, with the exception of the two-class case the resulting feature spaces are considerably different because of the difference in the constraints imposed on the axes by the individual methods. It is shown that the latter method is, from the point of view of dimensionality reduction, more powerful and also computationally more efficient.
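Both methods build on the same underlying optimization: choosing directions w that maximize the Fisher criterion J(w) = (wᵀS_b w)/(wᵀS_w w), where S_w and S_b are the within-class and between-class scatter matrices. A minimal sketch of this shared core is given below, solving the unconstrained generalized eigenproblem S_b w = λ S_w w; the function name and the regularization-free treatment are illustrative assumptions, and neither paper's specific orthogonality constraints are reproduced here.

```python
import numpy as np

def fisher_discriminant_axes(X, y, n_axes):
    """Illustrative sketch: return n_axes directions maximizing the
    generalized Fisher criterion J(w) = (w' Sb w) / (w' Sw w).
    They are the leading eigenvectors of inv(Sw) @ Sb; this ignores
    the axis constraints that distinguish the two methods compared
    in the correspondence."""
    classes = np.unique(y)
    overall_mean = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - overall_mean)[:, None]
        Sb += len(Xc) * (diff @ diff.T)
    # Generalized eigenproblem Sb w = lambda Sw w, assuming Sw invertible.
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
    order = np.argsort(evals.real)[::-1]  # largest criterion value first
    return evecs[:, order[:n_axes]].real
```

For c classes, S_b has rank at most c - 1, which is why the number of useful discriminant axes obtainable this way is limited, and why the constraints each method imposes matter for dimensionality reduction beyond that bound.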
References
- Discriminative Subspace Method for Minimum Error Pattern Recognition. IEEE, 2005.
- Mathematical methods of feature selection in pattern recognition. International Journal of Man-Machine Studies, 1975.
- An Optimal Set of Discriminant Vectors. IEEE Transactions on Computers, 1975.
- A new approach to feature selection based on the Karhunen-Loeve expansion. Pattern Recognition, 1973.
- Application of the Karhunen-Loève Expansion to Feature Selection and Ordering. IEEE Transactions on Computers, 1970.
- On the generalized Karhunen-Loeve expansion (Corresp.). IEEE Transactions on Information Theory, 1967.