Principal feature classification
- 1 January 1997
- Journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 8 (1), 155-160
- https://doi.org/10.1109/72.554200
Abstract
The concept, structures, and algorithms of principal feature classification (PFC) are presented in this paper. PFC is intended to solve complex classification problems with large data sets. A PFC network is designed by sequentially finding principal features and removing training data that have already been correctly classified. PFC combines advantages of statistical pattern recognition, decision trees, and artificial neural networks (ANNs), providing fast learning with good performance and a simple network structure. For the real-world applications in this paper, PFC provides better performance than conventional statistical pattern recognition, avoids the long training times of backpropagation and other gradient-descent algorithms for ANNs, and provides a low-complexity structure for realization.
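The paper itself gives no code. Purely as an illustration of the sequential design idea described in the abstract (find a discriminating feature, classify, discard the data already handled, repeat on the remainder), the following Python sketch uses a two-class Fisher discriminant direction as a stand-in for a "principal feature." The function names, the thresholding rule, and the stopping conditions are assumptions for this sketch, not the authors' actual PFC algorithm or selection criterion.

```python
import numpy as np

def fisher_direction(X, y):
    """Two-class Fisher discriminant direction.
    Used here only as an illustrative stand-in for a 'principal feature';
    the paper's own feature-selection criterion may differ."""
    X0, X1 = X[y == 0], X[y == 1]
    Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)   # within-class scatter
    w = np.linalg.pinv(Sw) @ (X1.mean(axis=0) - X0.mean(axis=0))
    return w / np.linalg.norm(w)

def sequential_classifier_sketch(X, y, max_stages=5):
    """Hypothetical sketch of a sequential design loop:
    1. find a 1-D discriminant feature on the remaining data,
    2. set a simple midpoint threshold,
    3. remove samples the current stage already classifies correctly,
    4. repeat on what is left."""
    stages = []
    for _ in range(max_stages):
        if len(y) == 0 or len(np.unique(y)) < 2:
            break                                    # nothing left to separate
        w = fisher_direction(X, y)
        z = X @ w                                    # projected 1-D feature
        if z[y == 1].mean() < z[y == 0].mean():
            w, z = -w, -z                            # orient so class 1 projects higher
        thr = 0.5 * (z[y == 0].mean() + z[y == 1].mean())
        stages.append((w, thr))
        correct = (z > thr).astype(int) == y
        if correct.all():
            break                                    # all remaining data classified
        X, y = X[~correct], y[~correct]              # keep only misclassified samples
    return stages
```

In this sketch each stage only has to deal with the data the earlier stages got wrong, which is one way to read the abstract's claim of fast learning and a low-complexity structure; how the actual PFC network combines its stages is described in the paper itself.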