Parallel, self-organizing, hierarchical neural networks
- 1 June 1990
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 1 (2), 167-178
- https://doi.org/10.1109/72.80229
Abstract
A new neural-network architecture called the parallel, self-organizing, hierarchical neural network (PSHNN) is presented. The architecture consists of a number of stages, each of which can be a particular neural network, called a stage neural network (SNN). At the end of each stage, error detection is carried out and a number of input vectors are rejected; the rejected vectors undergo a nonlinear transformation before being passed to the next stage. The architecture has several desirable properties: optimized system complexity, in the sense of a self-organized, minimal number of stages; high classification accuracy; minimized learning and recall times; and a truly parallel structure in which, during testing, all stages operate simultaneously without waiting for data from one another. Experiments indicated the superiority of the new architecture over multilayered networks with back-propagation training.
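The stage/reject/transform pipeline described in the abstract can be sketched in code. The following is a minimal illustration under stated assumptions, not the paper's implementation: the delta-rule linear classifier used for each SNN, the top-two-score margin test used for error detection and rejection, and the elementwise tanh used as the inter-stage nonlinear transform are all assumed choices.

```python
import numpy as np

def train_stage(X, y, n_classes, epochs=50, lr=0.1):
    """Train one stage network (SNN); here, an assumed delta-rule linear classifier."""
    W = np.zeros((X.shape[1], n_classes))
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = int((xi @ W).argmax())
            if pred != yi:               # update weights only on misclassification
                W[:, yi] += lr * xi
                W[:, pred] -= lr * xi
    return W

def reject_mask(scores, margin=0.5):
    """Error detection: reject vectors whose top-two score margin is small (assumed rule)."""
    top2 = np.sort(scores, axis=1)[:, -2:]
    return (top2[:, 1] - top2[:, 0]) < margin

def nonlinear_transform(X):
    """Nonlinear re-mapping of rejected vectors between stages (assumed choice)."""
    return np.tanh(2.0 * X)

def train_pshnn(X, y, n_classes, max_stages=3):
    """Self-organizing growth: add stages until no input vectors are rejected."""
    stages, Xs, ys = [], X, y
    for _ in range(max_stages):
        W = train_stage(Xs, ys, n_classes)
        stages.append(W)
        rej = reject_mask(Xs @ W)
        if not rej.any():                # no rejects: the network stops growing
            break
        Xs, ys = nonlinear_transform(Xs[rej]), ys[rej]
    return stages

def predict(stages, X):
    """Recall: the first non-rejecting stage labels a vector; the last stage labels the rest."""
    out = np.full(len(X), -1)
    Xs, pending = X, np.arange(len(X))
    for i, W in enumerate(stages):
        scores = Xs @ W
        rej = reject_mask(scores)
        if i == len(stages) - 1:         # final stage must decide everything left
            rej[:] = False
        out[pending[~rej]] = scores[~rej].argmax(axis=1)
        if not rej.any():
            break
        Xs, pending = nonlinear_transform(Xs[rej]), pending[rej]
    return out

# Toy usage on two Gaussian classes.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (100, 2)), rng.normal(1, 1, (100, 2))])
y = np.repeat([0, 1], 100)
stages = train_pshnn(X, y, n_classes=2)
acc = (predict(stages, X) == y).mean()
print(f"{len(stages)} stage(s), training accuracy {acc:.2f}")
```

Because each stage's input depends only on the fixed chain of transforms, all stages can in principle run concurrently at recall time, which is the parallelism the abstract highlights; the sequential loop in `predict` is only for clarity.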