Distributed coding for data representation of back-propagation neural network classifiers
- 12 October 1995
- journal article
- Published by Institution of Engineering and Technology (IET) in Electronics Letters
- Vol. 31 (21), 1852-1854
- https://doi.org/10.1049/el:19951244
Abstract
A new distributed input coding is derived by distributing the feature variables over a number of input nodes based on the distribution of the training data. Using this coding representation, the range of each input node is fully optimised, which enables the network to converge at a higher rate during training. The coding method also allows the network to maintain the generalisation capability of conventional normalisation coding.
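The abstract does not spell out the encoding rule, so the following is only a minimal sketch of one plausible reading: each feature is split over several input nodes whose sub-ranges are taken from training-data quantiles, so that every node's [0, 1] range is fully exercised. The function and variable names (`distributed_encode`, `boundaries`) are hypothetical and not taken from the paper.

```python
import numpy as np

def distributed_encode(x, boundaries):
    """Encode one scalar feature over several input nodes.

    `boundaries` holds the edges of the sub-ranges (e.g. training-data
    quantiles). The node whose sub-range contains `x` receives the value
    rescaled to [0, 1] within that sub-range; nodes covering lower
    sub-ranges saturate at 1 and nodes covering higher sub-ranges stay at 0.
    This is an illustrative assumption, not necessarily the authors' scheme.
    """
    lo, hi = boundaries[:-1], boundaries[1:]
    return np.clip((x - lo) / (hi - lo), 0.0, 1.0)

# Hypothetical usage: choose sub-range edges from the empirical distribution
# of the training data so each input node's range is fully used.
train_feature = np.random.exponential(scale=2.0, size=1000)
boundaries = np.quantile(train_feature, [0.0, 0.25, 0.5, 0.75, 1.0])
print(distributed_encode(train_feature[0], boundaries))  # values for 4 input nodes
```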