Learning grey-toned patterns in neural networks
- 21 October 1991
- journal article
- Published by IOP Publishing in Journal of Physics A: Mathematical and General
- Vol. 24 (20), 4941-4952
- https://doi.org/10.1088/0305-4470/24/20/023
Abstract
The problem of learning multi-state patterns in neural networks is investigated. An analysis of the space of couplings (Gardner approach) yields the distribution of local fields, the critical storage capacity α_c, and the minimum number of errors for an overloaded network. For noisy local fields the classification error is minimized if the local fields of the patterns are allowed to lie in intervals of finite width. A fast converging, adaptive learning algorithm is presented, which finds the coupling matrix of optimal stability.
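To make the interval condition in the abstract concrete, here is a minimal sketch, not the adaptive algorithm presented in the paper, of a perceptron-style rule for a single output unit: the couplings are updated until the local field of every stored grey-toned pattern lies within an interval of finite half-width κ around its target level. The number of grey levels Q, the interval width kappa, the network sizes, and all variable names are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (not the paper's algorithm): drive the local field
# h_mu = J . xi_mu / sqrt(N) of each stored grey-toned pattern into an
# interval of half-width kappa around its target grey level s_mu.
# Q, kappa, N, P are assumed values chosen for the example.

rng = np.random.default_rng(0)

N, P, Q = 100, 50, 4                 # neurons, patterns, number of grey levels
levels = np.linspace(-1.0, 1.0, Q)   # equidistant grey tones in [-1, 1]

xi = rng.choice(levels, size=(P, N))  # stored multi-state patterns
s = rng.choice(levels, size=P)        # target output level for each pattern
kappa = 0.1                           # half-width of the allowed field interval

J = np.zeros(N)                       # couplings of a single output unit

for epoch in range(1000):
    h = xi @ J / np.sqrt(N)           # local fields of all patterns
    err = h - s                       # deviation from the target level
    worst = np.argmax(np.abs(err))    # pattern violating its interval the most
    if np.abs(err[worst]) <= kappa:   # all fields inside their intervals: done
        break
    # Hebb-like correction pushing the worst field toward its interval
    J -= np.sign(err[worst]) * xi[worst] / np.sqrt(N)

print("remaining max deviation:", np.max(np.abs(xi @ J / np.sqrt(N) - s)))
```

Updating the currently worst-satisfied pattern is in the spirit of Minover-type rules for maximizing stability; the paper's adaptive algorithm pursues the same goal in the multi-state setting, but its specific update is not reproduced here.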