Reducing neuron gain to eliminate fixed-point attractors in an analog associative memory
- 1 March 1991
- journal article
- research article
- Published by American Physical Society (APS) in Physical Review A
- Vol. 43 (6), 3131-3142
- https://doi.org/10.1103/physreva.43.3131
Abstract
We show analytically that the expected number of fixed-point attractors in an analog associative memory neural network increases exponentially with network size, with a scaling exponent that depends on the ratio of stored memories to neurons and on the maximum slope, or gain, of the neuron transfer function. The scaling exponent decreases with gain for a sigmoidal transfer function, indicating that gain reduction can improve computational performance by eliminating spurious fixed points. Numerical data based on fixed-point counts in small networks support the analytical results.
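To make the setup concrete, the following is a minimal numerical sketch of the kind of fixed-point counting experiment the abstract describes, assuming a standard analog associative-memory model: Hebbian couplings built from random ±1 patterns and graded-response dynamics m_i ← tanh(gain · Σ_j J_ij m_j). The network size, loading ratio, gain values, and counting-by-rounding procedure below are illustrative assumptions, not the paper's actual parameters or method.

```python
import numpy as np

rng = np.random.default_rng(0)

def hebbian_couplings(patterns):
    """Hebbian couplings J = (1/N) * xi^T xi with zero self-coupling (assumed model)."""
    n = patterns.shape[1]
    j = patterns.T @ patterns / n
    np.fill_diagonal(j, 0.0)
    return j

def count_fixed_points(j, gain, n_trials=2000, n_iter=500, tol=1e-8):
    """Iterate m <- tanh(gain * J m) from random starts; count distinct converged states."""
    n = j.shape[0]
    found = set()
    for _ in range(n_trials):
        m = rng.uniform(-1.0, 1.0, size=n)
        for _ in range(n_iter):
            m_new = np.tanh(gain * (j @ m))
            if np.max(np.abs(m_new - m)) < tol:
                # Converged: treat states that agree up to rounding as one fixed point.
                found.add(tuple(np.round(m_new, 3)))
                break
            m = m_new
    return len(found)

# Small network, loading ratio alpha = P/N = 0.2 (illustrative values only).
n_neurons, n_patterns = 20, 4
patterns = rng.choice([-1.0, 1.0], size=(n_patterns, n_neurons))
j = hebbian_couplings(patterns)

# High gain: many spurious fixed points; low gain: far fewer survive.
for gain in (10.0, 1.5):
    print(f"gain = {gain:4.1f}: ~{count_fixed_points(j, gain)} distinct fixed points")
```

In this sketch, lowering the gain of the tanh transfer function smooths the dynamics and typically leaves far fewer distinct attractors, which is the qualitative effect the paper analyzes; the analytical scaling with network size is not reproduced here.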