On learning rules and memory storage abilities of asymmetrical neural networks
- 1 January 1988
- journal article
- Published by EDP Sciences in Journal de Physique
- Vol. 49 (5), 711-726
- https://doi.org/10.1051/jphys:01988004905071100
Abstract
Most models of memory proposed so far use symmetric synapses. We show that this assumption is not necessary for a neural network to display memory abilities. We present an analytical derivation of memory capacities which does not appeal to the replica technique; it uses only a more transparent and straightforward mean-field approximation. The memorization efficiency depends on four learning parameters which, if the case arises, can be related to data provided by experiments carried out on real synapses. We show that the learning rules observed so far are fully compatible with memorization capacities.
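The abstract's central claim, that symmetric synapses are not required for pattern retrieval, can be illustrated with a toy simulation. The sketch below is not the paper's model: it takes a standard Hopfield-style outer-product learning rule and breaks the symmetry of the coupling matrix by randomly deleting directed connections, then checks that a noisy version of a stored pattern still relaxes to high overlap with that pattern. All parameter values (network size, number of patterns, dilution fraction, noise level) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 5  # neurons and stored patterns (low load, alpha = P/N = 0.025)

# Random binary patterns xi in {-1, +1}
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian outer-product couplings, then asymmetric dilution:
# each directed synapse J_ij is deleted independently, so in general
# J_ij != J_ji and the matrix is no longer symmetric.
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0.0)
dilution_mask = rng.random((N, N)) < 0.3  # drop ~30% of directed synapses
J[dilution_mask] = 0.0

def recall(state, steps=20):
    """Parallel (synchronous) deterministic updates: s_i <- sign(sum_j J_ij s_j)."""
    s = state.copy()
    for _ in range(steps):
        s = np.where(J @ s >= 0, 1, -1)
    return s

# Start from pattern 0 with ~10% of spins flipped, and measure the overlap
# m = (1/N) sum_i s_i xi_i after the dynamics has run.
noisy = patterns[0] * np.where(rng.random(N) < 0.1, -1, 1)
fixed = recall(noisy)
overlap = fixed @ patterns[0] / N
print(f"overlap with stored pattern: {overlap:.2f}")
```

At this low storage load the diluted, asymmetric network still retrieves the pattern with overlap close to 1, which is the qualitative point of the abstract; the paper's actual analysis derives the capacity analytically via a mean-field approximation rather than by simulation.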
This publication has 9 references indexed in Scilit:
- Statistical mechanics of neural networks near saturation. Published by Elsevier, 2004
- Temporal Association in Asymmetric Neural Networks. Physical Review Letters, 1986
- Saturation Level of the Hopfield Model for Neural Network. Europhysics Letters, 1986
- Remanent magnetization of the infinite-range Ising spin glass. Physical Review B, 1986
- SK Model: The Replica Solution without Replicas. Europhysics Letters, 1986
- Stochastic Dynamics of Neural Networks. IEEE Transactions on Systems, Man, and Cybernetics, 1986
- Storing Infinite Numbers of Patterns in a Spin-Glass Model of Neural Networks. Physical Review Letters, 1985
- Neural networks and physical systems with emergent collective computational abilities. Proceedings of the National Academy of Sciences, 1982
- The effects of early visual experience on the cat's visual cortex and their possible explanation by Hebb synapses. The Journal of Physiology, 1981