Partially connected models of neural networks
- 7 August 1988
- journal article
- Published by IOP Publishing in Journal of Physics A: General Physics
- Vol. 21 (15), 3275-3284
- https://doi.org/10.1088/0305-4470/21/15/016
Abstract
A partially connected Hopfield neural network model is studied under the restriction that w, the ratio of connections per site to the size of the system, remains finite as the size N tends to infinity, with the connection structure being the same at each site. The replica-symmetric mean-field equations for the order parameters are derived. The zero-temperature forms of these equations are then solved numerically for a few different 'local' connectivity architectures, showing phase transitions at different critical storage ratios, alpha_c, at which the states the authors are trying to store in the network become discontinuously unstable. They show that the information capacity per connection improves for partially connected systems.
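The setup described in the abstract can be illustrated with a minimal numerical sketch: a diluted Hopfield network with Hebbian couplings restricted to a 'local' ring connectivity that is identical at every site, relaxed under zero-temperature dynamics. All parameter values below (N, C, P) and the specific ring architecture are illustrative assumptions, not the configurations studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200  # system size (assumed value for illustration)
C = 25   # neighbours on each side -> 2*C connections per site, w = 2*C/N finite
P = 3    # stored patterns; storage ratio per connection is small here

# Random +/-1 patterns to store
patterns = rng.choice([-1, 1], size=(P, N))

# Same 'local' connection structure at each site: neuron i couples to its
# C nearest neighbours on either side of a ring (an assumed architecture).
mask = np.zeros((N, N), dtype=bool)
idx = np.arange(N)
for k in range(1, C + 1):
    mask[idx, (idx + k) % N] = True
mask |= mask.T  # symmetrize

# Hebbian couplings, kept only on connected pairs
J = (patterns.T @ patterns).astype(float)
J[~mask] = 0.0
np.fill_diagonal(J, 0.0)

def relax(s, sweeps=20):
    """Zero-temperature asynchronous dynamics until (approximately) a fixed point."""
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            h = J[i] @ s
            if h != 0:
                s[i] = 1 if h > 0 else -1
    return s

# Start from a stored pattern and measure the retrieval overlap m
s = relax(patterns[0])
m = (s @ patterns[0]) / N
print(f"overlap with stored pattern: {m:.3f}")
```

Well below the critical storage ratio the stored pattern remains a stable attractor, so the overlap stays close to 1; increasing P at fixed connectivity would eventually make retrieval fail discontinuously, which is the transition at alpha_c that the paper locates for various architectures.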