Partially connected models of neural networks

Abstract
A partially connected Hopfield neural network model is studied under the restriction that w, the ratio of connections per site to the size of the system, remains finite as the size N tends to infinity, with the connection structure at each site being the same. The replica-symmetric mean-field theory equations for the order parameters are derived. The zero-temperature forms of these equations are then solved numerically for a few different 'local' connectivity architectures, showing phase transitions at different critical storage ratios, α_c, at which the states the authors are trying to store in the network become discontinuously unstable. The authors show that the information capacity per connection improves for partially connected systems.
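The sketch below is not the authors' replica calculation; it is a minimal numerical illustration, under assumed conventions, of the model the abstract describes: a Hopfield network in which every site has the same local connection structure (here, a ring neighbourhood chosen for concreteness), Hebbian couplings restricted to the existing bonds, and deterministic zero-temperature dynamics used to probe whether a stored pattern remains stable. The parameter values, the definition α = p/C, and the ring architecture are all illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of a partially connected (diluted) Hopfield network.
# Assumptions (not from the paper): alpha = p / C, ring-neighbour connectivity,
# and the specific parameter values below.
import numpy as np

rng = np.random.default_rng(0)

N = 400          # system size
w = 0.25         # ratio of connections per site to system size
C = int(w * N)   # connections per site
alpha = 0.1      # assumed storage ratio p / C
p = max(1, int(alpha * C))

# Identical local architecture at every site: neuron i connects to its
# C nearest neighbours on a ring (one possible 'local' connectivity).
mask = np.zeros((N, N), dtype=bool)
idx = np.arange(N)
for k in range(1, C // 2 + 1):
    mask[idx, (idx + k) % N] = True
    mask[idx, (idx - k) % N] = True

# Hebbian couplings restricted to the connected pairs.
xi = rng.choice([-1, 1], size=(p, N))   # p random +/-1 patterns
J = (xi.T @ xi) / C
J *= mask                               # keep only existing bonds
np.fill_diagonal(J, 0.0)

# Zero-temperature (deterministic) asynchronous dynamics, started at pattern 0.
s = xi[0].copy()
for _ in range(20):                     # a few update sweeps
    for i in rng.permutation(N):
        h = J[i] @ s                    # local field at site i
        if h != 0:
            s[i] = np.sign(h)

# Overlap with the stored pattern; it drops sharply once p / C exceeds
# the critical storage ratio for the chosen architecture.
overlap = (s @ xi[0]) / N
print(f"overlap with stored pattern: {overlap:.3f}")
```

Sweeping alpha upward and repeating the retrieval test gives a rough numerical picture of the discontinuous loss of stability that the abstract attributes to the critical storage ratio α_c.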
