Increase in Complexity in Random Neural Networks
- 1 March 1995
- journal article
- Published by EDP Sciences in Journal de Physique I
- Vol. 5 (3), 409-432
- https://doi.org/10.1051/jp1:1995135
Abstract
We study the dynamics of a discrete-time, continuous-state neural network with random asymmetric couplings and random thresholds. In the thermodynamic limit, the evolution of the neurons is given by a set of dynamic mean-field equations obtained using a local chaos hypothesis. We study the evolution of the mean quadratic distance between two trajectories and show that there exist two different regimes, depending on the values of the control parameters. In the first (static regime), two initially close trajectories evolve to the same fixed point, while in the second (chaotic regime) they diverge at an exponential rate and evolve to a constant, nonzero distance. The critical condition for the transition is obtained in a general framework, but in a specific case we recover the equation for the de Almeida-Thouless line, suggesting a strong analogy with the SK model. Moreover, the limit of the quadratic distance is the same for every choice of initial conditions, showing that ultrametricity occurs in our model. However, we show numerically that this property is not associated with a complex breaking up of the phase space as in the SK model. Furthermore, the quenched stochastic process governing the evolution of the neurons is a white noise in the thermodynamic limit. The behaviour of our model when crossing the AT line can be characterized by studying the Kolmogorov-Sinai entropy, which exhibits a sharp transition in the thermodynamic limit: this entropy is zero in the static phase, while it becomes infinite in the chaotic regime.
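As an illustration of the two-trajectory experiment described in the abstract, the following is a minimal simulation sketch. The tanh transfer function, the gain `g`, and the Gaussian distributions for the couplings and thresholds are assumptions made for concreteness; they are not specifics taken from the paper.

```python
import numpy as np

# Assumed update rule (illustrative, not the paper's exact specification):
#   x_i(t+1) = tanh(g * sum_j J_ij x_j(t) - theta_i)
# with asymmetric Gaussian couplings J_ij ~ N(0, 1/N) and Gaussian thresholds.

rng = np.random.default_rng(0)
N = 1000          # number of neurons
g = 4.0           # gain; a large gain is expected to land in the chaotic regime
T = 200           # number of discrete time steps
eps = 1e-6        # initial separation of the two trajectories

J = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))  # random asymmetric couplings
theta = rng.normal(0.0, 1.0, size=N)                # random thresholds

x = rng.uniform(-1.0, 1.0, size=N)       # first trajectory
y = x + eps * rng.normal(size=N)         # second, initially close trajectory

d = np.empty(T)
for t in range(T):
    x = np.tanh(g * (J @ x) - theta)
    y = np.tanh(g * (J @ y) - theta)
    d[t] = np.mean((x - y) ** 2)         # mean quadratic distance between trajectories

# Expected behaviour: in the static regime d(t) -> 0 (both trajectories reach
# the same fixed point); in the chaotic regime d(t) grows exponentially before
# saturating at a constant, nonzero value.
print(d[::20])
```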