Attractive Periodic Sets in Discrete-Time Recurrent Networks (with Emphasis on Fixed-Point Stability and Bifurcations in Two-Neuron Networks)
Open Access
- 1 June 2001
- journal article
- Published by MIT Press in Neural Computation
- Vol. 13 (6), 1379-1414
- https://doi.org/10.1162/08997660152002898
Abstract
We perform a detailed fixed-point analysis of two-unit recurrent neural networks with sigmoid-shaped transfer functions. Using geometrical arguments in the space of transfer function derivatives, we partition the network state-space into distinct regions corresponding to stability types of the fixed points. Unlike previous studies, we do not assume any special form of connectivity pattern between the neurons, and all free parameters are allowed to vary. We also prove that when both neurons have excitatory self-connections and the mutual interaction pattern is the same (i.e., the neurons either mutually excite or mutually inhibit each other), new attractive fixed points are created through a saddle-node bifurcation. Finally, for an N-neuron recurrent network, we give lower bounds on the rate of convergence of attractive periodic points toward the saturation values of neuron activations as the absolute values of connection weights grow.
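The setting described in the abstract can be illustrated with a minimal numerical sketch. This is not code from the paper: the logistic transfer function, the update equation x(t+1) = sigma(W x(t) + b), and the weight and bias values below are assumptions chosen for illustration. The sketch iterates a two-neuron map to locate a fixed point and then checks the spectral radius of the Jacobian, the quantity whose behavior (via the transfer function derivatives) underlies the stability classification discussed above.

```python
import numpy as np

# Minimal sketch (not the paper's code): a two-neuron discrete-time
# recurrent network x(t+1) = sigma(W x(t) + b) with a logistic
# ("sigmoid-shaped") transfer function. W and b are illustrative values.

def sigma(u):
    return 1.0 / (1.0 + np.exp(-u))

def step(x, W, b):
    return sigma(W @ x + b)

def jacobian(x, W, b):
    # d sigma(u)/du = sigma(u) * (1 - sigma(u)); chain rule with W.
    s = sigma(W @ x + b)
    return np.diag(s * (1.0 - s)) @ W

# Both self-connections excitatory and the mutual interaction pattern the
# same (here: mutual excitation) -- the regime discussed in the abstract.
W = np.array([[6.0, 2.0],
              [2.0, 6.0]])
b = np.array([-4.0, -4.0])

# Locate a fixed point by iterating the map from an initial state.
x = np.array([0.9, 0.9])
for _ in range(500):
    x = step(x, W, b)

rho = max(abs(np.linalg.eigvals(jacobian(x, W, b))))
print("approximate fixed point:", x)
print("spectral radius of Jacobian:", rho)  # < 1 => attractive fixed point
```

With these (assumed) large weights the iterate saturates near (1, 1) and the spectral radius stays well below 1, consistent with the abstract's statement that attractive points approach the saturation values of the neuron activations as the absolute values of the connection weights grow.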