Analysis and synthesis of a class of discrete-time neural networks described on hypercubes
- 1 January 1991
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 2 (1), 32-46
- https://doi.org/10.1109/72.80289
Abstract
A qualitative analysis is presented for a class of synchronous discrete-time neural networks defined on hypercubes in the state space. The analysis results are used to establish a design procedure for associative memories implemented on this class of neural networks. Several specific examples are considered to demonstrate the storage ability and flexibility of the synthesis procedure. The design procedure has essentially the same desirable features as the results of J. Li et al. (1988, 1989) for continuous-time neural networks. For a given system dimension, networks designed by the present method may be able to store more patterns (as asymptotically stable equilibria) than corresponding discrete-time networks designed by other techniques. The design method guarantees the storage of all the desired patterns as asymptotically stable equilibrium points. The present method provides guidelines for reducing the number of spurious states and for estimating the extent of the patterns' domains of attraction. The present results provide a means of implementing neural networks by serial processors and special digital hardware.
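To give a concrete feel for the class of networks the abstract refers to, the sketch below iterates a synchronous discrete-time state equation of the form x(k+1) = sat(x(k) + W x(k)) on the hypercube [-1, 1]^n and stores bipolar patterns with a simple outer-product (Hebbian-style) weight rule. Both the update form and the weight rule are illustrative assumptions, not the synthesis procedure of the paper, which additionally guarantees asymptotic stability of every stored pattern and addresses spurious states and domains of attraction.

```python
import numpy as np

# Illustrative sketch only: a synchronous discrete-time network whose state
# evolves on the hypercube [-1, 1]^n, with an assumed outer-product
# (Hebbian-style) weight rule rather than the paper's synthesis procedure.

def saturate(x):
    """Piecewise-linear saturation keeping the state inside [-1, 1]^n."""
    return np.clip(x, -1.0, 1.0)

def outer_product_weights(patterns):
    """Hebbian-style interconnection matrix for bipolar patterns (one per row)."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)      # common convention: no self-connections
    return W

def recall(W, x0, steps=50):
    """Synchronous iteration x(k+1) = sat(x(k) + W x(k)); stop at a fixed point."""
    x = x0.astype(float)
    for _ in range(steps):
        x_next = saturate(x + W @ x)
        if np.allclose(x_next, x):
            break
        x = x_next
    return x

if __name__ == "__main__":
    # Two orthogonal bipolar patterns in {-1, 1}^8.
    patterns = np.array([[1, 1, 1, 1, -1, -1, -1, -1],
                         [1, -1, 1, -1, 1, -1, 1, -1]], dtype=float)
    W = outer_product_weights(patterns)

    probe = patterns[0].copy()
    probe[0] = -probe[0]          # corrupt one component of the first pattern
    print(np.array_equal(recall(W, probe), patterns[0]))   # expected: True
```

With the two orthogonal patterns used in the demo, each stored pattern is a fixed point of the iteration, and a probe with one corrupted component is driven back to the stored vertex within a few synchronous steps.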
References
- A synthesis procedure for Hopfield's continuous-time associative memory, IEEE Transactions on Circuits and Systems, 1990
- Analysis and synthesis of a class of neural networks: linear systems operating on a closed hypercube, IEEE Transactions on Circuits and Systems, 1989
- Analysis and synthesis of a class of neural networks: variable structure systems with infinite gain, IEEE Transactions on Circuits and Systems, 1989
- Qualitative analysis and synthesis of a class of neural networks, IEEE Transactions on Circuits and Systems, 1988
- Collective computational properties of neural networks: New learning mechanisms, Physical Review A, 1986
- Simple 'neural' optimization networks: An A/D converter, signal decision circuit, and a linear programming circuit, IEEE Transactions on Circuits and Systems, 1986
- Information capacity of the Hopfield model, IEEE Transactions on Information Theory, 1985
- Information storage and retrieval in spin-glass like neural networks, Journal de Physique Lettres, 1985
- Neurons with graded response have collective computational properties like those of two-state neurons, Proceedings of the National Academy of Sciences, 1984
- Neural networks and physical systems with emergent collective computational abilities, Proceedings of the National Academy of Sciences, 1982