Information theory, complexity and neural networks
- 1 November 1989
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Communications Magazine
- Vol. 27 (11), 25-28
- https://doi.org/10.1109/35.41397
Abstract
Some of the main results in the mathematical evaluation of neural networks as information processing systems are discussed. The basic operation of feedback and feed-forward neural networks is described. Their memory capacity and computing power are considered. The concept of learning by example as it applies to neural networks is examined.
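The abstract's mention of feedback networks and their memory capacity refers to the associative-memory setting studied in the Hopfield model. The sketch below is a minimal, illustrative Hopfield-style feedback memory using Hebbian outer-product storage; it is not code from the article, and the names `store` and `recall` are my own. Stored patterns become fixed points of the dynamics, and the number of patterns reliably stored grows only linearly in the number of neurons, which is the capacity question the article surveys.

```python
import numpy as np

# Minimal Hopfield-style feedback associative memory (an illustrative
# sketch, not the article's own code). Patterns are +/-1 vectors stored
# with the Hebbian outer-product rule; recall iterates the feedback
# dynamics s <- sign(W s) until a fixed point is reached.

def store(patterns):
    """Build the weight matrix from a stack of +/-1 pattern vectors."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W

def recall(W, probe, max_iters=100):
    """Run the feedback dynamics from a (possibly corrupted) probe."""
    s = probe.copy()
    for _ in range(max_iters):
        s_new = np.sign(W @ s)
        s_new[s_new == 0] = 1  # break sign ties deterministically
        if np.array_equal(s_new, s):
            break  # fixed point reached: a stored (or spurious) memory
        s = s_new
    return s

rng = np.random.default_rng(0)
n, m = 100, 10  # 10 patterns in a 100-neuron network, well below capacity
patterns = rng.choice([-1.0, 1.0], size=(m, n))
W = store(patterns)

# Corrupt 10% of one pattern's bits and let the feedback dynamics clean it up.
probe = patterns[0].copy()
flip = rng.choice(n, size=10, replace=False)
probe[flip] *= -1
print("recovered:", np.array_equal(recall(W, probe), patterns[0]))
```

With the load kept well below the network size, the corrupted probe converges back to the stored pattern; pushing the number of patterns toward a constant fraction of the number of neurons degrades recall, which matches the linear-capacity results the article discusses.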