Recognition rates of the Hebb rule for learning Boolean functions
- 1 March 1990
- journal article
- research article
- Published by American Physical Society (APS) in Physical Review A
- Vol. 41 (6), 3059-3065
- https://doi.org/10.1103/physreva.41.3059
Abstract
We study the Hebb rule for learning several Boolean functions (random and linearly separable functions) defined on the hypercube of dimension N. Learning and generalization rates are derived in the N→∞ limit as functions of α=P/N, where P is the number of learned patterns. In the linearly separable case, the generalization rate grows monotonically from 1/2 to 1, whereas the learning rate first decreases from 1 to a minimum value and then increases again towards 1. This result is interpreted as an interference phenomenon, as in the learning of associative memories implemented with the same rule. Comparisons are then made with the cases of random Boolean functions, associative memories, and their clipped versions. The behavior of the Hebb rule is decomposed into two distinct contributions, referred to as "rote" and "conceptual" learning. Illustrative numerical simulations are given.
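The setup described in the abstract can be illustrated with a minimal simulation sketch. The snippet below is not the authors' code; it assumes a random linearly separable teacher f(x) = sign(B·x), Hebbian weights summed over P training patterns on the {-1,+1}^N hypercube, and illustrative values of N, P, and the test-set size, then estimates the learning rate (recognition of the trained patterns) and the generalization rate (agreement with the teacher on fresh patterns) at a given α = P/N.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not taken from the paper); alpha = P / N
N, P, n_test = 200, 400, 2000

# Teacher: a random linearly separable Boolean function f(x) = sign(B . x)
B = rng.standard_normal(N)
teacher = lambda X: np.sign(X @ B)

# Training patterns drawn from the {-1, +1}^N hypercube
X_train = rng.choice([-1.0, 1.0], size=(P, N))
y_train = teacher(X_train)

# Hebb rule: w = sum_mu y_mu * x_mu (no error correction)
w = y_train @ X_train
student = lambda X: np.sign(X @ w)

# Learning rate: fraction of trained patterns the student reproduces
learning_rate = np.mean(student(X_train) == y_train)

# Generalization rate: agreement with the teacher on unseen patterns
X_test = rng.choice([-1.0, 1.0], size=(n_test, N))
generalization_rate = np.mean(student(X_test) == teacher(X_test))

print(f"alpha = {P / N:.1f}")
print(f"learning rate       = {learning_rate:.3f}")
print(f"generalization rate = {generalization_rate:.3f}")
```

Sweeping P (and hence α) in such a sketch is one way to visualize the behavior the abstract reports: the generalization rate rising from 1/2 towards 1, while the learning rate dips below 1 at intermediate α before recovering.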