Probabilistic analysis of a learning matrix
- 1 December 1988
- journal article
- Published by Cambridge University Press (CUP) in Advances in Applied Probability
- Vol. 20 (4), 695-705
- https://doi.org/10.2307/1427355
Abstract
A learning matrix is defined by a set of input and output pattern vectors. The entries in these vectors are zeros and ones. The matrix is the maximum of the outer products of the input and output pattern vectors, so its entries are also zeros and ones. The product of this matrix with a selected input pattern vector defines an activity vector. It is shown that when the patterns are taken to be random, there are central limit and large deviation theorems for the activity vector. These give conditions under which the activity vector may be used to reconstruct the output pattern vector corresponding to the selected input pattern vector.
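The construction described in the abstract can be sketched in a few lines of numpy. This is a minimal illustration, not the paper's notation: the function and variable names are invented here, and the exact-threshold recall rule (an output unit fires when its activity equals the number of ones in the selected input) is the standard rule for this kind of binary associative memory, assumed rather than taken from the paper.

```python
import numpy as np

def learning_matrix(inputs, outputs):
    # The matrix is the elementwise maximum (logical OR) of the outer
    # products y_k x_k^T over all stored pattern pairs (x_k, y_k).
    M = np.zeros((outputs.shape[1], inputs.shape[1]), dtype=int)
    for x, y in zip(inputs, outputs):
        M = np.maximum(M, np.outer(y, x))
    return M

def activity(M, x):
    # The activity vector is the ordinary matrix-vector product of the
    # 0-1 matrix with the selected input pattern.
    return M @ x

# Two random 0-1 pattern pairs (sizes chosen arbitrarily for illustration).
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(2, 8))   # input patterns, length 8
Y = rng.integers(0, 2, size=(2, 6))   # output patterns, length 6
M = learning_matrix(X, Y)

# Recall: a unit of the reconstructed output is set to 1 when its
# activity reaches the number of ones in the selected input pattern.
x = X[0]
a = activity(M, x)
y_hat = (a >= x.sum()).astype(int)
```

With this rule, every unit that is 1 in the true output pattern always fires; the probabilistic analysis in the paper concerns the spurious extra ones, whose count is controlled by the central limit and large deviation theorems for the activity vector.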
This publication has 8 references indexed in Scilit:
- Asymptotic independence and limit theorems for positively and negatively dependent random variables. Published by Institute of Mathematical Statistics, 1984
- A Berry-Esseen Theorem for Associated Random Variables. The Annals of Probability, 1983
- Nearest neighbors and Voronoi regions in certain point processes. Advances in Applied Probability, 1983
- An Invariance Principle for Certain Dependent Sequences. The Annals of Probability, 1981
- On the storage capacity of an associative memory with randomly distributed storage elements. Biological Cybernetics, 1981
- Normal fluctuations and the FKG inequalities. Communications in Mathematical Physics, 1980
- On associative memory. Biological Cybernetics, 1980
- Die Lernmatrix. Biological Cybernetics, 1961