Unsupervised dynamic learning in layered neural networks

Abstract
The authors consider a stochastic two-layer neural network of binary neurons in which the connections between the layers are updated according to the Hebb rule, whereas the lateral connections in the output layer are modified according to an anti-Hebb rule. In equilibrium the output overlap is found to be a linear transformation of the input overlap. Next they extend the model by treating learning as a dynamic process, so that synaptic efficacies as well as neuronal states may vary in time. Despite the coupling of these two sets of variables, they show that the behaviour of this particular model can be analysed in detail. It turns out that the network filters the information available at the input in such a way that important components of the input data pass through, whereas components with a low information content are suppressed.
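The combination of rules described above can be illustrated with a minimal sketch: feedforward weights grow with input–output correlations (Hebb), while lateral weights in the output layer shrink with output–output correlations (anti-Hebb), decorrelating the output neurons. The layer sizes, learning rates, and the exact form of the update and the output dynamics below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N_IN, N_OUT = 100, 20          # layer sizes (illustrative)
ETA_HEBB, ETA_ANTI = 0.01, 0.01  # learning rates (assumed)

# Feedforward (interlayer) weights and lateral weights in the output layer.
W = rng.normal(0.0, 1.0 / np.sqrt(N_IN), size=(N_OUT, N_IN))
L = np.zeros((N_OUT, N_OUT))   # no self-coupling on the diagonal


def step(s_in, W, L):
    """One learning step: compute binary output states, then update both weight sets."""
    # Binary (+/-1) output states from the local field, including lateral input.
    h = W @ s_in
    s_out = np.sign(h + L @ np.sign(h))
    s_out[s_out == 0] = 1.0
    # Hebb rule on the interlayer connections: strengthen co-active pairs.
    W_new = W + ETA_HEBB * np.outer(s_out, s_in)
    # Anti-Hebb rule on the lateral connections: suppress output correlations.
    L_new = L - ETA_ANTI * np.outer(s_out, s_out)
    np.fill_diagonal(L_new, 0.0)
    return s_out, W_new, L_new


# Drive the network with random binary input patterns.
for _ in range(50):
    s_in = np.sign(rng.normal(size=N_IN))
    s_out, W, L = step(s_in, W, L)
```

The anti-Hebbian term makes the lateral matrix increasingly inhibitory between outputs that fire together, which is one way the suppression of low-information components described in the abstract can arise.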