Abstract
An all-inhibitory network which learns by selective disconnection of synapses is described. It is similar to an ‘associative net’; however, it is simpler in that its neurons need not perform arithmetical operations, and the net requires no additional threshold-modulating neurons to cope with input patterns that are incomplete or of differing sizes. This fundamental simplicity permits a greater variety and density of connections, which can multiply the capacity of the nets to learn complex sequences of patterns without becoming saturated. An ‘all-connected’ net is described which has the holograph-like capacity to reconstruct the whole of an input pattern from part-patterns, without involving delays or threshold devices. All of these inhibitory nets can construct themselves by means of simple random growth processes, without incurring any loss of learning capacity or holographic properties. Similarly, synapses can be allowed to potentiate with use, so that reaction times are progressively reduced by practice, without any reduction in the quality of the performance. Inhibitory connections between arrays can give patterns in one array control over the allocation of the channels in which lower arrays store learned information. A description is given of a model decentralised inhibitory hierarchy, consisting of inter-connected arrays, which can learn to execute goal-directed TOTE-type programs of behaviour by means of a simple ‘putting-through’ procedure.
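The core mechanism described above can be sketched in code. This is an illustrative toy model only, not the paper's own formulation: it assumes a Willshaw-style binary net recast as purely inhibitory, where every input unit initially inhibits every output unit, learning consists solely of deleting inhibitory synapses, and an output fires only when it receives no inhibition from the active inputs. All names and sizes here are hypothetical.

```python
# Illustrative sketch (assumed details, not the paper's code): an all-inhibitory
# associative net that learns by selective disconnection of synapses.
N_IN, N_OUT = 16, 16

# inhib[i][j] is True while input unit i still has an inhibitory
# synapse onto output unit j; the net starts fully connected.
inhib = [[True] * N_OUT for _ in range(N_IN)]

def learn(inputs, outputs):
    """Store a pattern pair by disconnecting the inhibitory synapses
    from each active input to each active output."""
    for i in inputs:
        for j in outputs:
            inhib[i][j] = False

def recall(inputs):
    """An output fires iff no active input still inhibits it.
    Because firing requires zero inhibition rather than a summed
    total crossing a threshold, no arithmetic is performed and an
    incomplete or smaller cue needs no threshold adjustment."""
    return [j for j in range(N_OUT)
            if not any(inhib[i][j] for i in inputs)]

learn([0, 3, 7, 12], [2, 5, 9])
print(recall([0, 3, 7, 12]))  # [2, 5, 9]
print(recall([0, 3]))         # partial cue still recovers [2, 5, 9]
```

Note how the absence of summation and thresholds falls directly out of the disconnection rule: a learned output is one from which every active input's inhibition has been removed, a condition that holds equally for full and partial cues.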
