Hebbian plasticity in MOS synapses

Abstract
Hebbian learning in analogue CMOS synapses is obtained by using the transistor characteristics to approximate the multiplicative correlation of neural signals. In situ analogue learning is employed, which means that computations of synaptic weight changes occur continuously during the normal operation of the artificial neural network. The transistor complexity of a synapse is minimised by departing from strict adherence to classical multiplicative rules; learning remains consistent, however, with the original qualitative statement of Hebb. Simulations of circuits with three transistors per synapse in the case of unipolar weights suggest that appropriate learning and forgetting behaviour is obtained at the synaptic level by adopting these area-efficient MOS learning rules in lieu of classical analytical formulations. The theory at the systems level corresponding to these learning rules has not yet been developed.
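The abstract contrasts the classical multiplicative Hebbian rule with a simplified, area-efficient variant that still captures Hebb's qualitative statement (strengthen a synapse when pre- and post-synaptic activity coincide, otherwise let it decay). A minimal sketch of that contrast is given below; the threshold, increment, and decay parameters are illustrative assumptions, not the paper's transistor-level equations:

```python
def hebbian_update(w, x, y, eta=0.01):
    """Classical multiplicative rule: dw = eta * x * y."""
    return w + eta * x * y

def simplified_update(w, x, y, theta=0.5, delta=0.01, decay=0.001):
    """Hypothetical area-efficient variant: add a fixed increment when
    both pre-synaptic (x) and post-synaptic (y) activity exceed a
    threshold (coincidence detected), otherwise decay slowly
    (forgetting). The weight is clipped to a unipolar range [0, 1],
    matching the unipolar-weight case mentioned in the abstract."""
    if x > theta and y > theta:
        w = w + delta
    else:
        w = w - decay
    return min(max(w, 0.0), 1.0)
```

The simplified rule replaces the analogue multiplication with a coincidence test plus fixed-step learning and forgetting, which is the kind of departure from strict multiplicative formulations that keeps per-synapse transistor counts low.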
