An electrically trainable artificial neural network (ETANN) with 10240 'floating gate' synapses

Abstract
The use of floating-gate nonvolatile memory technology for analog storage of connection strengths, or weights, has previously been proposed and demonstrated. The authors report the analog storage and multiply characteristics of a new floating-gate synapse and further discuss the architecture of a neural network which uses this synapse cell. In the architecture described, 8192 synapses are used to interconnect 64 neurons fully and to connect the 64 neurons to each of 64 inputs. Each synapse in the network multiplies a signed analog voltage by a stored weight and generates a differential current proportional to the product. Differential currents are summed on a pair of bit lines and transferred through a sigmoid function, appearing at the neuron output as an analog voltage. Input and output levels are compatible for ease in cascade-connecting these devices into multilayer networks. The width and height of the weight-change pulses are calculated. The synapse cell size is 2009 μm² using 1-μm CMOS EEPROM technology.
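To make the described signal path concrete, the following is a minimal behavioral sketch in Python (NumPy) of the stages the abstract names: a per-synapse signed multiply producing a differential current, summation of the differential components on a pair of bit lines, and a sigmoid that yields the neuron's analog output. The `gain` parameter and the signal ranges used here are illustrative assumptions, not values from the paper.

```python
import numpy as np

def sigmoid(x):
    # Neuron transfer function: saturating sigmoid mapping the summed
    # bit-line current difference to an analog output voltage.
    return 1.0 / (1.0 + np.exp(-x))

def etann_layer(v_in, w, gain=1.0):
    """One fully connected ETANN layer (behavioral model only).

    v_in : (n_in,) signed analog input voltages
    w    : (n_out, n_in) analog weights, as stored on floating gates
    gain : assumed scale factor converting summed current to the
           sigmoid's input range (not specified in the abstract)
    """
    # Each synapse produces a differential current proportional to the
    # product of its input voltage and stored weight; splitting each
    # product into positive and negative parts mimics the two bit
    # lines that carry the differential components.
    products = w * v_in                            # per-synapse products
    i_plus = np.where(products > 0, products, 0.0).sum(axis=1)
    i_minus = np.where(products < 0, -products, 0.0).sum(axis=1)
    # The neuron senses the difference of the two bit-line currents
    # and squashes it through the sigmoid to an analog output voltage.
    return sigmoid(gain * (i_plus - i_minus))

# Cascading: because output levels are input-compatible, layers chain
# directly into a multilayer network (64 inputs, 64 neurons here).
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 64)                 # 64 analog inputs (assumed range)
w1 = rng.uniform(-1, 1, (64, 64))          # input-to-neuron weights
w2 = rng.uniform(-1, 1, (64, 64))          # neuron-to-neuron weights
y = etann_layer(etann_layer(x, w1), w2)    # two cascaded layers
print(y.shape)                             # (64,)
```

Note that summing the positive and negative current components separately and then differencing them is mathematically identical to a plain dot product; the split is kept here only to mirror the two-bit-line structure of the hardware.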
