An electrically trainable artificial neural network (ETANN) with 10240 'floating gate' synapses
- 1 January 1989
- conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE)
- pp. 191-196, vol. 2
- https://doi.org/10.1109/ijcnn.1989.118698
Abstract
The use of floating-gate nonvolatile memory technology for analog storage of connection strengths, or weights, has previously been proposed and demonstrated. The authors report the analog storage and multiply characteristics of a new floating-gate synapse and further discuss the architecture of a neural network which uses this synapse cell. In the architecture described, 8192 synapses are used to interconnect 64 neurons fully and to connect the 64 neurons to each of 64 inputs. Each synapse in the network multiplies a signed analog voltage by a stored weight and generates a differential current proportional to the product. Differential currents are summed on a pair of bit lines and transferred through a sigmoid function, appearing at the neuron output as an analog voltage. Input and output levels are compatible for ease in cascade-connecting these devices into multilayer networks. The width and height of weight-change pulses are calculated. The synapse cell size is 2009 μm² using 1-μm CMOS EEPROM technology.
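The signal path described in the abstract (synapse multiplies, bit lines sum differential currents, neuron applies a sigmoid) is mathematically a matrix-vector product followed by a nonlinearity. A minimal numerical sketch of that idealized behavior, not of the analog chip itself (the gain constant and weight scale here are illustrative assumptions):

```python
import numpy as np

def etann_layer(x, W, gain=1.0):
    """Idealized model of one ETANN layer.

    Each synapse multiplies an input voltage by its stored weight;
    summing the differential currents on a neuron's pair of bit lines
    amounts to a dot product, and the neuron's sigmoid transfer maps
    the summed current back to a bounded analog-like output voltage.
    """
    # Bit-line current summation == matrix-vector product
    net = W @ x
    # Sigmoid transfer; tanh keeps outputs in (-1, 1), matching the
    # signed analog levels so layers can be cascade-connected
    return np.tanh(gain * net)

# 64 inputs fully connected to 64 neurons (one 64x64 weight block;
# the chip's 8192 synapses also include a 64x64 feedback block)
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(64, 64))
x = rng.uniform(-1.0, 1.0, size=64)
y = etann_layer(x, W)
```

Because the output range matches the input range, `y` can be fed directly into another `etann_layer` call, mirroring the cascade-connection of multiple chips into a multilayer network.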