A stochastic architecture for neural nets
- 1 January 1988
- conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE)
- pp. 481-488, vol. 1
- https://doi.org/10.1109/icnn.1988.23882
Abstract
A stochastic digital architecture is described for simulating the operation of Hopfield neural networks. This architecture provides reprogrammability (synaptic weights are stored in digital shift registers), large dynamic range (using either fixed- or floating-point weights), annealing (by coupling variable neuron gains with noise from stochastic arithmetic), high execution speeds (approximately N×10^8 connections per second), expandability (multiple chips can be cascaded to host large networks), and practicality (the design is buildable with very conservative MOS device technologies).
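The annealing mechanism the abstract describes (variable neuron gain combined with stochastic noise, cooled over time) can be illustrated with a small software sketch. The code below is a minimal, hypothetical simulation of annealed recall in a Hopfield network, not the paper's hardware architecture: neuron gain is modeled as an inverse temperature in a sigmoid update, and the temperature is lowered on a geometric schedule. All function and parameter names here are illustrative assumptions.

```python
import numpy as np

def hopfield_anneal(W, s, T0=2.0, Tmin=0.05, decay=0.9, sweeps=50, rng=None):
    """Anneal a Hopfield network with +/-1 neurons.

    Noisy stochastic single-neuron updates are applied while the
    temperature T (inverse of neuron gain) is gradually reduced,
    loosely mirroring the gain/noise coupling described in the
    abstract. Illustrative sketch only, not the paper's design.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    s = s.copy()
    T = T0
    while T > Tmin:
        for _ in range(sweeps):
            i = rng.integers(len(s))                 # pick a random neuron
            h = W[i] @ s                             # its local field
            p = 1.0 / (1.0 + np.exp(-2.0 * h / T))   # sigmoid; gain ~ 1/T
            s[i] = 1 if rng.random() < p else -1     # stochastic update
        T *= decay                                   # cool the schedule
    return s

# Store one pattern with a Hebbian outer-product rule, then recall a
# corrupted version of it.
pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = np.outer(pattern, pattern).astype(float)
np.fill_diagonal(W, 0.0)                             # no self-connections

noisy = pattern.copy()
noisy[:2] *= -1                                      # corrupt two bits
recalled = hopfield_anneal(W, noisy)
```

At high temperature the updates are nearly random (high noise, low gain); as the schedule cools, the sigmoid sharpens toward a deterministic threshold and the state settles into a stored minimum, here the pattern or its negation.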
References
- Simple 'neural' optimization networks: An A/D converter, signal decision circuit, and a linear programming circuit. IEEE Transactions on Circuits and Systems, 1986
- VLSI implementation of a neural network memory with several hundreds of neurons. AIP Conference Proceedings, 1986
- Digital signal processor accelerators for neural network simulations. AIP Conference Proceedings, 1986
- An Efficient Heuristic Procedure for Partitioning Graphs. Bell System Technical Journal, 1970