Abstract
Reviews a class of neural networks often termed 'RAM-based networks'. As this paper shows, the networks are distinguished by their use of 'logical' 1-in-n decoders as a pre-process to each neuron. The paper explains why the networks have also been termed weightless systems. Two sub-classes of binary neural networks are described: those which use binary weights and a single layer of neurons [comprising the multi-RAM discriminator (MRD), ADAM multiprocessor and WISARD pattern-recognition machine], and those which use multi-valued weights and multiple layers of neurons [comprising the probabilistic logic node (PLN), probabilistic RAM (pRAM), goal-seeking neuron (GSN) and time-integrating neuron (TIN) networks]. The paper traces the evolution of these networks and describes the benefits of this class of neural network for hardware implementation.
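The single-layer, binary-weight sub-class mentioned above can be illustrated with a minimal sketch of a WISARD-style multi-RAM discriminator. This is not the paper's own code; the class name, tuple size and random mapping are illustrative assumptions. Each 'neuron' is a small RAM addressed by an n-bit tuple taken from the binary input pattern, so the 1-in-2^n decode is implicit in the address lookup; training simply writes a 1 at the addressed location, and recall counts how many RAMs respond.

```python
import random

class RAMDiscriminator:
    """Minimal sketch of a WISARD-style multi-RAM discriminator
    (illustrative only; names and parameters are assumptions)."""

    def __init__(self, input_bits, tuple_size, seed=0):
        rng = random.Random(seed)
        bits = list(range(input_bits))
        rng.shuffle(bits)
        # Partition the input bits into random n-tuples, one per RAM neuron.
        self.mapping = [bits[i:i + tuple_size]
                        for i in range(0, input_bits, tuple_size)]
        self.rams = [dict() for _ in self.mapping]  # sparse RAM contents

    def _addresses(self, pattern):
        # Each tuple of selected input bits forms one RAM address
        # (the 'logical' 1-in-2^n decode).
        for tup in self.mapping:
            yield tuple(pattern[i] for i in tup)

    def train(self, pattern):
        # Write a 1 at each addressed location: binary 'weights',
        # stored rather than computed -- hence 'weightless'.
        for ram, addr in zip(self.rams, self._addresses(pattern)):
            ram[addr] = 1

    def response(self, pattern):
        # Recall: count the RAMs that output 1 for this pattern.
        return sum(ram.get(addr, 0)
                   for ram, addr in zip(self.rams, self._addresses(pattern)))

d = RAMDiscriminator(input_bits=8, tuple_size=2)
d.train([1, 0, 1, 1, 0, 0, 1, 0])
print(d.response([1, 0, 1, 1, 0, 0, 1, 0]))  # 4: every RAM fires on a trained pattern
```

A trained pattern scores the maximum (one hit per RAM), while patterns that differ in some bits score lower, which is what makes the summed RAM response usable as a discriminator output. The RAM-lookup structure is also why this class maps so directly onto conventional memory hardware.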
