Abstract
The authors develop a mathematical model of the effects of synaptic arithmetic noise in multilayer perceptron (MLP) training. Predictions are made regarding enhanced fault tolerance, improved generalization ability, and an improved learning trajectory. These predictions are subsequently verified by simulation. The results are general and have significant implications for the accuracy requirements of MLP training, particularly in the analog domain.
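As a concrete illustration of the training regime analyzed, the following Python sketch injects multiplicative Gaussian noise into the synaptic weights of a small MLP on every forward pass during backpropagation training. This is not the authors' implementation; the network size, task (XOR), noise level sigma, and learning rate are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    sigma = 0.1   # assumed relative noise amplitude (illustrative)
    lr = 0.5      # assumed learning rate (illustrative)

    # Tiny 2-4-1 MLP on XOR, sigmoid activations throughout.
    W1 = rng.normal(0.0, 0.5, (2, 4)); b1 = np.zeros(4)
    W2 = rng.normal(0.0, 0.5, (4, 1)); b2 = np.zeros(1)

    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    for epoch in range(20000):
        # Each use of a weight is perturbed multiplicatively,
        # W -> W * (1 + N(0, sigma^2)), modelling imprecise (e.g. analog)
        # synaptic arithmetic; fresh noise is drawn on every presentation.
        W1n = W1 * (1.0 + sigma * rng.standard_normal(W1.shape))
        W2n = W2 * (1.0 + sigma * rng.standard_normal(W2.shape))

        h = sigmoid(X @ W1n + b1)     # noisy forward pass, hidden layer
        out = sigmoid(h @ W2n + b2)   # noisy forward pass, output layer

        # Backpropagate through the noisy weights; the updates are
        # applied to the underlying noise-free weights.
        d_out = (out - y) * out * (1.0 - out)
        d_h = (d_out @ W2n.T) * h * (1.0 - h)
        W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

    print(np.round(out.ravel(), 2))   # trained outputs for the four XOR inputs

Multiplicative noise is used in the sketch because arithmetic error in analog hardware typically scales with the stored weight; an additive-noise variant would instead perturb as W + sigma * N(0, 1).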