Abstract
The effects of silicon implementation on the backpropagation learning rule in artificial neural systems are examined. The effects on learning performance of limited weight resolution, range limitations, and the steepness of the activation function are considered. A minimum resolution of about 20–22 bits is generally required, but this figure can be reduced to about 14–15 bits by properly choosing the learning parameter η, which yields good performance in the presence of limited resolution. This performance can be further improved by using a modified batch backpropagation rule. The theoretical analysis is compared with ad hoc simulations, and the results are discussed in detail.
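The resolution effect summarized above can be illustrated with a minimal sketch. A gradient step smaller than half the quantization step of the stored weights is simply lost, which is why the learning rate η and the weight resolution interact. (The uniform quantization scheme and the weight range below are illustrative assumptions, not the paper's hardware model.)

```python
import numpy as np

def quantize(w, bits, w_max=1.0):
    # Uniformly quantize weights to `bits` bits over [-w_max, w_max].
    # Both the range and the uniform scheme are assumptions for illustration.
    step = 2.0 * w_max / (2 ** bits)
    return np.clip(np.round(w / step) * step, -w_max, w_max)

# A weight update smaller than half the quantization step vanishes:
w = quantize(np.array([0.5]), bits=8)
eta, grad = 0.001, np.array([1.0])
w_low  = quantize(w - eta * grad, bits=8)   # update lost at 8 bits
w_high = quantize(w - eta * grad, bits=16)  # update survives at 16 bits
print(np.allclose(w, w_low), np.allclose(w, w_high))
```

Under this toy model, increasing η (or using batch accumulation before quantizing, as in the modified batch rule mentioned above) makes more updates exceed the quantization threshold.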