Abstract
Generalization properties of multilayered neural networks with binary couplings are studied in the high-temperature limit. The transition to the perfect-generalization phase is evaluated for systems with an arbitrary number of layers. It is found that the thermodynamic transition occurs at a smaller number of examples than for the perceptron, whereas the opposite holds for the transition at which the poor-generalization solution disappears. The generalization error is also decomposed according to the contributions from different numbers of hidden neurons whose internal field has the wrong sign. This allows the authors to describe the generalization behaviour of the hidden neurons.
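The decomposition mentioned above can be illustrated numerically. The following is a minimal sketch, not the paper's calculation: it assumes a tree committee machine with K = 3 binary-coupling hidden units, a hypothetical student that disagrees with the teacher on roughly 10% of couplings, and counts, for each test example, the number r of hidden units whose internal field has the wrong sign. All sizes and the flip rate are illustrative choices.

```python
import numpy as np

# Illustrative sketch (parameters are assumptions, not from the paper):
# tree committee machine, K hidden units, binary couplings in {-1, +1}.
rng = np.random.default_rng(0)
K, N, P = 3, 101, 5000            # hidden units, inputs per branch, test examples

teacher = rng.choice([-1, 1], size=(K, N))
# Hypothetical student: agrees with the teacher except on ~10% of couplings.
student = np.where(rng.random((K, N)) < 0.1, -teacher, teacher)

X = rng.choice([-1, 1], size=(P, K, N))   # one input block per branch (tree structure)

t_hidden = np.sign(np.einsum('pkn,kn->pk', X, teacher))  # teacher hidden-unit outputs
s_hidden = np.sign(np.einsum('pkn,kn->pk', X, student))  # student hidden-unit outputs
t_out = np.sign(t_hidden.sum(axis=1))                    # majority vote of hidden units
s_out = np.sign(s_hidden.sum(axis=1))

r = (t_hidden != s_hidden).sum(axis=1)     # wrong-sign hidden units per example
err_total = np.mean(t_out != s_out)        # overall generalization error

# Contribution to the error from examples with exactly k wrong hidden units;
# these contributions sum to the total error, and k = 0 contributes nothing.
contribs = [np.mean((t_out != s_out) & (r == k)) for k in range(K + 1)]
```

Because the events r = 0, 1, …, K partition the test set, the contributions sum exactly to the total error, and examples with no wrong-sign hidden unit never contribute, which is the kind of per-r bookkeeping the abstract refers to.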