On the capacity of neural networks with binary weights

Abstract
The authors study the critical capacity α_c of multilayer networks with binary couplings. They show that, for any network with a tree-like architecture beyond the first hidden layer, no fixed internal representation needs to be imposed. Extending Gardner's calculations, they apply statistical mechanics to the simplest such network, one with two layers of adaptive weights. Following the same approach as for the binary perceptron, they obtain from the zero-entropy criterion a critical capacity α_c = 0.92. They discuss the validity of this result in the light of exhaustive-search simulations on small networks.
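The exhaustive-search simulations mentioned above can only be run on very small networks, since the number of binary weight configurations grows as 2^N. A minimal sketch of the idea, assuming a tree committee machine with three hidden units, each seeing a disjoint block of three ±1 inputs (the specific sizes, function names, and trial counts here are illustrative, not taken from the paper):

```python
import itertools
import numpy as np

def stores_all(patterns, labels, n_hidden=3, n_per_branch=3):
    """Exhaustively search all 2^(n_hidden*n_per_branch) binary weight
    configurations of a tree committee machine for one that reproduces
    every label; return True if such a configuration exists."""
    # Split each pattern into one disjoint block per hidden unit.
    X = patterns.reshape(len(patterns), n_hidden, n_per_branch)
    for w_flat in itertools.product((-1, 1), repeat=n_hidden * n_per_branch):
        w = np.array(w_flat).reshape(n_hidden, n_per_branch)
        # Hidden-unit outputs: sign of each branch's local field
        # (odd branch size with +/-1 inputs guarantees a nonzero field).
        hidden = np.sign(np.einsum('pkj,kj->pk', X, w))
        # Output unit: majority vote of the hidden units.
        out = np.sign(hidden.sum(axis=1))
        if np.array_equal(out, labels):
            return True
    return False

# Estimate the storable fraction at a few loads alpha = P/N.
rng = np.random.default_rng(0)
N = 9  # total inputs: 3 hidden units x 3 inputs each
trials = 20
for P in (6, 9, 12):
    hits = sum(
        stores_all(rng.choice((-1, 1), size=(P, N)),
                   rng.choice((-1, 1), size=P))
        for _ in range(trials)
    )
    print(f"alpha = {P/N:.2f}: storable fraction = {hits/trials:.2f}")
```

The capacity estimate is the load α at which the storable fraction drops from near 1 to near 0; finite-size effects at N = 9 make this crossover broad, which is why the abstract hedges on comparing such simulations with the thermodynamic result.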
