Storage capacity and generalization error for the reversed-wedge Ising perceptron

Abstract
Using the replica formalism, we evaluate the storage capacity and the generalization error of a perceptron with a reversed-wedge transfer function and binary synaptic weights. Remarkably, both the storage capacity and the generalization threshold saturate the information-theoretic bound α = 1 (an upper bound for storage and a lower bound for generalization) for a specific width of the reversed wedge, suggesting that this perceptron may be an interesting building block for neural networks.
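As a minimal sketch of the model, in standard notation that is assumed here rather than taken from the abstract (the symbol γ for the wedge width and ξ for the input pattern are not fixed by the text above), the reversed-wedge transfer function acts on the local field of an Ising (binary-weight) perceptron as

% Reversed-wedge transfer function of assumed width \gamma applied to the
% local field h of an Ising perceptron with couplings J_i \in \{-1,+1\}.
\[
  \sigma(h) \;=\; \operatorname{sign}\!\bigl[(h-\gamma)\,h\,(h+\gamma)\bigr],
  \qquad
  h \;=\; \frac{1}{\sqrt{N}} \sum_{i=1}^{N} J_i\,\xi_i ,
  \qquad J_i \in \{-1,+1\}.
\]

For γ → 0 this expression reduces to the ordinary sign perceptron, which is why the wedge width is the natural tuning parameter referred to in the abstract.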
