Abstract
The minimum description length (MDL) principle is an information-theoretic method for learning models from data. This paper presents an approach to using an MDL-based cost function efficiently with neural networks. As usual, the cost function can be used to adapt the parameters of the network, but it can also include terms that measure the complexity of the network's structure and can therefore be used to determine the optimal structure. The basic idea is to convert a conventional neural network so that each parameter and each neuron output is assigned a mean and a variance. This greatly simplifies the computation of the description length and of its gradient with respect to the parameters, which can then be adapted by standard gradient descent.
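The flavor of such a scheme can be sketched with a minimal NumPy example. This is not the paper's construction but an illustration under added assumptions: a single linear unit, Gaussian distributions over the weights (a mean and a variance per weight), a fixed Gaussian weight prior, and a Gaussian output-noise model. The description length is then, up to constants, the expected data misfit plus the cost of describing the weights (a KL term), and both terms have closed-form gradients.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = w_true . x + noise
w_true = np.array([1.5, -0.8])
X = rng.normal(size=(100, 2))
y = X @ w_true + 0.1 * rng.normal(size=100)

# Each weight carries a mean and a (log-)variance, as in the abstract's idea
mu = np.zeros(2)
log_var = np.zeros(2)

prior_var = 1.0   # assumed Gaussian weight prior N(0, prior_var)
noise_var = 0.01  # assumed Gaussian output-noise variance

lr = 5e-5
for step in range(5000):
    var = np.exp(log_var)
    # Propagate mean and variance through the linear unit:
    # E[w.x] = mu.x,  Var[w.x] = sum_i var_i * x_i^2
    pred_mean = X @ mu
    pred_var = X**2 @ var
    resid = y - pred_mean
    # Expected misfit: description length of the errors (up to constants)
    data_cost = ((resid**2 + pred_var) / (2 * noise_var)).sum()
    # KL(N(mu, var) || N(0, prior_var)): cost of describing the weights
    kl = 0.5 * ((var + mu**2) / prior_var - 1 - log_var + np.log(prior_var)).sum()
    # Closed-form gradients of data_cost + kl w.r.t. mu and log_var
    grad_mu = -(X.T @ resid) / noise_var + mu / prior_var
    grad_log_var = var * ((X**2).sum(axis=0) / (2 * noise_var)
                          + 0.5 / prior_var) - 0.5
    mu -= lr * grad_mu
    log_var -= lr * grad_log_var

# mu should approach w_true; the variances shrink where the data is informative
```

Because the distributions are Gaussian and the unit is linear, the expected error and the KL term are analytic, so no sampling is needed; this is the simplification the mean-and-variance representation buys.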
