Abstract
The search for the stochastic complexity of the observed data, defined as the greatest lower bound on the code length with which the data can be encoded, represents a global maximum likelihood principle, which permits comparison of models regardless of the number of parameters in them. For important special classes, such as the Gaussian and the multinomial models, formulas for the stochastic complexity give new and powerful model selection criteria, while in the general case approximations can be computed with the MDL principle. Once a model is found that attains the stochastic complexity, there is nothing further to learn from the data with the proposed models. The basic notions are reviewed and numerical examples are given.
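As an illustration (not a formula reproduced from the paper itself), a common asymptotic approximation to the stochastic complexity of data $x^n$ under a parametric model class $\mathcal{M}_k$ with $k$ free parameters is the MDL-type criterion

$$
I(x^n \mid \mathcal{M}_k) \;\approx\; -\log f\!\left(x^n \mid \hat\theta_k\right) + \frac{k}{2}\log n,
$$

where $\hat\theta_k$ denotes the maximum likelihood estimate and $n$ the sample size; among competing model classes, the one minimizing this code length is selected. The notation $I(\cdot)$ and $f(\cdot)$ here is illustrative and may differ from the paper's own.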
