Abstract
Statistical mechanics relies on a complete, though probabilistic, description of a system in terms of all its microscopic variables. Its object is to derive therefrom static and dynamic properties involving some reduced set of variables. The elimination of the irrelevant variables is guided by the maximum entropy criterion, which produces the probability law carrying the least amount of information compatible with the relevant variables. This defines relevant entropies, which measure the missing information (the disorder) associated with the sole variables retained in an incomplete description. Relevant entropies depend not only on the state of the system but also on the coarseness of its reduced description. Their use sheds light on questions such as the Second Law, both in equilibrium and in irreversible thermodynamics, the projection method of statistical mechanics, Boltzmann's \textit{H}-theorem, and spin-echo experiments.
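The maximum-entropy construction invoked above can be sketched in its standard form (a textbook formulation, not reproduced verbatim from the paper). One maximizes the von Neumann entropy subject to constraints fixing the expectation values of the relevant observables $A_i$:

```latex
% Maximize the von Neumann entropy
%   S[D] = -\mathrm{Tr}\, D \ln D
% over density operators D, subject to the constraints
%   \mathrm{Tr}\,(D A_i) = a_i ,
% where the A_i are the relevant observables. Introducing Lagrange
% multipliers \lambda_i, the maximizing (least-informative) state is
\begin{equation}
  D_{\mathrm{R}} = \frac{1}{Z} \exp\Bigl(-\sum_i \lambda_i A_i\Bigr),
  \qquad
  Z = \mathrm{Tr}\,\exp\Bigl(-\sum_i \lambda_i A_i\Bigr),
\end{equation}
% with the \lambda_i adjusted so that the constraints hold. The
% associated relevant entropy, a function of the data a = \{a_i\} alone, is
\begin{equation}
  S_{\mathrm{R}}(a) = -\mathrm{Tr}\, D_{\mathrm{R}} \ln D_{\mathrm{R}}
                    = \ln Z + \sum_i \lambda_i a_i .
\end{equation}
```

Because $S_{\mathrm{R}}$ depends on which observables $A_i$ are retained, it quantifies the disorder attached to a given level of description, as stated in the abstract.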
