Information gain within nonextensive thermostatistics

Abstract
We discuss the information-theoretic foundations of the Kullback information gain, recently generalized within a nonextensive thermostatistical formalism. Its general properties are studied and, in particular, a consistent test for measuring the degree of correlation between random variables is proposed. In addition, minimum-entropy distributions are discussed and the H-theorem is proved within the generalized context.
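The abstract refers to the nonextensive generalization of the Kullback information gain without giving its explicit form. As a rough illustration only, the sketch below computes the standard Tsallis-type generalization, K_q(p||p') = (1/(q-1)) Σ_i p_i [(p_i/p'_i)^(q-1) - 1], which recovers the ordinary Kullback-Leibler divergence as q → 1, and applies it to the joint distribution versus the product of marginals as a correlation test of the kind the abstract mentions. The function names and the specific test construction are assumptions for illustration, not necessarily the paper's exact formulation.

```python
import numpy as np

def kullback_gain_q(p, p_prime, q):
    """Tsallis-type generalized Kullback information gain K_q(p || p').

    Assumed form (not spelled out in the abstract):
        K_q = (1/(q-1)) * sum_i p_i * [(p_i / p'_i)**(q-1) - 1]
    which reduces to the usual Kullback-Leibler divergence as q -> 1.
    """
    p = np.asarray(p, dtype=float)
    p_prime = np.asarray(p_prime, dtype=float)
    if np.isclose(q, 1.0):
        return float(np.sum(p * np.log(p / p_prime)))  # q -> 1 limit
    return float(np.sum(p * ((p / p_prime) ** (q - 1) - 1.0)) / (q - 1.0))

def correlation_test_q(joint, q):
    """Illustrative correlation measure for two discrete random variables:
    K_q between the joint distribution and the product of its marginals.
    It vanishes when the variables are independent (joint = product)."""
    joint = np.asarray(joint, dtype=float)
    marg_x = joint.sum(axis=1, keepdims=True)
    marg_y = joint.sum(axis=0, keepdims=True)
    return kullback_gain_q(joint.ravel(), (marg_x * marg_y).ravel(), q)

# Example: a weakly correlated 2x2 joint distribution
joint = np.array([[0.30, 0.20],
                  [0.15, 0.35]])
print(correlation_test_q(joint, q=1.0))  # ordinary mutual information
print(correlation_test_q(joint, q=2.0))  # nonextensive (q = 2) analogue
```

In this reading, a larger value of the q-generalized gain between the joint distribution and the product of the marginals signals stronger statistical correlation, with zero attained for independent variables; the nonextensivity index q tunes how heavily large probability ratios are weighted.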
