Cumulative Residual Entropy: A New Measure of Information
- 1 June 2004
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 50 (6), 1220-1228
- https://doi.org/10.1109/tit.2004.828057
Abstract
In this paper, we use the cumulative distribution of a random variable to define its information content and thereby develop an alternative measure of uncertainty that extends Shannon entropy to random variables with continuous distributions. We call this measure cumulative residual entropy (CRE). The salient features of CRE are as follows: 1) it is more general than the Shannon entropy in that its definition is valid in both the continuous and discrete domains, 2) it possesses more general mathematical properties than the Shannon entropy, and 3) it can be easily computed from sample data, and these computations asymptotically converge to the true values. The properties of CRE and a precise formula relating CRE and Shannon entropy are given in the paper. Finally, we present some applications of CRE to reliability engineering and computer vision.
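The abstract's point 3) — that CRE is easily computed from sample data — can be illustrated with a short sketch. For a nonnegative random variable, CRE is defined as the integral of -P(X > x) log P(X > x) over x > 0; the plug-in estimator below replaces the survival function with its empirical version on the sorted sample. This is an illustrative implementation based on that definition, not code from the paper; the function name is our own.

```python
import numpy as np

def cumulative_residual_entropy(samples):
    """Plug-in estimate of CRE from a sample (assumed hypothetical helper).

    For sorted values x_1 <= ... <= x_n, the empirical survival function is
    s_i = (n - i)/n on the interval [x_i, x_{i+1}), so the estimate is
        CRE_hat = -sum_{i=1}^{n-1} (x_{i+1} - x_i) * s_i * log(s_i).
    The paper defines CRE via |X|, so we take absolute values first.
    Requires at least two samples.
    """
    x = np.sort(np.abs(np.asarray(samples, dtype=float)))
    n = len(x)
    s = (n - np.arange(1, n)) / n   # empirical survival prob on each gap; never 0
    dx = np.diff(x)                 # interval widths x_{i+1} - x_i
    return -np.sum(dx * s * np.log(s))
```

As a sanity check, for the two-point sample {0, 1} the estimator gives (1/2) log 2, and on a dense uniform grid over [0, 1] it approaches the true CRE of the uniform distribution, 1/4.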