A Comparative Assessment of Various Measures of Entropy
- 1 January 1983
- journal article
- research article
- Published by Taru Publications in Journal of Information and Optimization Sciences
- Vol. 4 (3), 207-232
- https://doi.org/10.1080/02522667.1983.10698762
Abstract
A large number of measures of entropy have been proposed by Hartley [21], Shannon [46], Rényi [43], Havrda and Charvát [22], Aczél and Daróczy [5], Kapur [25, 26, 27], Rathie [42], Behara and Chawla [11], Sharma and Taneja [47] and others. These do not measure the same entity. Moreover, the definitions of the various measures have been motivated by quite different considerations. The use of the same word ‘entropy’ for so many intrinsically different entities is confusing and unfortunate. The present paper attempts to explain each of the entropies in its proper perspective by making a comparative assessment of the various measures proposed.
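The abstract's point that these measures "do not measure the same entity" can be illustrated numerically. The sketch below uses the standard textbook forms of four of the measures named above (Hartley, Shannon, Rényi, and Havrda-Charvát), not the paper's own notation; the distribution and parameter choices are illustrative assumptions.

```python
import math

def shannon(p):
    """Shannon entropy in bits: -sum p_i * log2(p_i)."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def hartley(p):
    """Hartley entropy: log2 of the number of outcomes with nonzero probability."""
    return math.log2(sum(1 for x in p if x > 0))

def renyi(p, alpha):
    """Rényi entropy of order alpha (alpha != 1); tends to Shannon as alpha -> 1."""
    return math.log2(sum(x ** alpha for x in p if x > 0)) / (1 - alpha)

def havrda_charvat(p, beta):
    """Havrda-Charvát entropy of degree beta (beta != 1), normalized so it
    reduces to Shannon entropy in bits as beta -> 1."""
    return (sum(x ** beta for x in p if x > 0) - 1) / (2 ** (1 - beta) - 1)

# One distribution, four different "entropies", four different values.
p = [0.5, 0.25, 0.25]
print(shannon(p))            # 1.5
print(hartley(p))            # ~1.585
print(renyi(p, 2))           # ~1.415
print(havrda_charvat(p, 2))  # 1.25
```

For a uniform distribution all four coincide (at log2 n bits), which is one reason the measures are so easily conflated; they diverge exactly when the distribution is non-uniform.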
This publication has 19 references indexed in Scilit:
- A mixed theory of information. V. How to keep the (inset) expert honest. Journal of Mathematical Analysis and Applications, 1980
- A mixed theory of information. III. Inset entropies of degree β. Information and Control, 1978
- Why the Shannon and Hartley entropies are ‘natural’. Advances in Applied Probability, 1974
- The Theory of Stochastic Preference and Brand Switching. Journal of Marketing Research, 1974
- Weighted entropy. Reports on Mathematical Physics, 1971
- On the Use of Markov Chains in Movement Research. Economic Geography, 1970
- A hundred years of entropy. Physics Today, 1968
- Charakterisierung der Entropien positiver Ordnung und der Shannonschen Entropie. Acta Mathematica Hungarica, 1963
- On a Functional Equation. Edinburgh Mathematical Notes, 1960
- Transmission of Information. Bell System Technical Journal, 1928