Entropy of natural languages: Theory and experiment
Open Access
- 31 May 1994
- journal article
- Published by Elsevier in Chaos, Solitons, and Fractals
- Vol. 4 (5), 709-743
- https://doi.org/10.1016/0960-0779(94)90079-5
Abstract
No abstract available
This publication has 8 references indexed in Scilit:
- Word frequency and entropy of symbolic sequences: a dynamical perspective. Chaos, Solitons & Fractals, 1992
- Der bekannte Grenzwert der redundanzfreien Information in Texten - eine Fehlinterpretation der Shannonschen Experimente? [The well-known limit of redundancy-free information in texts: a misinterpretation of Shannon's experiments?]. Frequenz, 1990
- Estimating the information content of symbol sequences and efficient codes. IEEE Transactions on Information Theory, 1989
- A convergent gambling estimate of the entropy of English. IEEE Transactions on Information Theory, 1978
- On a Statistical Estimate for the Entropy of a Sequence of Independent Random Variables. Theory of Probability and Its Applications, 1959
- Statistical calculation of word entropies for four Western languages. IEEE Transactions on Information Theory, 1955
- Prediction and Entropy of Printed English. Bell System Technical Journal, 1951
- A Mathematical Theory of Communication. Bell System Technical Journal, 1948