Simple proof of the concavity of the entropy power with respect to added Gaussian noise
- 1 July 1989
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 35 (4), 887-888
- https://doi.org/10.1109/18.32166
Abstract
A very simple proof of M. H. Costa's result (see ibid., vol. IT-31, p. 751-760, 1985) that the entropy power of X_t = X + N(0, tI) is concave in t is derived as an immediate consequence of an inequality concerning Fisher information. This relationship between Fisher information and entropy has also been found useful for proving the central limit theorem. Thus, one who seeks new entropy inequalities should first try to find new inequalities about Fisher information, or at least to exploit the existing ones in new ways.
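For context, the quantities the abstract refers to can be written out explicitly. These are the standard definitions of entropy power, Costa's concavity theorem, and de Bruijn's identity linking entropy to Fisher information; they are supplied here for orientation, not quoted from the paper itself:

```latex
% Entropy power of an n-dimensional random vector X with density f:
N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n},
\qquad h(X) = -\int f(x)\,\log f(x)\,dx .

% Costa's theorem: for X_t = X + N(0, tI), with the noise independent of X,
\frac{d^{2}}{dt^{2}}\, N(X_t) \;\le\; 0, \qquad t > 0 .

% The link to Fisher information I(X_t) is de Bruijn's identity:
\frac{d}{dt}\, h(X_t) \;=\; \tfrac{1}{2}\, I(X_t).
```

Differentiating N(X_t) twice and applying de Bruijn's identity reduces the concavity claim to an inequality on I(X_t), which is the route the abstract describes.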
This publication has 7 references indexed in Scilit:
- Entropy and the Central Limit Theorem (The Annals of Probability, 1986)
- A new entropy power inequality (IEEE Transactions on Information Theory, 1985)
- On the Gaussian interference channel (IEEE Transactions on Information Theory, 1985)
- The information theoretic proof of Kac's theorem (Proceedings of the Japan Academy, Series A, Mathematical Sciences, 1970)
- The convolution inequality for entropy powers (IEEE Transactions on Information Theory, 1965)
- Some inequalities satisfied by the quantities of information of Fisher and Shannon (Information and Control, 1959)
- A Mathematical Theory of Communication (Bell System Technical Journal, 1948)