Mutual Information and Minimum Mean-Square Error in Gaussian Channels
- 4 April 2005
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 51 (4), 1261-1282
- https://doi.org/10.1109/tit.2005.844072
Abstract
This paper deals with arbitrarily distributed finite-power input signals observed through an additive Gaussian noise channel. It presents a new formula connecting the input-output mutual information and the minimum mean-square error (MMSE) achievable by optimal estimation of the input given the output: the derivative of the mutual information (in nats) with respect to the signal-to-noise ratio (SNR) is equal to half the MMSE, regardless of the input statistics. This relationship holds for both scalar and vector signals, as well as for discrete-time and continuous-time noncausal MMSE estimation. This fundamental information-theoretic result has an unexpected consequence in continuous-time nonlinear estimation: for any input signal with finite power, the causal filtering MMSE achieved at SNR equals the average of the noncausal smoothing MMSE achieved over a channel whose SNR is uniformly distributed between 0 and SNR.
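A minimal restatement of the relationship described in the abstract, using a scalar channel with unit-variance Gaussian noise; the notation below is chosen here for illustration and is not taken verbatim from the paper:

```latex
% Scalar Gaussian channel (illustrative notation):
%   Y = \sqrt{\mathrm{snr}}\, X + N, \quad N \sim \mathcal{N}(0,1), \quad X \ \text{independent of}\ N.
\[
  \frac{d}{d\,\mathrm{snr}}\, I\!\left(X;\ \sqrt{\mathrm{snr}}\, X + N\right)
  \;=\; \tfrac{1}{2}\,\mathrm{mmse}(\mathrm{snr}),
  \qquad
  \mathrm{mmse}(\mathrm{snr})
  \;=\; \mathbb{E}\!\left[\bigl(X - \mathbb{E}[X \mid \sqrt{\mathrm{snr}}\, X + N]\bigr)^{2}\right],
\]
% with the mutual information measured in nats. The continuous-time consequence stated
% in the abstract relates the causal (filtering) and noncausal (smoothing) errors:
\[
  \mathrm{cmmse}(\mathrm{snr})
  \;=\; \frac{1}{\mathrm{snr}} \int_{0}^{\mathrm{snr}} \mathrm{mmse}(\gamma)\, d\gamma .
\]
```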