Determinant Inequalities via Information Theory
- 1 July 1988
- journal article
- Published by Society for Industrial & Applied Mathematics (SIAM) in SIAM Journal on Matrix Analysis and Applications
- Vol. 9 (3), pp. 384–392
- https://doi.org/10.1137/0609033
Abstract
Simple inequalities from information theory prove Hadamard’s inequality and some of its generalizations. It is also proven that the determinant of a positive definite matrix is log-concave and that the ratio of the determinant of the matrix to the determinant of its principal minor, $|K_n|/|K_{n-1}|$, is concave, establishing the concavity of minimum mean squared error in linear prediction. For Toeplitz matrices, the normalized determinant $|K_n|^{1/n}$ is shown to decrease with $n$.
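A minimal sketch of the entropy argument the abstract points to (standard Gaussian identities, not quoted from the paper): for $X \sim \mathcal{N}(0, K_n)$ with $K_n$ positive definite,
$$h(X) = \tfrac{1}{2}\log\big((2\pi e)^n |K_n|\big), \qquad h(X) \le \sum_{i=1}^n h(X_i) = \sum_{i=1}^n \tfrac{1}{2}\log\big(2\pi e\, K_{ii}\big) \;\Longrightarrow\; |K_n| \le \prod_{i=1}^n K_{ii},$$
which is Hadamard’s inequality. Likewise, the chain rule gives
$$h(X_n \mid X_1, \dots, X_{n-1}) = \tfrac{1}{2}\log\big(2\pi e\, |K_n|/|K_{n-1}|\big),$$
so the minor ratio in the abstract is exactly the minimum mean squared error of the best predictor of $X_n$ from the preceding coordinates, which is why its concavity translates into concavity of the prediction error.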