Information theory and inverse probability in telecommunication

Abstract
The foundations of information theory are presented as an extension of the theory of inverse probability. By postulating that information is additive and taking suitable averages, all the essential definitions of Shannon's theory for discrete and continuous communication channels, with and without noise, are obtained. The theory is based on the idea that receiving a communication, or making an observation, merely changes the relative probabilities of the various possible messages. The whole process of reception can therefore be regarded as a means of evaluating a posteriori probabilities, and this leads to the idea that the optimum receiver in any telecommunication problem can always be specified, in principle, by inverse probability. The simplest instance is the correlation receiver for detecting very weak signals in the presence of noise, and its theory is briefly discussed. The paper concludes with an answer to possible criticisms of the use of inverse probability.
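The reception-as-inference idea in the abstract can be illustrated with a minimal sketch (not the paper's own derivation): for known candidate waveforms in additive white Gaussian noise, the posterior probability of each message reduces, via Bayes' rule, to a correlation term minus an energy term, which is exactly the correlation receiver. The waveforms, noise level, and prior below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two known candidate waveforms (orthogonal sinusoids) with equal priors.
n = 200
t = np.arange(n) / n
s0 = np.sin(2 * np.pi * 5 * t)
s1 = np.sin(2 * np.pi * 9 * t)
messages = [s0, s1]
prior = np.array([0.5, 0.5])

sigma = 2.0                                   # noise std (weak-signal regime)
received = s1 + sigma * rng.standard_normal(n)  # message 1 was actually sent

# Inverse probability: posterior ∝ prior × likelihood.  For white Gaussian
# noise the log-likelihood of message k is, up to a message-independent
# constant, (r·s_k - ½‖s_k‖²) / σ² — a correlation minus half the energy.
log_post = np.array([
    np.log(p) + (received @ s - 0.5 * (s @ s)) / sigma**2
    for p, s in zip(prior, messages)
])
posterior = np.exp(log_post - log_post.max())  # subtract max for stability
posterior /= posterior.sum()

print("posterior probabilities:", posterior)
print("decision: message", int(posterior.argmax()))
```

Because both candidates have equal energy and equal priors here, the decision collapses to picking the message with the larger correlation against the received waveform, which is the sense in which the optimum Bayesian receiver is a correlation receiver.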
