Abstract
The paper first reviews some essential points in the early development of languages, codes, and symbolism, singling out those fundamentals of human communication which have recently been summarized by precise mathematical theory. A survey of the development of telegraphy and telephony leads to the need for "economy," which has given rise to various systems of signal compression. Hartley's early theory of communication is summarized, and Gabor's theory of signal structure is described. The modern statistical theory of Wiener and Shannon, by which "information" may be expressed quantitatively, is shown to be a logical extension of Hartley's work. A section on calculating machines and brains attempts to clear up popular misunderstandings and to separate definite accomplishments in the mechanization of thought processes from mere myths. Finally, the work is generalized over the whole field of scientific observation, and evidence is presented supporting the view that "information plus entropy is an important invariant of a physical system."
