Abstract
Summary: Some mathematical remarks on information theory. This paper is the elaboration of a lecture given by the author at the “Statistische Dag 1960”. It contains the definitions of entropy and amount of information as given by Shannon, some remarks on their interpretation, and their generalizations to abstract probability spaces. Some remarks are made on the mathematical problems arising from these definitions. Shannon's definition is compared with the concept of “intrinsic accuracy”, or information, used in statistics as defined by Fisher. Some properties that these different kinds of “information” have in common are mentioned.
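As an illustration of the Shannon entropy the abstract refers to, the following is a minimal sketch (not from the paper itself) computing H(p) = -Σ pᵢ log₂ pᵢ for a finite probability distribution; the function name is hypothetical:

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy H(p) = -sum(p_i * log2(p_i)), measured in bits.

    Terms with p_i = 0 are skipped, following the convention 0 * log 0 = 0.
    """
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain and carries one bit of information.
print(shannon_entropy([0.5, 0.5]))  # → 1.0
# A certain outcome carries no information.
print(shannon_entropy([1.0]))       # → 0.0
```

The entropy is largest for the uniform distribution and zero for a degenerate one, which matches its interpretation as the expected amount of information gained by observing the outcome.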
