Abstract
The main purpose of the paper is to stimulate thought concerning the terminology, notation and exposition of some basic parts of information theory. The notation used here is intended to be simple, nearly self-explanatory, readable in words from left to right, and suggestive of new applications. Moreover, sufficient generality is preserved to ensure that entropy can be interpreted without necessarily depending on the frequentist definition of probability (as a limiting frequency in an infinite sequence of trials). Mention is made of some connections of the theory with inverse probability and with mathematical statistics in general.