Abstract
The principle of maximum entropy for the assignment of estimated probabilities is intriguing. Since entropy is the negative of information, the principle has accordingly been renamed. The formalisms for discrete and continuous random variables are described and illustrated. The problem of complete ignorance is discussed, and the concept of quasi-ignorance is introduced as a substitute. Constraints on the probabilities, beyond the minimum, are a source of consternation and a stumbling block to the application of the principle. Many questions are raised, but no answers are given. At present, the principle cannot be invoked to solve practical problems in reliability. Theoreticians should apply themselves to finding realistic, tractable constraints that do not involve logical contradictions.
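For orientation, a minimal sketch of the formalism the abstract refers to, written in standard textbook notation rather than the paper's own; the probabilities $p_i$, density $f$, constraint functions $g_k$, and constants $c_k$ are generic placeholders, not symbols taken from the paper.

```latex
% Discrete case: choose p_1, ..., p_n to maximize the entropy
% subject to normalization and any given moment-type constraints.
\max_{p_1,\dots,p_n}\; H(p) = -\sum_{i=1}^{n} p_i \log p_i
\quad \text{subject to} \quad
\sum_{i=1}^{n} p_i = 1, \qquad
\sum_{i=1}^{n} p_i\, g_k(x_i) = c_k .

% Continuous case: maximize the differential entropy of a density f
% under the analogous integral constraints.
\max_{f}\; H(f) = -\int f(x) \log f(x)\, dx
\quad \text{subject to} \quad
\int f(x)\, dx = 1, \qquad
\int f(x)\, g_k(x)\, dx = c_k .
```

With only the normalization constraint, the discrete solution is the uniform distribution; adding further constraints is precisely the step the abstract identifies as the stumbling block.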
