Abstract
The structure of n-dimensional information theory is well adapted to the study of complex systems of many parts interacting in a nonsimple way, since it allows quantification of the degree to which the parts are interdependent, i.e., are in communication with each other. It is particularly well suited to hierarchical systems. Even when calculations are impossible, informal interpretations of informational equations shed interesting light on the behavior of systems. These informal interpretations are emphasized. It is shown that the requirements on a system for selection of appropriate information (and therefore blockage of irrelevant information), internal coordination of parts, and throughput are essentially additive and therefore compete for the computational resources of the system. This observation has implications for system architecture. It is shown that under certain assumptions (valid, for example, for networks of parallel processors) systems are constraint-losing as well as information-losing. The importance and usefulness of the loss of information by a system is discussed briefly.
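The quantification of interdependence among parts that the abstract alludes to can be illustrated with total correlation, a standard n-dimensional generalization of mutual information. The sketch below, with an invented joint distribution over three binary parts (the numbers are hypothetical, chosen only for illustration), computes the sum of the parts' entropies minus the joint entropy; this is zero exactly when the parts are independent and grows as they become more interdependent.

```python
from math import log2

# Hypothetical joint distribution over three binary parts (X, Y, Z);
# the probabilities are illustrative, not from the paper.
joint = {
    (0, 0, 0): 0.30, (0, 0, 1): 0.05,
    (0, 1, 0): 0.05, (0, 1, 1): 0.10,
    (1, 0, 0): 0.10, (1, 0, 1): 0.05,
    (1, 1, 0): 0.05, (1, 1, 1): 0.30,
}

def entropy(dist):
    """Shannon entropy in bits of a probability dict."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def marginal(joint, axis):
    """Marginal distribution of one coordinate of the joint."""
    m = {}
    for outcome, p in joint.items():
        m[outcome[axis]] = m.get(outcome[axis], 0.0) + p
    return m

# Total correlation: C = sum_i H(X_i) - H(X_1, ..., X_n).
# C = 0 iff the parts are statistically independent; larger C
# means the parts are more "in communication with each other".
n = len(next(iter(joint)))
total_correlation = sum(entropy(marginal(joint, i)) for i in range(n)) - entropy(joint)
print(round(total_correlation, 4))
```

Here each marginal is uniform (entropy 1 bit), so the total correlation is simply 3 minus the joint entropy; a positive result reflects the built-in tendency of the three parts to agree.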
