Symmetrical overhead in flow networks

Abstract
Recent work by one of the authors has identified the average mutual information and the conditional entropy as two measures from information theory that are useful in quantifying system organization and incoherence, respectively. While the scaled average mutual information, or network ascendency, is inherently symmetrical with respect to inputs and outputs, the scaled conditional entropy, or overhead, remains asymmetrical. Employing the joint entropy instead of the conditional entropy to characterize the overhead results in a symmetrical overhead and also permits the decomposition of the system capacity, or complexity, into components useful in following the response of the whole system to perturbations.
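
For concreteness, the decomposition referred to above can be sketched in the standard flow-network notation (an assumption for illustration, not reproduced from the abstract): let $T_{ij}$ denote the flow from compartment $i$ to compartment $j$, let $T_{i\cdot}=\sum_j T_{ij}$ and $T_{\cdot j}=\sum_i T_{ij}$ be the row and column totals, and let $T=\sum_{i,j} T_{ij}$ be the total system throughput. The scaled measures then take the usual forms

\begin{align*}
A    &= \sum_{i,j} T_{ij}\,\log\!\frac{T_{ij}\,T}{T_{i\cdot}\,T_{\cdot j}}
       && \text{(scaled average mutual information: ascendency)}\\
C    &= -\sum_{i,j} T_{ij}\,\log\!\frac{T_{ij}}{T}
       && \text{(scaled joint entropy: capacity)}\\
\Phi &= C - A = -\sum_{i,j} T_{ij}\,\log\!\frac{T_{ij}^{\,2}}{T_{i\cdot}\,T_{\cdot j}}
       && \text{(symmetrical overhead)}
\end{align*}

Because $\Phi$ involves the row and column totals on an equal footing, it is unchanged when the roles of inputs and outputs are exchanged, whereas an overhead built from a scaled conditional entropy such as $-\sum_{i,j} T_{ij}\log(T_{ij}/T_{i\cdot})$ depends on which marginal is conditioned on.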
