Nix Barnett, James Crutchfield, Ryan James
Paper #: 16-01-001
A central task in analyzing complex dynamics is to determine the loci of information storage and the communication topology of information flows within a system. Over the last decade and a half, diagnostics for the latter have come to be dominated by the transfer entropy. Via straightforward examples, we show that it and a derivative quantity, the causation entropy, do not, in fact, quantify the flow of information. They can simultaneously overestimate flow and underestimate influence. We isolate why this is the case and propose alternate measures of information flow. An auxiliary consequence is that the now-common use of networks as theoretical models of large-scale systems, in concert with transfer-like entropies, has shoehorned dyadic relationships into our structural interpretation of the organization and behavior of complex systems, despite the widespread occurrence of polyadic dependencies. The net result is that much of the sophisticated organization of complex systems goes undetected.
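One standard way to see how a transfer-entropy-style measure can miss polyadic influence is the XOR process: the sketch below (not taken from the paper; the process and conditioning choices are illustrative assumptions) computes, from the exact joint distribution, the transfer entropy T_{X→W} = I(W_{t+1}; X_t | W_t) for a node W whose next state is the XOR of two independent inputs, and contrasts it with a causation-entropy-style quantity that also conditions on the other input Y_t.

```python
from collections import Counter
from itertools import product
from math import log2

def cond_mi(joint, t_idx, s_idx, c_idx):
    """I(T; S | C) in bits, from a joint distribution {outcome_tuple: prob}."""
    def marg(idx):
        m = Counter()
        for outcome, p in joint.items():
            m[tuple(outcome[i] for i in idx)] += p
        return m
    p_tsc = marg(t_idx + s_idx + c_idx)
    p_tc = marg(t_idx + c_idx)
    p_sc = marg(s_idx + c_idx)
    p_c = marg(c_idx)
    mi = 0.0
    for outcome, p in p_tsc.items():
        t = outcome[:len(t_idx)]
        s = outcome[len(t_idx):len(t_idx) + len(s_idx)]
        c = outcome[len(t_idx) + len(s_idx):]
        mi += p * log2(p * p_c[c] / (p_tc[t + c] * p_sc[s + c]))
    return mi

# Joint over (X_t, Y_t, W_t, W_{t+1}): X_t, Y_t are independent fair coins,
# and W_t = X_{t-1} XOR Y_{t-1} is itself uniform and independent of the
# current inputs. The next state is W_{t+1} = X_t XOR Y_t.
joint = {(x, y, w, x ^ y): 1 / 8 for x, y, w in product((0, 1), repeat=3)}

# Transfer entropy T_{X -> W} = I(W_{t+1}; X_t | W_t) is exactly zero,
# even though X_t fully participates in setting W_{t+1}: the influence
# is synergistic (polyadic), so any pairwise conditional test misses it.
te = cond_mi(joint, t_idx=(3,), s_idx=(0,), c_idx=(2,))

# Conditioning on the other input as well exposes the full bit of
# influence: I(W_{t+1}; X_t | W_t, Y_t) = 1 bit.
ce = cond_mi(joint, t_idx=(3,), s_idx=(0,), c_idx=(2, 1))

print(f"T_X->W        = {te:.3f} bits")  # 0.000
print(f"I(W';X|W,Y)   = {ce:.3f} bits")  # 1.000
```

Because each input is pairwise independent of the output, the dyadic measure reports no flow at all, while the joint dependence carries a full bit — a minimal instance of the "underestimated influence" failure mode described above.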