February 07, 2013
Collins Conference Room
Eckehard Olbrich (Max Planck Institute for Mathematics in the Sciences)
Abstract. How can the information that a set of random variables carries about another random variable be decomposed? To what extent do different subgroups provide the same (i.e., shared or redundant) information, carry unique information, or interact to produce synergistic information?
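For two source variables, the structure in question is commonly written as the following system of identities (the SI/UI/CI notation is a standard convention from the partial information decomposition literature, not taken from this listing): the mutual information splits into shared, unique, and synergistic parts, and each single-source mutual information contains the shared part plus that source's unique part.

```latex
I(X_1, X_2; Y) = SI(Y; X_1, X_2) + UI(Y; X_1) + UI(Y; X_2) + CI(Y; X_1, X_2)
I(X_1; Y) = SI(Y; X_1, X_2) + UI(Y; X_1)
I(X_2; Y) = SI(Y; X_1, X_2) + UI(Y; X_2)
```

These three equations leave one degree of freedom, which is why fixing any one of the four terms (e.g., the shared information SI) determines the whole decomposition.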
Recently, Williams and Beer proposed such a decomposition based on natural properties for shared information. While these properties fix the structure of the decomposition, they do not uniquely determine the values of the individual terms. We therefore investigate additional properties such as strong symmetry and left monotonicity. We find that strong symmetry is incompatible with the properties proposed by Williams and Beer, and although left monotonicity is a very natural property for an information measure, it is not fulfilled by any of the proposed measures.
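As a concrete illustration of the Williams–Beer construction discussed above, the sketch below (my own, not from the talk) computes their redundancy measure I_min for the XOR gate with uniform inputs. This is the canonical example of pure synergy: each input alone carries zero information about the output, yet together they determine it completely.

```python
import math
from itertools import product

def xor_dist():
    """Joint distribution p(x1, x2, y) with uniform inputs and y = x1 XOR x2."""
    return {(x1, x2, x1 ^ x2): 0.25 for x1, x2 in product((0, 1), repeat=2)}

def marginal(joint, idxs):
    """Marginal distribution over the variables at positions `idxs`."""
    m = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in idxs)
        m[key] = m.get(key, 0.0) + p
    return m

def specific_info(joint, y, src):
    """Specific information I(Y=y; A) = sum_a p(a|y) log2( p(y|a) / p(y) )."""
    p_y = marginal(joint, (2,))[(y,)]
    p_a = marginal(joint, src)
    p_ay = marginal(joint, src + (2,))
    total = 0.0
    for a, pa in p_a.items():
        pay = p_ay.get(a + (y,), 0.0)
        if pay > 0:
            total += (pay / p_y) * math.log2((pay / pa) / p_y)
    return total

def i_min(joint, sources):
    """Williams-Beer redundancy: I_min = sum_y p(y) * min_A I(Y=y; A)."""
    return sum(p * min(specific_info(joint, y, src) for src in sources)
               for (y,), p in marginal(joint, (2,)).items())

joint = xor_dist()
shared = i_min(joint, [(0,), (1,)])   # redundancy of X1 and X2 about Y
total = i_min(joint, [(0, 1)])        # equals the full mutual information I(X1, X2; Y)
print(shared, total)                  # prints: 0.0 1.0 -- all of the 1 bit is synergy
```

The same routine applied to an AND gate yields the well-known value of roughly 0.311 bits of I_min redundancy even though the inputs are independent, which is exactly the kind of counterintuitive behavior the abstract alludes to.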
We also study a geometric framework for information decompositions and ask whether it is possible to represent shared information by a family of posterior distributions.
Finally, we draw connections to the notions of shared knowledge and common knowledge in game theory. While many people believe that independent variables cannot share information, we show that in game theory independent agents can have shared knowledge, but not common knowledge.
We conclude that intuition and heuristic arguments do not suffice when arguing about information. We expect that further progress requires a more precise, operational idea of what shared information should be.
Nils Bertschinger, Johannes Rauh, Eckehard Olbrich, and Jürgen Jost. "Shared Information — New Insights and Problems in Decomposing Information in Complex Systems." arXiv:1210.5902 (2012)
Paul L. Williams and Randall D. Beer. "Nonnegative Decomposition of Multivariate Information." arXiv:1004.2515 (2010)
Purpose: Research Collaboration
SFI Host: David Wolpert