Abstract: In the first part of my talk, I will review information-geometric structures and highlight the important role of divergences. I will present a novel approach to canonical divergences that extends the classical definition and recovers, in particular, the well-known Kullback-Leibler divergence and its relation to the Fisher-Rao metric and the Amari-Chentsov tensor.
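As a concrete reminder of the divergence mentioned above, the Kullback-Leibler divergence of two finite discrete distributions can be sketched in a few lines of Python (the function name and the example distributions are illustrative, not taken from the talk):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in nats for discrete
    distributions given as probability sequences over the same alphabet.

    Assumes q[i] > 0 wherever p[i] > 0; terms with p[i] = 0 contribute 0.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]
# The KL divergence is not symmetric: D(p||q) differs from D(q||p) in general.
print(kl_divergence(p, q))  # ≈ 0.5108
print(kl_divergence(q, p))  # ≈ 0.3681
```

The asymmetry shown in the usage lines is one reason a divergence is weaker than a metric, which is part of what makes the canonical-divergence construction non-trivial.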
Divergences also play an important role within a geometric approach to complexity. This approach is based on the general understanding that the complexity of a system can be quantified as the extent to which it is more than the sum of its parts. In the second part of my talk, I will motivate this approach and review the corresponding work.
1. N. Ay, S. I. Amari. A Novel Approach to Canonical Divergences within Information Geometry. Entropy (2015) 17: 8111-8129. doi:10.3390/e17127866.
2. N. Ay, J. Jost, H. V. Le, L. Schwachhöfer. Information geometry and sufficient statistics. Probability Theory and Related Fields (2015) 162: 327-364. doi:10.1007/s00440-014-0574-8.
3. N. Ay, J. Jost, H. V. Le, L. Schwachhöfer. Parametrized measure models. Bernoulli (2016), accepted. arXiv:1510.07305.
4. N. Ay, J. Jost, H. V. Le, L. Schwachhöfer. Information geometry. Ergebnisse der Mathematik und ihrer Grenzgebiete / A Series of Modern Surveys in Mathematics, Springer, 2017, forthcoming book.
5. N. Ay. Information Geometry on Complexity and Stochastic Interaction. Entropy (2015) 17(4): 2432-2458. doi:10.3390/e17042432.