Santa Fe Institute

SFI Working Paper Abstract

Title: Maximal Information Divergence from Statistical Models Defined by Neural Networks
Author(s): Guido Montúfar, Johannes Rauh, Nihat Ay
Paper #: 13-03-009
Date: March 13, 2013
Abstract: We review recent results on the maximal values of the Kullback-Leibler information divergence from statistical models defined by neural networks, including naïve Bayes models, restricted Boltzmann machines, deep belief networks, and various classes of exponential families. We illustrate approaches to computing the maximal divergence from a given model, starting from simple sub- or supermodels. We give a new result for deep and narrow belief networks with finite-valued units.
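For reference, the central quantity studied in this line of work is the divergence from a model, maximized over all distributions. A sketch of the standard definitions (the notation D_max here is illustrative, not taken from the paper): for distributions p, q on a finite set and a statistical model M,

\[
D(p \,\|\, q) = \sum_{x} p(x) \log \frac{p(x)}{q(x)},
\qquad
D(p \,\|\, \mathcal{M}) = \inf_{q \in \mathcal{M}} D(p \,\|\, q),
\qquad
D_{\max}(\mathcal{M}) = \max_{p} \, D(p \,\|\, \mathcal{M}).
\]

A small D_max certifies that the model can approximate every distribution well; the results reviewed in the paper bound this quantity for the listed network classes.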