In the present study, we aim to understand the properties that a recently formulated complexity measure reveals in stochastic networks in general. This new measure, which quantifies the notion of total information flow in a network of stochastically interacting units, can be viewed as a useful extension of the commonly used Kullback-Leibler divergence for characterizing the interdependencies among network units. Specifically, this complexity measure of network “integration” additionally captures the intrinsically temporal aspects of the interaction between units. Our approach is to study this measure, through both analytical and numerical calculations, in increasingly sophisticated networks: we will start with fully recurrent neural networks with sigmoidal activation functions, then move on to Hopfield networks, and finally to dynamically learning multilayer neural networks.
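As a hedged illustration of what such a temporal integration measure might look like (the talk does not specify the formula; the measure, network, weights, and all names below are our own assumptions), the following sketch computes one well-known candidate, the "stochastic interaction" SI = Σᵢ H(Xᵢ′ | Xᵢ) − H(X′ | X), for a two-unit recurrent binary network with sigmoidal units. SI is zero when each unit's future is fully predictable from its own past and grows as units exchange information over time:

```python
import itertools
import math

# Hypothetical example, not the speaker's formulation: "stochastic
# interaction" SI = sum_i H(X_i' | X_i) - H(X' | X) for a 2-unit
# binary network whose units update stochastically through sigmoids.

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

states = list(itertools.product([0, 1], repeat=2))  # joint states (x1, x2)

# Illustrative recurrent coupling: each unit noisily copies the other.
W = [[0.0, 2.0], [2.0, 0.0]]
b = [-1.0, -1.0]

def p_fire(unit, state):
    """P(unit is 1 at t+1 | joint state at t), via a sigmoid of its input."""
    drive = sum(W[unit][j] * state[j] for j in range(2)) + b[unit]
    return sigmoid(drive)

def transition(s, t):
    """Joint transition P(t | s); units are conditionally independent given s."""
    p = 1.0
    for i in range(2):
        pi = p_fire(i, s)
        p *= pi if t[i] == 1 else 1.0 - pi
    return p

# Stationary distribution over joint states, by power iteration.
pi_s = {s: 1.0 / len(states) for s in states}
for _ in range(1000):
    pi_s = {t: sum(pi_s[s] * transition(s, t) for s in states) for t in states}

def H_joint_cond():
    """H(X' | X): conditional entropy of the full next state, in bits."""
    h = 0.0
    for s in states:
        for t in states:
            p = transition(s, t)
            if p > 0:
                h -= pi_s[s] * p * math.log2(p)
    return h

def H_unit_cond(i):
    """H(X_i' | X_i): unit i viewed in isolation, ignoring the other unit."""
    joint = {(a, c): 0.0 for a in (0, 1) for c in (0, 1)}
    for s in states:
        for t in states:
            joint[(s[i], t[i])] += pi_s[s] * transition(s, t)
    marg = {a: joint[(a, 0)] + joint[(a, 1)] for a in (0, 1)}
    h = 0.0
    for (a, c), p in joint.items():
        if p > 0 and marg[a] > 0:
            h -= p * math.log2(p / marg[a])
    return h

stochastic_interaction = sum(H_unit_cond(i) for i in range(2)) - H_joint_cond()
print(f"stochastic interaction: {stochastic_interaction:.4f} bits")
```

Because each unit's next state here depends only on the *other* unit's current state, the isolated conditional entropies exceed the joint one and the measure comes out strictly positive, reflecting temporal information flow that a static (same-time) KL-based integration measure would not register.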
Noyce Conference Room
This event is by invitation only.