Nihat Ay, Thomas Wennekers

Paper #: 04-01-001

We extend Linsker's Infomax principle for feedforward neural networks to a measure of stochastic interdependence that captures spatial and temporal signal properties in recurrent systems. This measure – “stochastic interaction” – quantifies the Kullback-Leibler divergence of a Markov chain from a product of split chains for the single-unit processes. For unconstrained Markov chains, the maximization of stochastic interaction, also called “Temporal Infomax,” has previously been shown to result in almost deterministic dynamics. The present work considers Temporal Infomax on constrained Markov chains, where some of the units are clamped to prescribed stochastic processes that provide external input to the system. Surprisingly, Temporal Infomax in that case leads to finite state automata, either completely deterministic or at most weakly non-deterministic. Transitions between the internal states of these systems are almost perfectly predictable given the complete current state and the input, but the activity of each single unit alone is virtually random. The results are demonstrated by means of computer simulations and confirmed analytically. We furthermore relate them to experimental data concerning the correlation dynamics and functional connectivities observed in multiple-electrode recordings.
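As a rough formalization of the measure described above (the notation here is ours and may differ from the paper's): for a Markov chain $X_t = (X_{1,t}, \dots, X_{N,t})$ over $N$ units, the divergence of the full chain from the product of its split chains can be written as

$$ I(X) \;=\; D_{\mathrm{KL}}\!\left( P(X_{t+1} \mid X_t) \;\middle\|\; \prod_{v=1}^{N} P(X_{v,t+1} \mid X_{v,t}) \right), $$

averaged over the stationary distribution; equivalently, $I(X) = \sum_{v=1}^{N} H(X_{v,t+1} \mid X_{v,t}) - H(X_{t+1} \mid X_t)$, the gap between the summed conditional entropies of the single-unit split chains and the conditional entropy of the full chain. Temporal Infomax then maximizes $I(X)$ over the admissible transition kernels, with the clamped (input) units held to their prescribed processes.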
