Quantifying and Maximizing the Information Flux in Recurrent Neural Networks

Metzner C, Yamakou M, Voelkl D, Schilling A, Krauß P (2024)


Publication Language: English

Publication Type: Journal article

Publication year: 2024

Journal: Neural Computation

Volume: 36

Pages: 351-384

Issue: 3

DOI: 10.1162/neco_a_01651

Abstract

Free-running recurrent neural networks (RNNs), especially probabilistic models, generate an ongoing information flux that can be quantified with the mutual information I between subsequent system states. Although previous studies have shown that I depends on the statistics of the network's connection weights, it is unclear how to maximize I systematically and how to quantify the flux in large systems where computing the mutual information becomes intractable. Here, we address these questions using Boltzmann machines as model systems. We find that in networks with moderately strong connections, the mutual information I is approximately a monotonic transformation of the root-mean-square averaged Pearson correlations between neuron pairs, a quantity that can be efficiently computed even in large systems. Furthermore, evolutionary maximization of I reveals a general design principle for the weight matrices, enabling the systematic construction of systems with a high spontaneous information flux. Finally, we simultaneously maximize information flux and the mean period length of cyclic attractors in the state space of these dynamical networks. Our results are potentially useful for the construction of RNNs that serve as short-time memories or pattern generators.
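The abstract's proxy quantity, the root-mean-square average of pairwise Pearson correlations, is straightforward to compute from a recorded state time series. The sketch below is a minimal illustration, not the authors' implementation: the network model, the weight scale, and the simulation length are assumptions chosen only so that a free-running probabilistic (Boltzmann-machine-like) network produces a usable state trajectory.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical small probabilistic RNN with sigmoidal stochastic units
# (a rough stand-in for the Boltzmann machines used in the paper).
N, T = 20, 5000                          # neurons, time steps (assumed)
W = rng.normal(0.0, 0.5, size=(N, N))    # assumed "moderately strong" weights
s = rng.integers(0, 2, size=N).astype(float)

states = np.empty((T, N))
for t in range(T):
    # Each unit fires with probability sigmoid(weighted input of +/-1 states).
    p = 1.0 / (1.0 + np.exp(-W @ (2.0 * s - 1.0)))
    s = (rng.random(N) < p).astype(float)
    states[t] = s

# RMS-averaged Pearson correlation over all distinct neuron pairs:
# cheap to evaluate even when the full mutual information is intractable.
C = np.corrcoef(states.T)        # N x N matrix of pairwise Pearson correlations
iu = np.triu_indices(N, k=1)     # indices of the distinct (i < j) pairs
c_rms = float(np.sqrt(np.mean(C[iu] ** 2)))
print(c_rms)
```

Because this statistic only needs the N x N correlation matrix of the recorded states, its cost scales with the number of neuron pairs rather than with the exponentially large state space that an exact mutual-information estimate would require.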

How to cite

APA:

Metzner, C., Yamakou, M., Voelkl, D., Schilling, A., & Krauß, P. (2024). Quantifying and Maximizing the Information Flux in Recurrent Neural Networks. Neural Computation, 36(3), 351-384. https://doi.org/10.1162/neco_a_01651

MLA:

Metzner, Claus, et al. "Quantifying and Maximizing the Information Flux in Recurrent Neural Networks." Neural Computation 36.3 (2024): 351-384.
