Abstract
In neural circuits, synaptic strengths influence neuronal activity by shaping network dynamics, and neuronal activity influences synaptic strengths through activity-dependent plasticity. Motivated by this fact, we study a recurrent-network model in which neuronal units and synaptic couplings are interacting dynamic variables, with couplings subject to Hebbian modification with decay around quenched random strengths. Rather than assigning a specific role to the plasticity, we use dynamical mean-field theory and other techniques to systematically characterize the neuronal-synaptic dynamics, revealing a rich phase diagram. Adding Hebbian plasticity slows activity in already chaotic networks and can induce chaos in otherwise quiescent networks. Anti-Hebbian plasticity quickens activity and produces an oscillatory component. Analysis of the Jacobian shows that Hebbian and anti-Hebbian plasticity push locally unstable modes toward the real and imaginary axes, respectively, explaining these behaviors. Both random-matrix and Lyapunov analysis show that strong Hebbian plasticity segregates network timescales into two bands, with a slow, synapse-dominated band driving the dynamics, suggesting a flipped view of the network as synapses connected by neurons. For increasing strength, Hebbian plasticity initially raises the complexity of the dynamics, measured by the maximum Lyapunov exponent and attractor dimension, but then decreases these metrics, likely due to the proliferation of stable fixed points. We compute the marginally stable spectra of such fixed points as well as their number, showing exponential growth with network size. Finally, in chaotic states with strong Hebbian plasticity, a stable fixed point of neuronal dynamics is destabilized by synaptic dynamics, allowing any neuronal state to be stored as a stable fixed point by halting the plasticity. This phase of freezable chaos offers a new mechanism for working memory.
- Received 9 February 2023
- Revised 8 October 2023
- Accepted 9 January 2024
DOI: https://doi.org/10.1103/PhysRevX.14.021001
Published by the American Physical Society under the terms of the Creative Commons Attribution 4.0 International license. Further distribution of this work must maintain attribution to the author(s) and the published article’s title, journal citation, and DOI.
Viewpoint
The Neuron vs the Synapse: Which One Is in the Driving Seat?
Published 1 April 2024
A new theoretical framework for plastic neural networks predicts dynamical regimes where synapses rather than neurons primarily drive the network’s behavior, leading to an alternative candidate mechanism for working memory in the brain.
Popular Summary
Neural circuits typically involve many neurons communicating through synaptic connections. Tools from the physics of disordered systems have revealed how the dynamics of these networks emerges from the structure of fixed synaptic connections. However, connections between neurons are themselves dynamic; the strength of a synapse can change due to the activities of the two neurons that it connects. Thus, neurons and synapses are coupled in an intricate dance: Synapses influence neurons by shaping network dynamics, and neurons influence synapses through activity-dependent plasticity. Understanding this interplay is a major unsolved problem in theoretical neuroscience. To that end, we develop a theory describing the coupled neuronal-synaptic dynamics, thereby providing a foundation for understanding how these dynamics could underlie computation.
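To make the coupled neuronal-synaptic dynamics concrete, here is a minimal simulation sketch. It is an illustrative rate model, not the paper's exact equations: the specific forms of the neuronal and plasticity equations, and the parameters `N`, `g`, `kappa`, and `tau_s`, are assumptions chosen to match the verbal description (quenched random couplings plus a plastic component undergoing Hebbian modification with decay).

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200        # number of neurons (illustrative choice)
g = 1.5        # gain of quenched random couplings (g > 1: chaotic without plasticity)
kappa = 1.0    # Hebbian amplitude (assumed; a negative value would be anti-Hebbian)
tau_s = 10.0   # synaptic timescale, in units of the neuronal timescale
dt = 0.05
steps = 2000

phi = np.tanh
J0 = rng.normal(0.0, g / np.sqrt(N), size=(N, N))  # quenched random strengths
A = np.zeros((N, N))                               # plastic part, decays toward zero
x = rng.normal(0.0, 1.0, size=N)                   # neuronal state

for _ in range(steps):
    r = phi(x)
    # neuronal dynamics driven by the total couplings J0 + A
    x += dt * (-x + (J0 + A) @ r)
    # Hebbian modification with decay around the quenched strengths
    A += (dt / tau_s) * (-A + (kappa / N) * np.outer(r, r))
```

The key structural point is that the synaptic variables `A` are state variables integrated alongside `x`, rather than fixed parameters, so the two systems shape each other continuously.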
In this paper, we analyze a mathematical model of a neural network in which neurons and synapses are mutually coupled dynamic variables, and we develop our theory in the limit of large network size. We find that this system exhibits a variety of new dynamic phases and unexpected phenomena. Most notably, we discover an effect that we call “freezable chaos,” which allows the state of the network to be frozen in time by abruptly disabling synaptic plasticity.
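The freezing protocol can be sketched as follows, again using an illustrative rate model rather than the paper's exact equations; the parameter `kappa = 4.0` is an assumed stand-in for the strong-Hebbian regime in which freezable chaos is reported, and whether the frozen network actually settles onto a fixed point depends on parameters not specified in this summary.

```python
import numpy as np

rng = np.random.default_rng(1)

N = 200
g = 1.5
kappa = 4.0    # assumed strong Hebbian plasticity (freezable-chaos regime)
tau_s = 10.0
dt = 0.05
phi = np.tanh

J0 = rng.normal(0.0, g / np.sqrt(N), size=(N, N))
A = np.zeros((N, N))
x = rng.normal(0.0, 1.0, size=N)

def step(x, A, plastic):
    r = phi(x)
    x = x + dt * (-x + (J0 + A) @ r)
    if plastic:
        A = A + (dt / tau_s) * (-A + (kappa / N) * np.outer(r, r))
    return x, A

# chaotic phase with plasticity on
for _ in range(2000):
    x, A = step(x, A, plastic=True)

A_frozen = A.copy()
speed_at_freeze = np.linalg.norm(-x + (J0 + A) @ phi(x))

# plasticity abruptly disabled: A is held fixed, only x evolves
for _ in range(2000):
    x, A = step(x, A, plastic=False)

speed_after = np.linalg.norm(-x + (J0 + A) @ phi(x))
```

In the freezable-chaos phase described in the paper, the instantaneous couplings at the moment of freezing admit a nearby stable fixed point, so `speed_after` would shrink toward zero as the network settles, storing the frozen state as a memory.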
Our theory provides a basis for future theoretical and experimental studies elucidating the role of coupled neuronal-synaptic dynamics in cognition, sensation, and behavior. Several studies have proposed that machine-learning systems become more powerful when their parameters exhibit dynamics analogous to synaptic plasticity, and our theory sheds light on the functioning of these systems as well.