Low-Dimensional Manifolds Support Multiplexed Integrations in Recurrent Neural Networks
Neural Computation (IF 2.7), Pub Date: 2021-01-29, DOI: 10.1162/neco_a_01366
Arnaud Fanthomme, Rémi Monasson

We study the learning dynamics and the representations emerging in recurrent neural networks (RNNs) trained to integrate one or multiple temporal signals. Combining analytical and numerical investigations, we characterize the conditions under which an RNN with n neurons learns to integrate D (≪ n) scalar signals of arbitrary duration. We show, for linear, ReLU, and sigmoidal neurons, that the internal state lives close to a D-dimensional manifold, whose shape is related to the activation function. Each neuron therefore carries, to various degrees, information about the value of all integrals. We discuss the deep analogy between our results and the concept of mixed selectivity forged by computational neuroscientists to interpret cortical recordings.




Updated: 2021-01-31