A Robust Model of Gated Working Memory
Neural Computation (IF 2.7), Pub Date: 2020-01-01, DOI: 10.1162/neco_a_01249
Anthony Strock, Xavier Hinaut, Nicolas P. Rougier

Gated working memory is defined as the capacity to hold arbitrary information at any time so that it can be used at a later time. Based on electrophysiological recordings, several computational models have tackled the problem using dedicated and explicit mechanisms. We propose instead to consider an implicit mechanism based on a random recurrent neural network. We introduce a robust yet simple reservoir model of gated working memory with instantaneous updates. The model is able to store an arbitrary real value, presented at a random time, over an extended period. The dynamics of the model form a line attractor that learns to exploit reentry and a nonlinearity during a training phase that uses only a few representative values. A deeper study of the model shows that the results hold over a large range of hyperparameters (e.g., number of neurons, sparsity, global weight scaling), such that any large enough population, mixing excitatory and inhibitory neurons, can quickly learn to realize such gated working memory. In a nutshell, with a minimal set of hypotheses, we show that we can build a robust model of working memory. This suggests that gated working memory could be an implicit property of any random population, one that can be acquired through learning. Furthermore, considering working memory to be a physically open but functionally closed system, we account for some counterintuitive electrophysiological recordings.
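To make the setup concrete, here is a minimal sketch in Python/NumPy of a gated working-memory reservoir of the kind the abstract describes: a random recurrent network driven by a value channel and a gate channel, with the linear readout fed back into the reservoir (reentry) and trained offline by ridge regression on the task of reporting the last value seen while the gate was active. All sizes, scalings, gate statistics, and the regularization constant below are illustrative assumptions, not the paper's actual settings.

```python
import numpy as np

rng = np.random.default_rng(0)

n_res = 300        # reservoir size (assumed)
sparsity = 0.5     # fraction of nonzero recurrent connections (assumed)

# Random sparse recurrent weights, rescaled to unit spectral radius (assumed).
W = rng.uniform(-1, 1, (n_res, n_res)) * (rng.random((n_res, n_res)) < sparsity)
W /= max(abs(np.linalg.eigvals(W)))

# Input weights for the value and gate channels, and feedback weights (reentry).
W_in = rng.uniform(-1, 1, (n_res, 2))
W_fb = rng.uniform(-1, 1, n_res)

def make_trial(T, p_gate=0.01):
    """Random input value, sparse gate events, and the desired memory content."""
    value = rng.uniform(-1, 1, T)
    gate = (rng.random(T) < p_gate).astype(float)
    gate[0] = 1.0
    target, mem = np.empty(T), 0.0
    for t in range(T):
        if gate[t] > 0.5:
            mem = value[t]        # memory is updated only when the gate is on
        target[t] = mem
    return value, gate, target

def run(value, gate, teacher=None, W_out=None):
    """Drive the reservoir with instantaneous (tanh) updates.
    With `teacher`, the desired output is fed back (teacher forcing);
    otherwise the readout's own prediction is fed back (closed loop)."""
    states = np.zeros((len(value), n_res))
    x, fb = np.zeros(n_res), 0.0
    for t in range(len(value)):
        u = np.array([value[t], gate[t]])
        x = np.tanh(W @ x + W_in @ u + W_fb * fb)
        states[t] = x
        fb = teacher[t] if teacher is not None else float(x @ W_out)
    return states

# Train the linear readout offline with ridge regression (assumed regularizer).
value, gate, target = make_trial(5000)
S = run(value, gate, teacher=target)
W_out = np.linalg.solve(S.T @ S + 1e-6 * np.eye(n_res), S.T @ target)

# Closed-loop test on a fresh sequence: between gate events the stored value
# must be maintained by the reservoir dynamics alone.
v, g, y = make_trial(2000)
pred = run(v, g, W_out=W_out) @ W_out
print("closed-loop test RMSE:", float(np.sqrt(np.mean((pred - y) ** 2))))
```

During training the desired memory content is fed back (teacher forcing); at test time the loop is closed, so maintaining the stored value between gate events relies on the trained reservoir's own line-attractor-like dynamics, which is the behavior the abstract attributes to the model.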

Updated: 2020-01-01