Online Memorization of Random Firing Sequences by a Recurrent Neural Network
arXiv - CS - Neural and Evolutionary Computing. Pub Date: 2020-01-09, DOI: arXiv:2001.02920
Patrick Murer and Hans-Andrea Loeliger

This paper studies the capability of a recurrent neural network model to memorize random dynamical firing patterns by a simple local learning rule. Two modes of learning/memorization are considered: The first mode is strictly online, with a single pass through the data, while the second mode uses multiple passes through the data. In both modes, the learning is strictly local (quasi-Hebbian): At any given time step, only the weights between the neurons firing (or supposed to be firing) at the previous time step and those firing (or supposed to be firing) at the present time step are modified. The main result of the paper is an upper bound on the probability that the single-pass memorization is not perfect. It follows that the memorization capacity in this mode asymptotically scales like that of the classical Hopfield model (which, in contrast, memorizes static patterns). However, multiple-rounds memorization is shown to achieve a higher capacity (with a nonvanishing number of bits per connection/synapse). These mathematical findings may be helpful for understanding the functions of short-term memory and long-term memory in neuroscience.
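The locality described above — at each time step, only the weights between the neurons firing at the previous step and those firing at the current step are touched — can be illustrated with a minimal sketch. This is not the paper's exact rule: the network size, sparsity, the 0/1 coding, and the Willshaw-style clipped-Hebbian update and all-inputs-active replay threshold are illustrative assumptions chosen so the sketch runs at a light memory load.

```python
import numpy as np

rng = np.random.default_rng(1)

N = 300   # neurons (illustrative size)
T = 20    # length of the random firing sequence
k = 15    # neurons firing per time step (sparse 0/1 coding)

# Random firing sequence: x[t] is a 0/1 vector with exactly k ones.
x = np.zeros((T, N))
for t in range(T):
    x[t, rng.choice(N, size=k, replace=False)] = 1.0

# Single online pass with a strictly local, Hebbian-style rule:
# W[i, j] is modified only when neuron j fires at t-1 AND neuron i
# fires at t (here clipped to binary, Willshaw-style).
W = np.zeros((N, N))
for t in range(1, T):
    W = np.maximum(W, np.outer(x[t], x[t - 1]))

# Replay: seed with the first pattern; a neuron fires at step t if
# all k neurons active at t-1 project to it with a stored weight.
s = x[0].copy()
replay = [s]
for t in range(1, T):
    s = (W @ s >= k).astype(float)
    replay.append(s)

# Fraction of bits of the target sequence reproduced during replay;
# at this light load, replay is correct with high probability.
accuracy = np.mean(np.array(replay) == x)
```

With the load kept well below capacity, spurious firings are vanishingly unlikely, so the single pass suffices; the paper's bound quantifies how the failure probability behaves as the load grows.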

Updated: 2020-01-10