Recurrent spiking neural network with dynamic presynaptic currents based on backpropagation
International Journal of Intelligent Systems (IF 5.0), Pub Date: 2021-12-01, DOI: 10.1002/int.22772
Zijian Wang, Yanting Zhang, Haibo Shi, Lei Cao, Cairong Yan, Guangwei Xu

In recent years, spiking neural networks (SNNs), which originate from the theoretical foundations of neuroscience, have attracted considerable attention in neuromorphic and brain-like computing because their neural dynamics and coding mechanisms resemble those of biological neurons. SNNs have become one of the mainstream frameworks in the field of brain-like computing. However, most Leaky Integrate-and-Fire (LIF) neuron models currently used in SNNs trained directly with backpropagation (BP) do not account for recurrent connections or for the strength of neuronal connections changing dynamically over time. This study presents an LIF neuron model with recurrent connections and a method for dynamically changing the presynaptic currents. Compared with classic LIF neurons, recurrent LIF neurons have an additional cyclic connection: their postsynaptic current stimulates a change in the membrane potential at the next time step, making their dynamics more similar to the activity of biological neurons. We also propose an efficient and flexible BP training method for recurrent LIF neurons. Building on these methods, we propose the recurrent SNN with dynamic presynaptic currents based on backpropagation (RDS-BP). We test RDS-BP on three image data sets (MNIST, Fashion-MNIST, and CIFAR-10) and two text data sets (IMDB and TREC). The results show that RDS-BP outperforms not only naive BP-based SNN models but also high-performing SNN methods proposed in recent studies. Our work provides a new LIF neuron model with a recurrent connection and a dynamic presynaptic current, together with a BP training scheme for the proposed neuron, which could benefit developments in neuromorphic and brain-like computing.
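To make the described dynamics concrete, the following Python sketch simulates a discrete-time recurrent LIF layer in which a leaky presynaptic current collects feedforward input plus the layer's own spikes from the previous step and drives the membrane potential. It is a minimal reading of the abstract, not the authors' RDS-BP formulation: the function recurrent_lif_forward, the decay constants alpha and beta, and the hard-reset rule are illustrative assumptions.

import numpy as np

def recurrent_lif_forward(x, w_in, w_rec, alpha=0.9, beta=0.8, v_th=1.0):
    """Simulate T time steps of a recurrent LIF layer (illustrative only).

    x      : (T, n_in)  input spike trains
    w_in   : (n_in, n)  feedforward weights
    w_rec  : (n, n)     recurrent weights (the extra cyclic connection)
    alpha  : decay of the presynaptic current (its dynamic time course)
    beta   : decay of the membrane potential
    """
    T, n = x.shape[0], w_in.shape[1]
    i_syn = np.zeros(n)          # presynaptic current, decays over time
    v = np.zeros(n)              # membrane potential
    s = np.zeros(n)              # spikes emitted at the previous step
    spikes = np.zeros((T, n))
    for t in range(T):
        # Dynamic presynaptic current: leaky trace of the feedforward input
        # plus the recurrent contribution from the layer's own last spikes.
        i_syn = alpha * i_syn + x[t] @ w_in + s @ w_rec
        # Leaky integration; the (1 - s) factor hard-resets neurons that fired.
        v = beta * v * (1.0 - s) + i_syn
        # Fire when the membrane potential crosses the threshold.
        s = (v >= v_th).astype(float)
        spikes[t] = s
    return spikes

# Usage: 50 time steps, 10 input channels, 4 recurrent LIF neurons.
rng = np.random.default_rng(0)
x = (rng.random((50, 10)) < 0.2).astype(float)
w_in = 0.5 * rng.standard_normal((10, 4))
w_rec = 0.1 * rng.standard_normal((4, 4))
out = recurrent_lif_forward(x, w_in, w_rec)
print(out.sum(axis=0))  # spike counts per neuron

In a BP-trained SNN of this kind, the non-differentiable threshold step would typically be handled with a surrogate gradient during the backward pass; the sketch above shows the forward dynamics only.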

Updated: 2021-12-01