Deep learning incorporating biologically inspired neural dynamics and in-memory computing
Nature Machine Intelligence (IF 23.8), Pub Date: 2020-06-15, DOI: 10.1038/s42256-020-0187-0
Stanisław Woźniak, Angeliki Pantazi, Thomas Bohnstingl, Evangelos Eleftheriou

Spiking neural networks (SNNs) incorporating biologically plausible neurons hold great promise because of their unique temporal dynamics and energy efficiency. However, SNNs have developed separately from artificial neural networks (ANNs), limiting the impact of deep learning advances for SNNs. Here, we present an alternative perspective of the spiking neuron that incorporates its neural dynamics into a recurrent ANN unit called a spiking neural unit (SNU). SNUs may operate as SNNs, using a step function activation, or as ANNs, using continuous activations. We demonstrate the advantages of SNU dynamics through simulations on multiple tasks and obtain accuracies comparable to, or better than, those of ANNs. The SNU concept enables an efficient implementation with in-memory acceleration for both training and inference. We experimentally demonstrate its efficacy for a music-prediction task in an in-memory-based SNN accelerator prototype using 52,800 phase-change memory devices. Our results open up an avenue for broad adoption of biologically inspired neural dynamics in challenging applications and acceleration with neuromorphic hardware.
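As a rough illustration of the SNU idea described in the abstract, the sketch below implements a single recurrent unit whose internal state integrates weighted inputs, decays over time, and is reset after the unit emits an output, with either a step-function activation (SNN mode) or a sigmoid activation (ANN mode). This is a minimal sketch based only on the abstract's description: the exact update rule, the decay parameter tau, and the function name snu_step are illustrative assumptions, not the paper's precise formulation.

import numpy as np

def snu_step(x_t, s_prev, y_prev, W, b, tau=0.8, soft=False):
    """One time step of a spiking-neural-unit-style recurrent cell (illustrative)."""
    # Leaky integration of the weighted input; units that fired at the
    # previous step (y_prev = 1) have their state reset before decaying.
    s_t = np.maximum(W @ x_t + tau * s_prev * (1.0 - y_prev), 0.0)
    if soft:
        # Continuous (sigmoid) activation: the unit behaves like an ANN cell.
        y_t = 1.0 / (1.0 + np.exp(-(s_t + b)))
    else:
        # Step-function activation: the unit emits binary spikes (SNN mode).
        y_t = (s_t + b > 0.0).astype(x_t.dtype)
    return s_t, y_t

# Hypothetical usage: unroll the unit over a short random input sequence.
rng = np.random.default_rng(0)
n_in, n_out, T = 4, 3, 10
W = rng.normal(scale=0.5, size=(n_out, n_in))
b = -0.3 * np.ones(n_out)
s = np.zeros(n_out)
y = np.zeros(n_out)
for t in range(T):
    x = rng.random(n_in)
    s, y = snu_step(x, s, y, W, b)

Because the state update is an ordinary recurrent computation, swapping the step function for a continuous activation makes the unit differentiable, which is what allows the same cell to be trained with standard deep learning machinery while retaining spiking dynamics at inference time.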



Updated: 2020-06-15