Layer-skipping connections improve the effectiveness of equilibrium propagation on layered networks
Frontiers in Computational Neuroscience (IF 2.1), Pub Date: 2021-03-30, DOI: 10.3389/fncom.2021.627357
Jimmy Gammell, Sonia Buckley, Sae Woo Nam, Adam N. McCaughan

Equilibrium propagation is a learning framework that marks a step forward in the search for a biologically plausible implementation of deep learning, and could be implemented efficiently in neuromorphic hardware. Previous applications of this framework to layered networks encountered a vanishing gradient problem that has not yet been solved in a simple, biologically plausible way. In this paper, we demonstrate that the vanishing gradient problem can be mitigated by replacing some of a layered network's connections with random layer-skipping connections, in a manner inspired by small-world networks. This approach would be convenient to implement in neuromorphic hardware, and is biologically plausible.
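The connectivity change described above can be sketched in code. The following is a minimal illustration, not the authors' exact procedure: it builds the edge list of a fully connected layered network, then rewires a fraction `p` of the edges into random layer-skipping connections in the spirit of Watts-Strogatz small-world rewiring. All function names are hypothetical.

```python
import random

def layered_edges(layer_sizes):
    """Edges of a fully connected layered network.

    Neurons are numbered consecutively across layers; returns the
    (pre, post) pairs between adjacent layers, plus the index offset
    of each layer.
    """
    offsets = [0]
    for n in layer_sizes:
        offsets.append(offsets[-1] + n)
    edges = []
    for l in range(len(layer_sizes) - 1):
        for i in range(offsets[l], offsets[l + 1]):
            for j in range(offsets[l + 1], offsets[l + 2]):
                edges.append((i, j))
    return edges, offsets

def rewire_layer_skipping(edges, offsets, p, rng=random):
    """Replace a fraction p of edges with random layer-skipping ones.

    Illustrative Watts-Strogatz-style rewiring: each surviving edge is
    kept with probability 1 - p; otherwise it is replaced by a random
    connection between two non-adjacent layers (skipping at least one
    layer), which gives distant layers short connection paths.
    """
    n_neurons = offsets[-1]

    def layer_of(idx):
        return next(l for l in range(len(offsets) - 1)
                    if offsets[l] <= idx < offsets[l + 1])

    new_edges = []
    for (i, j) in edges:
        if rng.random() < p:
            # Draw random endpoints until their layers are non-adjacent,
            # so the new connection skips at least one layer.
            while True:
                a, b = rng.randrange(n_neurons), rng.randrange(n_neurons)
                if abs(layer_of(a) - layer_of(b)) > 1:
                    new_edges.append((min(a, b), max(a, b)))
                    break
        else:
            new_edges.append((i, j))
    return new_edges
```

With `p = 0` the network stays strictly layered; as `p` grows, more connections jump across layers, which is the property the paper credits with mitigating the vanishing gradient under equilibrium propagation.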

Updated: 2021-03-30