Learning in the machine: To share or not to share?
Neural Networks (IF 7.8) Pub Date: 2020-03-25, DOI: 10.1016/j.neunet.2020.03.016
Jordan Ott, Erik Linstead, Nicholas LaHaye, Pierre Baldi

Weight-sharing is one of the pillars behind Convolutional Neural Networks and their successes. However, in physical neural systems such as the brain, weight-sharing is implausible. This discrepancy raises the fundamental question of whether weight-sharing is necessary. If so, to what degree of precision? If not, what are the alternatives? The goal of this study is to investigate these questions, primarily through simulations where the weight-sharing assumption is relaxed. Taking inspiration from neural circuitry, we explore the use of Free Convolutional Networks and neurons with variable connection patterns. Using Free Convolutional Networks, we show that while weight-sharing is a pragmatic optimization approach, it is not a necessity in computer vision applications. Furthermore, Free Convolutional Networks match the performance observed in standard architectures when trained using properly translated data (akin to video). Under the assumption of translationally augmented data, Free Convolutional Networks learn translationally invariant representations that yield an approximate form of weight-sharing.
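The abstract does not spell out the implementation, but the core idea of a convolution-shaped layer without weight-sharing can be sketched as a locally connected layer: same receptive-field geometry as a convolution, yet every output location owns its own filter. The sketch below is an illustrative PyTorch reading of that idea, assuming a minimal setting; the class name LocallyConnected2d, its parameterization, and the initialization are assumptions, not the paper's code.

```python
import torch
import torch.nn as nn

class LocallyConnected2d(nn.Module):
    """Convolution-like layer WITHOUT weight-sharing: every output
    spatial location has its own filter (one reading of a 'free'
    convolution). Hypothetical sketch, not the paper's implementation."""

    def __init__(self, in_ch, out_ch, in_size, kernel_size, stride=1):
        super().__init__()
        h, w = in_size
        k = kernel_size
        self.out_h = (h - k) // stride + 1
        self.out_w = (w - k) // stride + 1
        n_loc = self.out_h * self.out_w
        # Extracts k x k patches at each output location.
        self.unfold = nn.Unfold(kernel_size=k, stride=stride)
        # One independent filter per location: (locations, out_ch, in_ch*k*k).
        # A shared-weight conv would store just (out_ch, in_ch*k*k).
        self.weight = nn.Parameter(
            torch.randn(n_loc, out_ch, in_ch * k * k) * 0.01)
        self.bias = nn.Parameter(torch.zeros(n_loc, out_ch))

    def forward(self, x):
        # x: (B, in_ch, H, W) -> patches: (B, locations, in_ch*k*k)
        patches = self.unfold(x).transpose(1, 2)
        # Per-location matmul: each location applies its own filter bank.
        out = torch.einsum('blf,lof->blo', patches, self.weight) + self.bias
        # Back to a feature map: (B, out_ch, out_h, out_w)
        return out.permute(0, 2, 1).reshape(
            x.size(0), -1, self.out_h, self.out_w)

# Quick shape check (hypothetical sizes):
layer = LocallyConnected2d(in_ch=3, out_ch=8, in_size=(32, 32), kernel_size=5)
out = layer(torch.randn(2, 3, 32, 32))   # -> (2, 8, 28, 28)
```

Compared with a standard nn.Conv2d of the same geometry, this layer multiplies the parameter count by the number of output locations, which is one way to read the abstract's point that weight-sharing is a pragmatic optimization rather than a necessity: the free layer can, in principle, learn an approximately shared solution when trained on translationally augmented data.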

Updated: 2020-03-26