Neuromorphic Systems Design by Matching Inductive Biases to Hardware Constraints
Frontiers in Neuroscience (IF 3.2), Pub Date: 2020-05-28, DOI: 10.3389/fnins.2020.00437
Lorenz K. Muller, Pascal Stark, Bert Jan Offrein, Stefan Abel

Neuromorphic systems are designed with careful consideration of the physical properties of the computational substrate they use. Neuromorphic engineers often exploit physical phenomena to directly implement a desired functionality, enabled by “the isomorphism between physical processes in different media” (Douglas et al., 1995). This bottom-up design methodology could be described as matching computational primitives to physical phenomena. In this paper, we propose a top-down counterpart to the bottom-up approach to neuromorphic design. Our top-down approach, termed “bias matching,” is to match the inductive biases required in a learning system to the hardware constraints of its implementation; a well-known example is enforcing translation equivariance in a neural network by tying weights (replacing vector-matrix multiplications with convolutions), which reduces memory requirements. We give numerous examples from the literature and explain how they can be understood from this perspective. Furthermore, we propose novel network designs based on this approach in the context of collaborative filtering. Our simulation results underline our central conclusions: additional hardware constraints can improve the predictions of a Machine Learning system, and understanding the inductive biases that underlie these performance gains can be useful in finding applications for a given constraint.
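The abstract's example of bias matching is weight tying: enforcing translation equivariance replaces a dense vector-matrix multiply with a convolution, cutting parameter memory from O(n²) to O(k) for kernel size k. A minimal numpy sketch (illustrative only, not code from the paper) makes the equivalence concrete by building the tied-weight Toeplitz matrix explicitly:

```python
import numpy as np

# Sketch: a 1D "valid" convolution is a vector-matrix multiply whose weight
# matrix has tied (shared, shifted) rows. The tied version stores only the
# kernel, not the full matrix.

n, k = 8, 3                          # input length, kernel size (illustrative)
rng = np.random.default_rng(0)
kernel = rng.standard_normal(k)
x = rng.standard_normal(n)

# Untied equivalent: Toeplitz matrix whose row i holds the kernel at offset i.
W = np.zeros((n - k + 1, n))
for i in range(n - k + 1):
    W[i, i:i + k] = kernel

y_dense = W @ x                                       # dense layer view
y_conv = np.convolve(x, kernel[::-1], mode="valid")   # tied-weight view
                                                      # (reversed kernel makes
                                                      # convolve compute the
                                                      # same cross-correlation)

assert np.allclose(y_dense, y_conv)
print(W.size, kernel.size)  # parameters stored: untied 48 vs tied 3
```

The hardware reading is the one the paper proposes: a substrate that cannot afford O(n²) weight memory imposes a constraint, and translation equivariance is the inductive bias that this constraint happens to implement.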
