A Weight Importance Analysis Technique for Area- and Power-Efficient Binary Weight Neural Network Processor Design
Cognitive Computation (IF 5.4) Pub Date: 2021-01-04, DOI: 10.1007/s12559-020-09794-6
Yin Wang, Yuxiang Xie, Jiayan Gan, Liang Chang, Chunbo Luo, Jun Zhou

Recently, binary weight neural network (BWNN) processor design has attracted considerable attention due to its low computational complexity and memory demand. In BWNN processor design, emerging memory technologies such as RRAM can replace conventional SRAM to save area and access power. However, RRAM is prone to bit errors, which reduce classification accuracy. Combining BWNN and RRAM to reduce area overhead and power consumption while maintaining high classification accuracy therefore remains a significant research challenge. In this work, we propose an automatic weight importance analysis technique and a mixed weight storage scheme to address this issue. For demonstration, we applied the proposed techniques to two typical BWNNs. The experimental results show that more than 78% (40%) area saving and 57% (30%) power saving can be achieved with less than 1% accuracy loss. The proposed techniques are applicable to resource- and power-constrained neural network processor design and show significant potential for AI-based Internet-of-Things (IoT) devices, which typically have limited computational and storage resources.
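The abstract does not detail how weight importance is computed, so the following is only a minimal sketch of the general idea: rank each binary weight by an importance proxy (here, hypothetically, the gradient magnitude of the loss with respect to that weight) and store the top fraction in error-free SRAM while relegating the rest to error-prone RRAM. The function names, the gradient-based proxy, and the SRAM fraction are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def weight_importance(grads):
    # Assumed importance proxy: magnitude of the loss gradient
    # w.r.t. each binary weight (larger |grad| -> more sensitive).
    return np.abs(grads)

def mixed_storage_split(importance, sram_fraction=0.25):
    # Map the most important weights to reliable SRAM (True) and
    # the remainder to area/power-efficient RRAM (False).
    n = importance.size
    n_sram = int(np.ceil(sram_fraction * n))
    order = np.argsort(importance.ravel())[::-1]  # descending importance
    mask = np.zeros(n, dtype=bool)
    mask[order[:n_sram]] = True
    return mask.reshape(importance.shape)

# Toy usage: 16 binary weights, 25% budgeted to SRAM.
rng = np.random.default_rng(0)
weights = np.sign(rng.standard_normal((4, 4)))   # binary {-1, +1} weights
grads = rng.standard_normal((4, 4))              # hypothetical gradients
mask = mixed_storage_split(weight_importance(grads), sram_fraction=0.25)
```

With a 25% SRAM budget over 16 weights, exactly 4 entries of `mask` are True; those weights would be kept in SRAM while the other 12 tolerate RRAM bit errors.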




Updated: 2021-01-05