Bit Error Robustness for Energy-Efficient DNN Accelerators
arXiv - CS - Hardware Architecture. Pub Date: 2020-06-24. arXiv:2006.13977
David Stutz, Nandhini Chandramoorthy, Matthias Hein, Bernt Schiele

Deep neural network (DNN) accelerators have received considerable attention in recent years due to the energy they save compared to mainstream hardware. Low-voltage operation of DNN accelerators allows energy consumption to be reduced significantly further; however, it causes bit-level failures in the memory storing the quantized DNN weights. In this paper, we show that a combination of robust fixed-point quantization, weight clipping, and random bit error training (RandBET) significantly improves robustness against random bit errors in (quantized) DNN weights. This leads to high energy savings from both low-voltage operation and low-precision quantization. Our approach generalizes across operating voltages and accelerators, as demonstrated on bit errors from profiled SRAM arrays. We also discuss why weight clipping alone is already a quite effective way to achieve robustness against bit errors. Moreover, we specifically discuss the involved trade-offs among accuracy, robustness, and precision: without losing more than 1% accuracy compared to a normally trained 8-bit DNN, we can reduce energy consumption on CIFAR-10 by 20%. Higher energy savings of, e.g., 30% are possible at the cost of 2.5% accuracy, even for 4-bit DNNs.
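To make the described mechanism concrete, below is a minimal PyTorch sketch of RandBET-style training as the abstract outlines it: weights are clipped to a fixed range, quantized to fixed-point integers, and random bit flips are injected into the stored bit pattern during the forward pass. All names here (quantize, inject_bit_errors, randbet_forward), the specific quantization scheme, the flip probability P_FLIP, and the clipping range w_max are illustrative assumptions, not the authors' implementation.

```python
import torch

BITS = 8       # weight precision; the paper considers 8-bit down to 4-bit DNNs
P_FLIP = 0.01  # assumed per-bit flip probability, modelling low-voltage SRAM errors

def quantize(w, w_max):
    """Clip weights to [-w_max, w_max] and map them to unsigned
    fixed-point integers in [0, 2^BITS - 1] (one possible scheme)."""
    w = torch.clamp(w, -w_max, w_max)
    scale = 2 * w_max / (2 ** BITS - 1)
    return torch.round((w + w_max) / scale).to(torch.int64), scale

def dequantize(q, scale, w_max):
    """Map the stored integers back to floating-point weights."""
    return q.to(torch.float32) * scale - w_max

def inject_bit_errors(q, p=P_FLIP):
    """Flip each stored bit independently with probability p,
    simulating random bit errors in the accelerator's weight memory."""
    for b in range(BITS):
        flip = (torch.rand(q.shape) < p).to(torch.int64)
        q = q ^ (flip * (1 << b))  # XOR toggles the selected bit
    return q

def randbet_forward(model, x, w_max=0.1):
    """One RandBET-style forward pass: clip, quantize, corrupt, dequantize,
    run the model on corrupted weights, then restore the clean weights."""
    clean = [p.data.clone() for p in model.parameters()]
    for p in model.parameters():
        q, scale = quantize(p.data, w_max)
        p.data = dequantize(inject_bit_errors(q), scale, w_max)
    out = model(x)
    for p, w in zip(model.parameters(), clean):
        p.data = torch.clamp(w, -w_max, w_max)  # weight clipping
    return out
```

In this sketch, the loss and gradients are computed on the corrupted weights while the optimizer updates the clean, clipped copies, so training is pushed toward parameters whose predictions stay stable under random bit flips; that stability is what makes low-voltage (error-prone) operation usable.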

Updated: 2020-10-21