ATA: Attentional Non-Linear Activation Function Approximation for VLSI-Based Neural Networks
IEEE Signal Processing Letters (IF 3.2) | Pub Date: 2021-03-18 | DOI: 10.1109/lsp.2021.3067188
Linyu Wei, Jueping Cai, Wuzhuang Wang

In this letter, we present ATA, an attentional non-linear activation function approximation method for VLSI-based neural networks. Unlike other approximation methods that pursue low hardware resource usage at the cost of a large recognition accuracy loss, ATA uses pixel attention to focus on important features, preserving recognition accuracy while reducing resource cost. Specifically, attention is realized within the activation function by approximating it with different fitting errors across the input range. Important features are highlighted by a piecewise linear function and an improved look-up table with low fitting error, while trivial features are approximated coarsely with a large fitting error. Experimental results demonstrate that ATA outperforms other state-of-the-art approximation methods in recognition accuracy, power, and area.
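
The abstract does not give the segment boundaries or the improved look-up table design, so the sketch below only illustrates the underlying idea in Python: a piecewise-linear approximation whose fitting error varies by region, fine near the origin (where the important features lie) and coarse in the saturated tails (where trivial features can be ignored). All function names and breakpoints here are hypothetical, and a sigmoid stands in for whichever activation the hardware targets; this is a minimal sketch, not the paper's implementation.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def build_pwl_table(f, breakpoints):
    # Precompute slope/intercept pairs for a piecewise-linear fit of f.
    # Each segment [b_i, b_{i+1}] is replaced by the chord through its
    # endpoints; denser breakpoints give a lower fitting error.
    bp = np.asarray(breakpoints, dtype=float)
    y = f(bp)
    slopes = (y[1:] - y[:-1]) / (bp[1:] - bp[:-1])
    intercepts = y[:-1] - slopes * bp[:-1]
    return bp, slopes, intercepts

def pwl_eval(x, bp, slopes, intercepts):
    # Evaluate the piecewise-linear approximation, clamping inputs to
    # the table's range as fixed-point hardware typically would.
    x = np.clip(x, bp[0], bp[-1])
    idx = np.clip(np.searchsorted(bp, x, side="right") - 1, 0, len(slopes) - 1)
    return slopes[idx] * x + intercepts[idx]

# Hypothetical breakpoints: fine segments near zero (the "important"
# region where the sigmoid varies quickly), coarse segments in the tails.
breakpoints = np.concatenate([
    np.array([-8.0, -4.0]),          # coarse: saturated, trivial region
    np.linspace(-2.0, 2.0, 17),      # fine: important region, low error
    np.array([4.0, 8.0]),            # coarse again
])

bp, slopes, intercepts = build_pwl_table(sigmoid, breakpoints)

x = np.linspace(-8, 8, 1001)
err = np.abs(pwl_eval(x, bp, slopes, intercepts) - sigmoid(x))
print(f"max error near zero : {err[np.abs(x) <= 2].max():.2e}")
print(f"max error in tails  : {err[np.abs(x) > 4].max():.2e}")

Running the sketch shows the intended asymmetry: the fitting error near zero is orders of magnitude smaller than in the tails, which is the region-dependent precision the abstract describes, at a fraction of the table entries a uniformly fine approximation would need.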

Updated: 2021-05-04