Controlling information capacity of binary neural network
Pattern Recognition Letters (IF 5.1), Pub Date: 2020-07-25, DOI: 10.1016/j.patrec.2020.07.033
Dmitry Ignatov, Andrey Ignatov

Despite the growing popularity of deep learning technologies, high memory requirements and power consumption substantially limit their application in mobile and IoT areas. While binary convolutional networks can alleviate these problems, the limited bitwidth of weights often leads to a significant degradation of prediction accuracy. In this paper, we present a method for training binary networks that maintains a stable predefined level of their information capacity throughout the training process by applying a Shannon entropy based penalty to convolutional filters. The results of experiments conducted on the SVHN, CIFAR and ImageNet datasets demonstrate that the proposed approach yields a statistically significant improvement in the accuracy of binary networks.
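For intuition, the sketch below shows one possible way an entropy-based capacity penalty could be attached to the convolutional filters of a network in PyTorch: it estimates the Shannon entropy of each filter's +1/-1 weight distribution and penalizes its deviation from a target level. The helper names (`filter_entropy`, `capacity_penalty`), the sigmoid relaxation, the 1-bit target, and the penalty strength are illustrative assumptions, not the authors' exact formulation.

```python
import torch
import torch.nn as nn


def filter_entropy(weight: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Shannon entropy (in bits) of the +1/-1 weight distribution per filter.

    `weight` has shape (out_channels, in_channels, kH, kW); the fraction of
    "positive" weights is estimated with a sigmoid so the entropy stays
    differentiable with respect to the latent real-valued weights.
    """
    p = torch.sigmoid(weight.flatten(1)).mean(dim=1).clamp(eps, 1 - eps)
    return -(p * torch.log2(p) + (1 - p) * torch.log2(1 - p))


def capacity_penalty(model: nn.Module,
                     target_entropy: float = 1.0,
                     strength: float = 1e-3) -> torch.Tensor:
    """Penalize deviation of each conv filter's entropy from a target level."""
    terms = []
    for m in model.modules():
        if isinstance(m, nn.Conv2d):
            h = filter_entropy(m.weight)
            terms.append(((h - target_entropy) ** 2).mean())
    return strength * torch.stack(terms).sum() if terms else torch.tensor(0.0)
```

In a training loop, such a term would simply be added to the task loss, e.g. `loss = criterion(output, target) + capacity_penalty(model)`, so that filters are pushed toward the predefined entropy (information capacity) level while the network is being trained.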



Updated: 2020-08-01