A Layer-Wise Ensemble Technique for Binary Neural Network
International Journal of Pattern Recognition and Artificial Intelligence ( IF 0.9 ) Pub Date : 2021-03-05 , DOI: 10.1142/s021800142152011x
Jiazhen Xi 1 , Hiroyuki Yamauchi 1

Binary neural networks (BNNs) have drawn much attention because they are among the most promising techniques for meeting memory-footprint and inference-speed requirements. However, they still suffer from severe intrinsic instability of error convergence, which increases both the prediction error and its standard deviation; this is caused mostly by the inherently poor representation afforded by only two possible values, −1 and +1. In this work, we propose a cost-aware layer-wise ensemble method that addresses this issue without incurring excessive cost, characterized by (1) layer-wise bagging and (2) cost-aware selection of the layers to be bagged. In one experiment, the proposed method reduced the error and its standard deviation on CIFAR-10 by 15% and 54%, respectively, compared to a baseline BNN. This paper demonstrates and discusses this error reduction and stability, together with its high versatility, through comparisons of various combinations of base network models with the proposed technique and state-of-the-art prior techniques, across different network sizes and the CIFAR-10, SVHN, and MNIST datasets.
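The two ingredients named in the abstract, layer-wise bagging and cost-aware layer selection, can be sketched roughly as follows. This is an illustrative toy, not the paper's implementation: the function names (`binarize`, `bagged_layer_forward`, `select_layers_for_bagging`), the weight-perturbation stand-in for bootstrap training, and the greedy budget heuristic are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def binarize(w):
    """Binarize real-valued weights to {-1, +1} (zero maps to +1)."""
    return np.where(w >= 0, 1.0, -1.0)

def bagged_layer_forward(x, w, n_bags=3, noise=0.1):
    """Layer-wise bagging (toy proxy): run several binarized copies of the
    same layer, each from a slightly perturbed version of the real-valued
    weights, average their sign activations, and re-binarize the mean."""
    outs = [
        np.sign(x @ binarize(w + noise * rng.standard_normal(w.shape)) + 1e-12)
        for _ in range(n_bags)
    ]
    return np.sign(np.mean(outs, axis=0) + 1e-12)

def select_layers_for_bagging(layer_shapes, budget):
    """Cost-aware selection (toy heuristic): greedily bag the cheapest
    layers (by parameter count) until the extra-parameter budget is spent."""
    costs = sorted(enumerate(int(np.prod(s)) for s in layer_shapes),
                   key=lambda t: t[1])
    chosen, spent = [], 0
    for idx, cost in costs:
        if spent + cost <= budget:
            chosen.append(idx)
            spent += cost
    return sorted(chosen)
```

Under this sketch, only the layers returned by `select_layers_for_bagging` use `bagged_layer_forward`; the rest run a single binarized forward pass, which is how the extra ensemble cost stays bounded.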
