Ada-boundary: accelerating DNN training via adaptive boundary batch selection
Machine Learning (IF 4.3), Pub Date: 2020-09-01, DOI: 10.1007/s10994-020-05903-6
Hwanjun Song, Sundong Kim, Minseok Kim, Jae-Gil Lee

Neural networks converge faster with the help of a smart batch selection strategy. To this end, we propose Ada-Boundary, a novel and simple adaptive batch selection algorithm that constructs an effective mini-batch according to the learning progress of the model. Our key idea is to exploit confusing samples whose labels the model cannot predict with high confidence; samples near the current decision boundary are therefore considered the most effective for expediting convergence. Owing to this design, Ada-Boundary maintains its advantage across various degrees of training difficulty. We demonstrate the benefit of Ada-Boundary through extensive experiments using CNNs on five benchmark data sets. For a fixed wall-clock training time, Ada-Boundary reduces the test error by up to 31.80% relative to the baseline, thereby converging faster.
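To make the selection criterion concrete, here is a minimal PyTorch sketch of boundary-biased batch selection. It assumes the distance to the decision boundary is approximated by the softmax margin between the true class and its strongest competitor; the function names (`boundary_scores`, `select_boundary_batch`) and the inverse-margin sampling weights are illustrative assumptions, not the authors' exact sampling scheme.

```python
import torch
import torch.nn.functional as F

def boundary_scores(logits: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
    """Approximate each sample's distance to the decision boundary.

    The distance is taken as the absolute softmax margin between the
    true class and the strongest competing class: a small margin means
    the model is confused, i.e. the sample lies near the boundary.
    """
    probs = F.softmax(logits, dim=1)
    true_p = probs.gather(1, labels.unsqueeze(1)).squeeze(1)
    competing = probs.clone()
    competing.scatter_(1, labels.unsqueeze(1), float("-inf"))  # mask out the true class
    top_other = competing.max(dim=1).values
    return (true_p - top_other).abs()

def select_boundary_batch(model, data_x, data_y, batch_size):
    """Draw a mini-batch biased toward boundary (low-margin) samples.

    Hypothetical inverse-margin weighting: smaller margin -> larger
    sampling weight. The published algorithm defines its sampling
    probabilities differently; this is only a stand-in for the idea.
    """
    model.eval()
    with torch.no_grad():
        logits = model(data_x)
    weights = 1.0 / (boundary_scores(logits, data_y) + 1e-6)
    idx = torch.multinomial(weights, batch_size, replacement=False)
    return data_x[idx], data_y[idx]
```

Note that scoring the entire data set before every step, as this sketch does for clarity, would be far too expensive in practice; a realistic implementation would reuse predictions from recent forward passes rather than recomputing them from scratch.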
