FlexiBO: Cost-Aware Multi-Objective Optimization of Deep Neural Networks
arXiv - CS - Machine Learning, Pub Date: 2020-01-18, DOI: arxiv-2001.06588
Md Shahriar Iqbal, Jianhai Su, Lars Kotthoff, Pooyan Jamshidi

One of the key challenges in designing machine learning systems is determining the right balance among several objectives, which are often incommensurable and conflicting. For example, when designing deep neural networks (DNNs), one often has to trade off multiple objectives, such as accuracy, energy consumption, and inference time. Typically, no single configuration performs equally well across all objectives; consequently, one is interested in identifying Pareto-optimal designs. Although various multi-objective optimization algorithms have been developed to identify Pareto-optimal configurations, state-of-the-art methods do not account for the different evaluation costs of the objectives under consideration. This is particularly important when optimizing DNNs: the cost of assessing the accuracy of a DNN is orders of magnitude higher than that of measuring the energy consumption of a pre-trained one. We propose FlexiBO, a flexible Bayesian optimization method, to address this issue. We formulate a new acquisition function based on the improvement of the Pareto hypervolume, weighted by the measurement cost of each objective; it selects the next sample and objective that provide the maximum information gain per unit of cost. We evaluated FlexiBO on 7 state-of-the-art DNNs for object detection, natural language processing, and speech recognition. Our results indicate that, compared to other state-of-the-art methods across the 7 architectures we tested, the Pareto front obtained with FlexiBO has, on average, a 28.44% higher contribution to the true Pareto front and achieves 25.64% better diversity.
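The core idea of the acquisition step can be pictured with a small sketch. The Python snippet below is a minimal, illustrative approximation of what the abstract describes, not the paper's implementation: it scores each (configuration, objective) pair by an estimated Pareto-hypervolume gain divided by that objective's measurement cost, and selects the best pair. The configuration names, the COST table, the SURROGATE predictions, and the uncertainty-spread proxy for information gain are all hypothetical stand-ins.

# Hedged sketch of a cost-weighted acquisition step in the spirit of FlexiBO.
# Both objectives are framed as scores to maximize: the second coordinate is
# treated as an energy-efficiency score rather than raw energy consumption.

# Per-objective measurement costs: assessing accuracy means (re)training and
# testing the DNN, while metering energy on a pre-trained model is cheap.
COST = {"accuracy": 100.0, "energy": 1.0}
REF = (0.0, 0.0)  # hypervolume reference point

def pareto_front(points):
    """Non-dominated subset of (accuracy, efficiency) points (maximization)."""
    return [p for p in points
            if not any(q != p and q[0] >= p[0] and q[1] >= p[1] for q in points)]

def hypervolume(points):
    """2-D hypervolume dominated by `points` with respect to REF."""
    front = sorted(pareto_front(points), key=lambda p: p[0], reverse=True)
    hv, prev_y = 0.0, REF[1]
    for x, y in front:
        hv += (x - REF[0]) * max(0.0, y - prev_y)
        prev_y = max(prev_y, y)
    return hv

# Hypothetical surrogate output: (mean, uncertainty) per objective per config.
# In a real system these would come from per-objective models such as GPs.
SURROGATE = {
    "cfg_a": {"accuracy": (0.85, 0.10), "energy": (0.50, 0.05)},
    "cfg_b": {"accuracy": (0.70, 0.02), "energy": (0.70, 0.15)},
    "cfg_c": {"accuracy": (0.65, 0.20), "energy": (0.80, 0.02)},
}

def select_next(observed):
    """Pick the (config, objective) pair with the largest estimated
    hypervolume gain per unit of measurement cost."""
    best, best_score = None, float("-inf")
    for cfg, pred in SURROGATE.items():
        for obj, cost in COST.items():
            # Crude proxy for the information gained by measuring `obj`:
            # the hypervolume spread between the optimistic and pessimistic
            # guesses for that objective, with the other held at its mean.
            hi = tuple(m + (w if o == obj else 0.0) for o, (m, w) in pred.items())
            lo = tuple(m - (w if o == obj else 0.0) for o, (m, w) in pred.items())
            gain = hypervolume(observed + [hi]) - hypervolume(observed + [lo])
            score = max(gain, 0.0) / cost
            if score > best_score:
                best, best_score = (cfg, obj), score
    return best, best_score

observed = [(0.60, 0.60), (0.80, 0.40)]  # already-measured (accuracy, efficiency) pairs
choice, score = select_next(observed)
print("next evaluation:", choice, "| est. gain per unit cost:", round(score, 5))

On these synthetic numbers the selected pair is a cheap energy measurement on the configuration with the widest unresolved energy uncertainty: an expensive accuracy evaluation must promise a far larger hypervolume gain to win, which is precisely the trade-off a cost-weighted acquisition is meant to encode.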

Updated: 2020-01-22