High-dimensional Bayesian optimization using low-dimensional feature spaces
Machine Learning (IF 4.3). Pub Date: 2020-09-01. DOI: 10.1007/s10994-020-05899-z
Riccardo Moriconi, Marc Peter Deisenroth, K. S. Sesh Kumar

Bayesian optimization (BO) is a powerful approach for seeking the global optimum of expensive black-box functions and has proven successful for fine-tuning hyper-parameters of machine learning models. However, BO is practically limited to optimizing 10–20 parameters. To scale BO to high dimensions, we usually make structural assumptions on the decomposition of the objective and/or exploit the intrinsic lower dimensionality of the problem, e.g., by using linear projections. We could achieve a higher compression rate with nonlinear projections, but learning these nonlinear embeddings typically requires a large amount of data. This contradicts BO's premise of a relatively small evaluation budget. To address this challenge, we propose to learn a low-dimensional feature space jointly with (a) the response surface and (b) a reconstruction mapping. Our approach allows for optimization of BO's acquisition function in the lower-dimensional subspace, which significantly simplifies the optimization problem. We reconstruct the original parameter space from the lower-dimensional subspace for evaluating the black-box function. For meaningful exploration, we solve a constrained optimization problem.
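The abstract describes a generic loop: fit a surrogate (response surface) in a learned d-dimensional feature space, optimize the acquisition function there, map the chosen point back to the original D-dimensional space via the reconstruction mapping, and evaluate the black box. The sketch below illustrates that loop under strong simplifying assumptions and is not the authors' algorithm: a fixed random linear projection stands in for the jointly learned nonlinear feature map, its pseudo-inverse for the reconstruction mapping, and random search over a box for the constrained acquisition optimization. The objective `black_box`, the projection `A`, and all dimensions are hypothetical.

```python
# Minimal sketch of BO in a low-dimensional feature space (illustration only).
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)
D, d = 50, 2                                  # ambient and feature-space dimensions

def black_box(x):
    # Hypothetical expensive objective on [-1, 1]^D (stand-in for the real one).
    return -np.sum((x - 0.1) ** 2)

A = rng.standard_normal((d, D)) / np.sqrt(D)  # encoder: R^D -> R^d (fixed, not learned here)
A_pinv = np.linalg.pinv(A)                    # reconstruction mapping: R^d -> R^D

def reconstruct(z):
    # Decode a feature-space point back into the ambient box before evaluation.
    return np.clip(A_pinv @ z, -1.0, 1.0)

# Initial design: a few random ambient points, encoded into the feature space.
X = rng.uniform(-1, 1, size=(5, D))
Z = X @ A.T
y = np.array([black_box(x) for x in X])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

def expected_improvement(z, gp, y_best):
    # Standard EI for maximization, evaluated in the d-dimensional space.
    mu, sigma = gp.predict(z.reshape(1, -1), return_std=True)
    sigma = max(sigma[0], 1e-9)
    gamma = (mu[0] - y_best) / sigma
    return (mu[0] - y_best) * norm.cdf(gamma) + sigma * norm.pdf(gamma)

for _ in range(20):
    gp.fit(Z, y)                              # surrogate lives in the feature space
    # Acquisition search over d dimensions only; random search in a box
    # stands in for the paper's constrained optimization problem.
    cand = rng.uniform(-2, 2, size=(512, d))
    ei = [expected_improvement(z, gp, y.max()) for z in cand]
    z_next = cand[int(np.argmax(ei))]
    x_next = reconstruct(z_next)              # back to R^D for the expensive query
    Z = np.vstack([Z, z_next])
    y = np.append(y, black_box(x_next))

print("best value found:", y.max())
```

Even in this toy version the structural point is visible: the GP and the expected-improvement search only ever see d = 2 coordinates, while the expensive objective is always queried in the full D = 50-dimensional space.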
