Bayesian optimization with output-weighted optimal sampling
Journal of Computational Physics (IF 3.8), Pub Date: 2020-10-08, DOI: 10.1016/j.jcp.2020.109901
Antoine Blanchard, Themistoklis Sapsis

In Bayesian optimization, accounting for the importance of the output relative to the input is a crucial yet challenging exercise, as it can considerably improve the final result but often involves inaccurate and cumbersome entropy estimations. We approach the problem from the perspective of importance-sampling theory, and advocate the use of the likelihood ratio to guide the search algorithm towards regions of the input space where the objective function to minimize assumes abnormally small values. The likelihood ratio acts as a sampling weight and can be computed at each iteration without severely deteriorating the overall efficiency of the algorithm. In particular, it can be approximated in a way that makes the approach tractable in high dimensions. The “likelihood-weighted” acquisition functions introduced in this work are found to outperform their unweighted counterparts in a number of applications.
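The idea of using the likelihood ratio as a sampling weight can be sketched in code. The toy example below (an illustration under stated assumptions, not the authors' implementation) weights the exploration term of a lower-confidence-bound acquisition function by the ratio of the input density to the estimated density of the surrogate's predicted output, so that inputs whose predicted outputs are rare (e.g. abnormally small) are favored; the surrogate here is a stand-in with a constant posterior standard deviation, and the weighting scheme and `kappa` value are illustrative choices.

```python
import numpy as np
from scipy.stats import gaussian_kde

def objective(x):
    # Toy 1-D function to minimize.
    return np.sin(3 * x) + 0.5 * x**2

# Stand-in for a Gaussian-process surrogate: posterior mean taken equal to
# the objective, posterior standard deviation held constant (assumptions
# made purely for illustration).
def posterior_mean(x):
    return objective(x)

def posterior_std(x):
    return 0.3 * np.ones_like(x)

x_grid = np.linspace(-2.0, 2.0, 400)
mu = posterior_mean(x_grid)
sigma = posterior_std(x_grid)

# Likelihood ratio w = p_x / p_y: input density p_x (uniform on [-2, 2])
# over the density p_y of the predicted outputs, estimated here with a
# kernel density estimate of the posterior-mean values.
p_x = np.full_like(x_grid, 1.0 / 4.0)
p_y = gaussian_kde(mu)(mu)
w = p_x / p_y

# Likelihood-weighted lower confidence bound: the exploration term is
# scaled by w, then the acquisition function is minimized.
kappa = 1.0
lw_lcb = mu - kappa * w * sigma
x_next = x_grid[np.argmin(lw_lcb)]
```

In a full Bayesian-optimization loop, `x_next` would be evaluated, the surrogate refit, and the weight `w` recomputed at each iteration; the paper notes that `w` can also be approximated cheaply enough to remain tractable in high dimensions.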




Updated: 2020-12-01