Additive Tree-Structured Conditional Parameter Spaces in Bayesian Optimization: A Novel Covariance Function and a Fast Implementation
IEEE Transactions on Pattern Analysis and Machine Intelligence ( IF 20.8 ) Pub Date : 2020-09-22 , DOI: 10.1109/tpami.2020.3026019
Xingchen Ma , Matthew B. Blaschko

Bayesian optimization (BO) is a sample-efficient global optimization algorithm for black-box functions that are expensive to evaluate. Existing literature on model-based optimization in conditional parameter spaces is usually built on trees. In this work, we generalize the additive assumption to tree-structured functions and propose an additive tree-structured covariance function, showing improved sample-efficiency, wider applicability, and greater flexibility. Furthermore, by incorporating the structure information of parameter spaces and the additive assumption into the BO loop, we develop a parallel algorithm to optimize the acquisition function, and this optimization can be performed in a low-dimensional space. We demonstrate our method on an optimization benchmark function, on a neural network compression problem, and on pruning pre-trained VGG16 and ResNet50 models. Experimental results show our approach significantly outperforms the current state of the art for conditional parameter optimization, including SMAC, TPE, and Jenatton et al. (2017).
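The core idea of a tree-structured covariance can be illustrated with a minimal sketch. This is not the paper's implementation; the parameter names (`algo`, `x0`, `x1`, `x2`) and the simple sum-of-RBF form are hypothetical, chosen only to show how an additive kernel restricts each comparison to the dimensions that are active in both configurations of a conditional (tree-shaped) space.

```python
import numpy as np

# Hypothetical conditional space: a root categorical "algo" in {0, 1}
# activates either x1 (branch 0) or x2 (branch 1); x0 is shared by both.

def rbf(a, b, lengthscale=1.0):
    """Squared-exponential kernel on a single scalar dimension."""
    return np.exp(-0.5 * ((a - b) / lengthscale) ** 2)

def active_dims(config):
    """Dimensions active for this configuration, per the tree structure."""
    dims = {"x0"}
    dims.add("x1" if config["algo"] == 0 else "x2")
    return dims

def additive_tree_kernel(c1, c2):
    """Additive covariance: sum per-dimension kernels over dimensions
    active in BOTH configurations; inactive dimensions contribute 0."""
    shared = active_dims(c1) & active_dims(c2)
    return sum(rbf(c1[d], c2[d]) for d in shared)

a = {"algo": 0, "x0": 0.2, "x1": 0.7}
b = {"algo": 1, "x0": 0.2, "x2": 0.3}
# a and b lie on different branches, so only x0 contributes:
print(additive_tree_kernel(a, b))  # rbf(0.2, 0.2) = 1.0
```

Because the kernel decomposes additively over subtrees, two configurations from different branches still share covariance through their common ancestors, which is what lets a surrogate model transfer information across branches instead of fitting each branch independently.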

Updated: 2020-09-22