Decomposition in derivative-free optimization
Journal of Global Optimization (IF 1.3). Pub Date: 2021-07-07. DOI: 10.1007/s10898-021-01051-w
Kaiwen Ma 1, Nikolaos V. Sahinidis 2, Sreekanth Rajagopalan 3, Satyajith Amaran 3, Scott J. Bury 4
This paper proposes a novel decomposition framework for derivative-free optimization (DFO) algorithms. Our framework significantly extends the scope of current DFO solvers to larger-scale problems. We show that the proposed framework closely relates to the superiorization methodology that is traditionally used for improving the efficiency of feasibility-seeking algorithms for constrained optimization problems in a derivative-based setting. We analyze the convergence behavior of the framework in the context of global search algorithms. A practical implementation is developed and exemplified with the global model-based solver Stable Noisy Optimization by Branch and Fit (SNOBFIT) [36]. To investigate the decomposition framework’s performance, we conduct extensive computational studies on a collection of over 300 test problems of varying dimensions and complexity. We observe significant improvements in the quality of solutions for a large fraction of the test problems. Regardless of problem convexity and smoothness, decomposition leads to over 50% improvement in the objective function after 2500 function evaluations for over 90% of our test problems with more than 75 variables.
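The paper does not reproduce its algorithm here, but the core idea of decomposition in derivative-free optimization can be illustrated in miniature: partition the variables into low-dimensional blocks and let a simple derivative-free subsolver optimize each block in turn while the remaining variables stay fixed. The sketch below is a hypothetical illustration only; it uses plain random search as the block subsolver (the paper's implementation uses SNOBFIT), and all function and parameter names are invented for this example.

```python
import random

def sphere(x):
    """Test objective: sum of squares (minimum 0 at the origin)."""
    return sum(xi * xi for xi in x)

def dfo_block_decomposition(f, x0, block_size=2, cycles=20,
                            samples_per_block=50, step=0.5, seed=0):
    """Hypothetical sketch of block decomposition for derivative-free
    optimization: cycle over low-dimensional blocks of variables and
    apply a derivative-free subsolver (here: shrinking random search)
    to each block while the other variables are held fixed."""
    rng = random.Random(seed)
    x = list(x0)
    best = f(x)
    n = len(x)
    # Partition the variable indices into consecutive blocks.
    blocks = [list(range(i, min(i + block_size, n)))
              for i in range(0, n, block_size)]
    for _ in range(cycles):
        for block in blocks:
            # Derivative-free subproblem: random perturbations of this block only.
            for _ in range(samples_per_block):
                cand = list(x)
                for j in block:
                    cand[j] = x[j] + rng.uniform(-step, step)
                val = f(cand)
                if val < best:
                    best, x = val, cand
        step *= 0.7  # shrink the sampling radius between cycles
    return x, best
```

A call such as `dfo_block_decomposition(sphere, [3.0] * 6)` optimizes a 6-variable problem through three 2-variable subproblems per cycle, which is the dimensionality-reduction effect the paper exploits to scale DFO solvers to larger problems.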




Updated: 2021-07-07