Generating set search using simplex gradients for bound-constrained black-box optimization
Computational Optimization and Applications (IF 1.6). Pub Date: 2021-02-27, DOI: 10.1007/s10589-021-00267-6
Sander Dedoncker, Wim Desmet, Frank Naets

The optimization problems arising in modern engineering practice are increasingly simulation-based, characterized by extreme types of nonsmoothness, the inaccessibility of derivatives, and high computational expense. While generating set searches (GSS) generally offer a satisfying level of robustness and converge to stationary points, the convergence rates may be slow. In order to accelerate the solution process without sacrificing robustness, we introduce (simplex) gradient-informed generating set search (GIGS) methods for solving bound-constrained minimization problems. These algorithms use simplex gradients, acquired over several iterations, as guidance for adapting the search stencil to the local topography of the objective function. GIGS is shown to inherit first-order convergence properties of GSS and to possess a natural tendency for avoiding saddle points. Numerical experiments are performed on an academic set of smooth, nonsmooth and noisy test problems, as well as a realistic engineering case study. The results demonstrate that including simplex gradient information enables computational cost savings over non-adaptive GSS methods.
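To make the idea concrete, here is a minimal Python sketch (assuming NumPy) of a generating set search over the ±coordinate directions, in which a least-squares simplex gradient estimated from the previous poll samples orders the current poll so the most promising directions are tried first. This is only an illustration of the general mechanism described in the abstract, not the authors' full GIGS algorithm; the function names, step-control constants, and the simple stencil are assumptions for the sketch.

```python
import numpy as np

def simplex_gradient(points, values):
    """Least-squares simplex gradient: fit a linear model to sampled
    points (rows of `points`) and their objective values."""
    x0, f0 = points[0], values[0]
    S = points[1:] - x0           # displacement matrix
    df = values[1:] - f0          # corresponding function differences
    g, *_ = np.linalg.lstsq(S, df, rcond=None)
    return g

def gss_minimize(f, x0, lower, upper, step=0.5, tol=1e-8, max_iter=1000):
    """Bound-constrained generating set search over +/- coordinate
    directions, with opportunistic polling ordered by a simplex gradient
    built from the previous iteration's samples (illustrative sketch)."""
    x = np.clip(np.asarray(x0, dtype=float), lower, upper)
    fx = f(x)
    n = x.size
    D = np.vstack([np.eye(n), -np.eye(n)])   # generating set
    g = np.zeros(n)                          # simplex-gradient estimate
    for _ in range(max_iter):
        # Poll the most promising directions first (smallest D @ g,
        # i.e. steepest estimated descent).
        order = np.argsort(D @ g)
        pts, vals = [x], [fx]
        success = False
        for i in order:
            t = np.clip(x + step * D[i], lower, upper)  # project on bounds
            ft = f(t)
            pts.append(t)
            vals.append(ft)
            if ft < fx:          # opportunistic: accept first improvement
                x, fx = t, ft
                success = True
                break
        # Reuse the poll samples to refresh the simplex gradient.
        if len(pts) > 1:
            g = simplex_gradient(np.array(pts), np.array(vals))
        if not success:
            step *= 0.5          # unsuccessful poll: contract the step
            if step < tol:
                break
    return x, fx
```

On a smooth quadratic with box bounds, the sketch converges to the nearest stationary point; the gradient-informed ordering mainly saves evaluations by finding a descent direction early in each poll, which mirrors the cost savings the paper reports for its adaptive stencils.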




Updated: 2021-04-01