Novel local tuning techniques for speeding up one-dimensional algorithms in expensive global optimization using Lipschitz derivatives
Journal of Computational and Applied Mathematics (IF 2.4), Pub Date: 2020-08-06, DOI: 10.1016/j.cam.2020.113134
Yaroslav D. Sergeyev, Maria Chiara Nasso, Marat S. Mukhametzhanov, Dmitri E. Kvasov

Lipschitz global optimization is an important research field with numerous applications in engineering, electronics, machine learning, optimal decision making, etc. In many of these applications, even in the univariate case, evaluations of the objective functions and their derivatives are often time consuming, and the number of function evaluations executed by algorithms is extremely high due to the presence of multiple local extrema. As a result, the problem of accelerating the global search inevitably arises. In this paper, ideas allowing one to speed up the global search in cases where the objective function has a first derivative satisfying the Lipschitz condition are proposed. Several methods using a priori known global Lipschitz constants, as well as their global and local dynamic estimates, are presented. Two new algorithms using smooth piecewise quadratic support functions are introduced and their convergence conditions are established. All the methods are implemented both in traditional floating-point arithmetic and in the Infinity Computing framework, which allows one to efficiently compute exact derivatives when the optimized function is given as a black box. Numerical experiments executed on two classes of randomly generated test functions show a promising behavior of global optimization methods using the introduced local tuning techniques for speeding up the global search.
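The quadratic support functions mentioned in the abstract can be illustrated with a minimal sketch. If f' is Lipschitz with constant M, then every evaluated point x_i yields the lower bound f(x) ≥ f(x_i) + f'(x_i)(x − x_i) − (M/2)(x − x_i)², and a branch-and-bound scheme can repeatedly subdivide the interval with the smallest such bound. This is a simplified illustration under an a priori known global M, not a reproduction of the paper's algorithms; the function name and the interval-selection rule are assumptions made here for exposition.

```python
import math

def minimize_lipschitz_deriv(f, df, a, b, M, max_iter=500, tol=1e-6):
    """Minimize f on [a, b], assuming f' is Lipschitz with constant M.

    At each evaluated point x_i the concave parabola
        p_i(x) = f(x_i) + f'(x_i)*(x - x_i) - (M/2)*(x - x_i)**2
    underestimates f.  For each subinterval, the two parabolas rooted at
    its endpoints intersect at a point x*; p(x*) serves as the interval's
    characteristic (lower bound), and the interval with the smallest
    characteristic is subdivided at x* (a simplified sketch).
    """
    # Sorted trial points: (x, f(x), f'(x)).
    pts = [(a, f(a), df(a)), (b, f(b), df(b))]
    best_x, best_f = min(((x, fx) for x, fx, _ in pts), key=lambda t: t[1])
    for _ in range(max_iter):
        lb_min, i_min, x_min = math.inf, None, None
        for i in range(len(pts) - 1):
            xl, fl, dl = pts[i]
            xr, fr, dr = pts[i + 1]
            # Intersection of the two endpoint parabolas (linear equation
            # in x after the quadratic terms cancel).
            den = dr - dl + M * (xr - xl)
            if abs(den) < 1e-15:
                xs = 0.5 * (xl + xr)
            else:
                xs = (fl - fr - dl * xl + dr * xr
                      + 0.5 * M * (xr * xr - xl * xl)) / den
                xs = min(max(xs, xl), xr)  # guard against round-off
            lb = fl + dl * (xs - xl) - 0.5 * M * (xs - xl) ** 2
            if lb < lb_min:
                lb_min, i_min, x_min = lb, i, xs
        # Stop when the record value is certified near-optimal.
        if best_f - lb_min < tol:
            break
        fx, dx = f(x_min), df(x_min)
        if fx < best_f:
            best_x, best_f = x_min, fx
        pts.insert(i_min + 1, (x_min, fx, dx))
    return best_x, best_f
```

For example, minimizing f(x) = (x − 1)² on [0, 3] with M = 2.5 (f'' = 2, so any M ≥ 2 is valid) converges to x ≈ 1; a multiextremal function such as sin(x) + sin(10x/3) can be handled the same way once an overestimate of the Lipschitz constant of its derivative is supplied.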




Updated: 2020-08-06