A Gradient Sampling Method Based on Ideal Direction for Solving Nonsmooth Optimization Problems
Journal of Optimization Theory and Applications (IF 1.6) Pub Date: 2020-08-31, DOI: 10.1007/s10957-020-01740-8
Morteza Maleknia , Mostafa Shamsi

In this paper, a modification of the original gradient sampling method for minimizing nonsmooth, nonconvex functions is presented. A key computational component of the gradient sampling method is the quadratic optimization subproblem that must be solved at every iteration, which can be time-consuming, especially for large-scale objectives. To overcome this difficulty, this study proposes a new descent direction whose computation requires no quadratic or linear subproblem. It is shown that this direction satisfies the Armijo step size condition. We also prove that, under the proposed modifications, the global convergence of the gradient sampling method is preserved. Moreover, under some moderate assumptions, an upper bound on the number of serious iterations is established. Using this upper bound, we develop a different strategy for studying the convergence of the method. Numerical experiments on small-, medium-, and large-scale problems demonstrate the efficiency of the proposed method.
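To illustrate the structure the abstract describes, the following is a minimal sketch of one gradient sampling iteration with an Armijo backtracking line search. Note the hedge: the paper's actual "ideal direction" is not specified in the abstract, so a hypothetical stand-in (the normalized negative average of the sampled gradients) is used here purely to show where such a direction replaces the usual quadratic min-norm subproblem; it is not the authors' formula.

```python
import numpy as np

def sample_gradients(grad, x, eps, m, rng):
    # Evaluate the gradient at x and at m random points within radius eps,
    # as in the gradient sampling framework.
    pts = [x] + [x + eps * rng.uniform(-1.0, 1.0, size=x.shape) for _ in range(m)]
    return np.array([grad(p) for p in pts])

def descent_step(f, grad, x, eps=1e-2, m=10, beta=1e-4, seed=0):
    rng = np.random.default_rng(seed)
    G = sample_gradients(grad, x, eps, m, rng)
    # HYPOTHETICAL stand-in for the paper's ideal direction: the classical
    # method would instead solve a QP for the min-norm element of conv(G).
    g_bar = G.mean(axis=0)
    d = -g_bar / (np.linalg.norm(g_bar) + 1e-12)
    # Armijo backtracking: accept the first t giving sufficient decrease.
    t = 1.0
    while t > 1e-12 and f(x + t * d) > f(x) + beta * t * (g_bar @ d):
        t *= 0.5
    if t <= 1e-12:
        return x  # no acceptable step found; keep the current iterate
    return x + t * d

# Nonsmooth test objective: f(x) = |x1| + 2|x2|, minimized at the origin.
f = lambda x: abs(x[0]) + 2.0 * abs(x[1])
grad = lambda x: np.array([np.sign(x[0]), 2.0 * np.sign(x[1])])

x = np.array([3.0, -2.0])
for _ in range(200):
    x = descent_step(f, grad, x)
```

Because steps are accepted only when the Armijo condition holds, the objective value is nonincreasing along the iterates; the sampling radius eps is what lets the method cope with kinks where the gradient is discontinuous.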
