A new method based on the proximal bundle idea and gradient sampling technique for minimizing nonsmooth convex functions
Computational Optimization and Applications (IF 2.2), Pub Date: 2020-07-20, DOI: 10.1007/s10589-020-00213-y
M. Maleknia, M. Shamsi

In this paper, we combine the strengths of the gradient sampling (GS) and bundle methods, two of the most efficient approaches in nonsmooth optimization, to develop a robust method for solving unconstrained nonsmooth convex optimization problems. The main aim of the proposed method is to exploit the advantages of both the GS and bundle methods while avoiding their drawbacks. At each iteration, the GS technique is used to construct a local polyhedral model of the objective function, from which an efficient descent direction is obtained. If necessary, this initial polyhedral model is refined through an iterative improvement process that draws on techniques from both the bundle and GS methods. A convergence analysis shows that the global convergence of the method is independent of the number of gradient evaluations required to establish and improve the initial polyhedral models; consequently, the method needs far fewer gradient evaluations than the original GS method. Furthermore, numerical experiments show that the presented method yields promising results compared with GS methods, especially for large-scale problems. Moreover, in contrast with some bundle methods, our method is not very sensitive to the accuracy of the supplied gradients.
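For illustration only, the minimal Python sketch below shows the gradient sampling ingredient referred to in the abstract: gradients are sampled in a small ball around the current iterate, and the negative minimum-norm element of their convex hull is used as an approximate descent direction together with a simple backtracking line search. This is not the authors' implementation; all names and parameters (gs_descent_direction, eps, m) are illustrative assumptions, and the paper's actual method additionally builds and refines a proximal-bundle-style polyhedral model.

import numpy as np
from scipy.optimize import minimize

def gs_descent_direction(grad, x, eps=0.1, m=10, rng=None):
    """Approximate descent direction at x via gradient sampling in an eps-ball."""
    rng = np.random.default_rng() if rng is None else rng
    n = x.size
    # Sample m points uniformly in the eps-ball around x and collect their
    # gradients, together with the gradient at x itself.
    pts = x + eps * rng.uniform(-1.0, 1.0, size=(m, n))
    G = np.vstack([grad(x)] + [grad(p) for p in pts])
    # Minimum-norm element of conv{g_i}: min ||G^T lam||^2 s.t. lam >= 0, sum(lam) = 1.
    k = G.shape[0]
    objective = lambda lam: float(lam @ G @ G.T @ lam)
    constraints = ({'type': 'eq', 'fun': lambda lam: lam.sum() - 1.0},)
    res = minimize(objective, np.full(k, 1.0 / k), method='SLSQP',
                   bounds=[(0.0, 1.0)] * k, constraints=constraints)
    g_star = res.x @ G
    return -g_star  # near an eps-stationary point this direction is close to zero

# Toy usage: minimize the nonsmooth convex function f(x) = |x1| + 2|x2|.
f = lambda x: abs(x[0]) + 2.0 * abs(x[1])
grad = lambda x: np.array([np.sign(x[0]), 2.0 * np.sign(x[1])])
x = np.array([1.0, -0.5])
for _ in range(50):
    d = gs_descent_direction(grad, x)
    if np.linalg.norm(d) < 1e-6:
        break
    # Simple backtracking (Armijo-type) line search along d.
    t = 1.0
    while t > 1e-12 and f(x + t * d) > f(x) - 1e-4 * t * np.linalg.norm(d) ** 2:
        t *= 0.5
    x = x + t * d
print(x, f(x))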

Updated: 2020-07-20