Aggregate subgradient method for nonsmooth DC optimization
Optimization Letters (IF 1.3), Pub Date: 2020-05-14, DOI: 10.1007/s11590-020-01586-z
Adil M. Bagirov, Sona Taheri, Kaisa Joki, Napsu Karmitsa, Marko M. Mäkelä

The aggregate subgradient method is developed for solving unconstrained nonsmooth difference-of-convex (DC) optimization problems. The proposed method shares similarities with both the subgradient and the bundle methods. Aggregate subgradients are defined as convex combinations of subgradients computed at null steps between two serious steps. At each iteration, search directions are found using only two subgradients: the aggregate subgradient and a subgradient computed at the current null step. It is proved that the proposed method converges to a critical point of the DC optimization problem and that the number of null steps between two serious steps is finite. The new method is tested on academic test problems and compared with several other nonsmooth DC optimization solvers.
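The null-step/serious-step mechanics described in the abstract can be illustrated with a minimal sketch. This is not the authors' algorithm: the fixed convex-combination weight, the Armijo-style descent constant, and all parameter names are simplifying assumptions made here for illustration only (the paper derives its weights and step tests differently).

```python
import numpy as np

def aggregate_subgradient_dc(x0, subgrad_f1, subgrad_f2, f,
                             step=0.5, tol=1e-6, max_iter=200):
    """Illustrative sketch of an aggregate-subgradient scheme for
    minimizing f = f1 - f2. Not the authors' implementation."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        # Approximate subgradient of f at x: xi1 - xi2,
        # with xi1 in the subdifferential of f1 and xi2 in that of f2.
        g_agg = subgrad_f1(x) - subgrad_f2(x)
        t = step
        while True:
            if np.linalg.norm(g_agg) < tol:
                return x  # approximate criticality
            d = -g_agg / np.linalg.norm(g_agg)  # search direction
            x_trial = x + t * d
            # Descent test (constant 0.5 is an assumption, not the paper's).
            if f(x_trial) <= f(x) - 0.5 * t * np.linalg.norm(g_agg):
                x = x_trial  # serious step
                break
            # Null step: fold the trial subgradient into the aggregate via
            # a convex combination (fixed weight 0.5 is an assumption).
            g_new = subgrad_f1(x_trial) - subgrad_f2(x_trial)
            g_agg = 0.5 * g_agg + 0.5 * g_new
            t *= 0.5
            if t < tol:
                return x
    return x

# Example: f(x) = 0.5*x**2 written as the DC pair
# f1(x) = 0.5*x**2 + |x|, f2(x) = |x| (both components nonsmooth at 0).
x_star = aggregate_subgradient_dc(
    np.array([2.0]),
    subgrad_f1=lambda x: x + np.sign(x),
    subgrad_f2=lambda x: np.sign(x),
    f=lambda x: 0.5 * float(x @ x),
)
```

In this toy run the iterates reach the critical point x = 0, where the subgradients of f1 and f2 cancel; the direction uses only the aggregate subgradient and the newest trial subgradient, mirroring the two-subgradient structure stated in the abstract.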




Updated: 2020-05-14