A proximal gradient splitting method for solving convex vector optimization problems
Optimization ( IF 2.2 ) Pub Date : 2020-07-29 , DOI: 10.1080/02331934.2020.1800699
Yunier Bello-Cruz 1 , Jefferson G. Melo 2 , Ray V.G. Serra 2
Affiliation  

In this paper, a proximal gradient splitting method for solving nondifferentiable vector optimization problems is proposed. The convergence analysis is carried out for objective functions that are the sum of two convex functions, one of which is assumed to be continuously differentiable. The proposed splitting method exhibits full convergence to a weakly efficient solution without assuming Lipschitz continuity of the Jacobian of the differentiable component. To carry out this analysis, the popular Beck–Teboulle line-search procedure is extended to the vector setting under mild assumptions. It is also shown that the proposed scheme obtains an ε-approximate solution to the vector optimization problem in at most O(1/ε) iterations.
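The method studied here is a vector-valued extension of the classical proximal gradient (forward–backward) scheme. As a point of reference only, the scalar (single-objective) case with the Beck–Teboulle backtracking line search can be sketched as follows; the paper's vector-valued method replaces the scalar sufficient-decrease test with a scalarized one, and everything below (the function name `prox_grad_backtracking`, the lasso demo data) is an illustrative assumption, not the authors' code.

```python
import numpy as np

def prox_grad_backtracking(grad_f, f, prox_g, x0, L0=1.0, eta=2.0,
                           tol=1e-8, max_iter=500):
    """Proximal gradient (ISTA) with Beck-Teboulle backtracking.

    Minimizes f(x) + g(x) for convex f (smooth) and g (prox-friendly),
    without assuming a known Lipschitz constant for grad f: the local
    constant L is estimated on the fly by the line search.
    """
    x, L = x0.copy(), L0
    for _ in range(max_iter):
        grad = grad_f(x)
        while True:
            # Candidate forward-backward step with current estimate L.
            z = prox_g(x - grad / L, 1.0 / L)
            d = z - x
            # Sufficient-decrease test: accept z if the quadratic upper
            # model of f at x majorizes f(z); otherwise increase L.
            if f(z) <= f(x) + grad @ d + 0.5 * L * (d @ d):
                break
            L *= eta
        if np.linalg.norm(z - x) <= tol:
            return z
        x = z
    return x

# Hypothetical demo: lasso, min 0.5*||Ax - b||^2 + lam*||x||_1.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
lam = 0.1
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad_f = lambda x: A.T @ (A @ x - b)
# prox of t*lam*||.||_1 is soft-thresholding at level lam*t.
prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - lam * t, 0.0)
x_star = prox_grad_backtracking(grad_f, f, prox_g, np.zeros(5))
```

The line search is what removes the need for a global Lipschitz constant: each rejected step simply inflates the local estimate `L`, which mirrors the role the extended line search plays in the paper's vector setting.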




Updated: 2020-07-29