A Projected Extrapolated Gradient Method with Larger Step Size for Monotone Variational Inequalities
Journal of Optimization Theory and Applications (IF 1.6), Pub Date: 2021-07-18, DOI: 10.1007/s10957-021-01902-2
Xiaokai Chang, Jianchao Bai

A projected extrapolated gradient method is designed for solving monotone variational inequalities in Hilbert space. Requiring only local Lipschitz continuity of the operator, the proposed method improves the value of the extrapolation parameter and admits larger step sizes, which are predicted from local information about the operator and corrected by bounding the distance between each pair of successive iterates. The correction is applied when this distance exceeds a given constant, and its main cost is computing a projection onto the feasible set. In particular, when the operator is the gradient of a convex function, the correction step is unnecessary. We establish convergence and an ergodic convergence rate over this larger range of parameters. Numerical experiments illustrate the efficiency gains from the larger step sizes.
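The idea in the abstract — extrapolate, predict a step size from a local Lipschitz estimate of the operator, then take a projected gradient step — can be sketched as follows. This is a minimal illustration, not the authors' exact scheme: the extrapolation rule `2x - x_prev`, the step-size constants, and the simple box projection are all illustrative assumptions, and the paper's correction step (an extra projection applied when successive iterates drift too far apart) is omitted for brevity.

```python
import numpy as np

def project_box(x, lo, hi):
    """Euclidean projection onto the box [lo, hi]."""
    return np.clip(x, lo, hi)

def peg_method(F, proj, x0, max_iter=2000, tol=1e-8):
    """
    Sketch of a projected extrapolated gradient iteration for the
    monotone VI: find x* in C with <F(x*), x - x*> >= 0 for all x in C.
    The step size is *predicted* from the local Lipschitz information
    ||F(y_k) - F(y_{k-1})|| / ||y_k - y_{k-1}||; the growth factor 1.05
    and the safety factor 0.4 are illustrative choices, not the
    parameter rule from the paper.
    """
    x_prev = x0.copy()
    x = x0.copy()
    y_prev = x0.copy()
    Fy_prev = F(y_prev)
    lam = 1e-3  # conservative initial step size
    for _ in range(max_iter):
        y = 2.0 * x - x_prev            # extrapolated point
        Fy = F(y)
        # predict the next step from a local Lipschitz estimate of F
        dy = np.linalg.norm(y - y_prev)
        dF = np.linalg.norm(Fy - Fy_prev)
        if dF > 0.0:
            lam = min(1.05 * lam, 0.4 * dy / dF)  # larger step where F is locally flat
        x_new = proj(x - lam * Fy)      # projected gradient step
        if np.linalg.norm(x_new - x) < tol:
            x_prev, x = x, x_new
            break
        x_prev, x = x, x_new
        y_prev, Fy_prev = y, Fy
    return x
```

As a quick sanity check, take the strongly monotone operator F(x) = x − q (the gradient of a strongly convex quadratic, so by the abstract's remark no correction step would be needed) on the box [0, 1]²; the VI solution is then the projection of q onto the box.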




Updated: 2021-07-19