Strong convergence of subgradient extragradient method with regularization for solving variational inequalities
Optimization and Engineering (IF 2.0) Pub Date: 2020-07-24, DOI: 10.1007/s11081-020-09540-9
Dang Van Hieu, Pham Ky Anh, Le Dung Muu

The paper concerns two numerical methods for approximating solutions of a monotone and Lipschitz continuous variational inequality problem in a Hilbert space. We describe how to incorporate regularization terms into the projection method, and then establish the strong convergence of the resulting methods under certain conditions imposed on the regularization parameters. The new methods work both with and without prior knowledge of the Lipschitz constant of the cost operator. The regularization is used mainly to obtain strong convergence of the methods, in contrast to the known hybrid projection or viscosity-type methods. The effectiveness of the new methods over existing ones is also illustrated by several numerical experiments.
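For orientation, the sketch below shows one common way of combining a subgradient extragradient step with a Tikhonov-type regularization term, replacing the operator F by F + alpha_k * I with alpha_k -> 0. The feasible set, operator, step size, and parameter schedule are illustrative assumptions, not the exact algorithms or convergence conditions analyzed in the paper.

import numpy as np

# A minimal sketch of a subgradient extragradient scheme with a Tikhonov-type
# regularization term alpha_k * x. The feasible set C (unit ball), the affine
# monotone operator F, the step size lam, and the schedule alpha_k are
# illustrative assumptions, not the settings of the paper.

def project_ball(x, radius=1.0):
    """Projection onto the closed ball of the given radius centered at 0."""
    norm = np.linalg.norm(x)
    return x if norm <= radius else (radius / norm) * x

def project_halfspace(x, a, b):
    """Projection onto the half-space {w : <a, w> <= b} (requires a != 0)."""
    viol = a @ x - b
    return x if viol <= 0 else x - (viol / (a @ a)) * a

def subgradient_extragradient_reg(F, x0, lam, alpha, project_C, iters=500):
    """y_k      = P_C(x_k - lam * F_k(x_k)),  where F_k = F + alpha_k * I,
       T_k      = {w : <x_k - lam * F_k(x_k) - y_k, w - y_k> <= 0},
       x_{k+1}  = P_{T_k}(x_k - lam * F_k(y_k))."""
    x = x0.astype(float).copy()
    for k in range(iters):
        Fk = lambda z: F(z) + alpha(k) * z      # regularized operator
        y = project_C(x - lam * Fk(x))
        a = (x - lam * Fk(x)) - y               # outer normal of T_k at y
        d = x - lam * Fk(y)                     # point to be projected onto T_k
        x = d if np.allclose(a, 0.0) else project_halfspace(d, a, a @ y)
    return x

# Toy monotone operator F(x) = M x + q on R^2 (symmetric part of M is positive definite).
M = np.array([[1.0, 1.0],
              [-1.0, 1.0]])
q = np.array([-1.0, 0.5])
F = lambda x: M @ x + q

x_star = subgradient_extragradient_reg(
    F, x0=np.array([2.0, -2.0]), lam=0.2,
    alpha=lambda k: 1.0 / (k + 2),              # alpha_k -> 0 slowly
    project_C=project_ball)
print(x_star)

The half-space projection onto T_k is what distinguishes the subgradient extragradient method from the classical extragradient method, which would project back onto C itself; the vanishing regularization is the ingredient typically used to upgrade weak to strong convergence.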




Updated: 2020-07-24