Variance-Based Subgradient Extragradient Method for Stochastic Variational Inequality Problems
Journal of Scientific Computing (IF 2.5) | Pub Date: 2021-08-18 | DOI: 10.1007/s10915-021-01603-y
Zhen-Ping Yang, Jin Zhang, Yuliang Wang, Gui-Hua Lin

In this paper, we propose a variance-based subgradient extragradient algorithm with line search for stochastic variational inequality problems, aiming at robustness with respect to an unknown Lipschitz constant. The algorithm may be regarded as an integration of a subgradient extragradient algorithm for deterministic variational inequality problems with a stochastic approximation method for expected values. At each iteration, unlike conventional variance-based extragradient algorithms, which project onto the feasible set twice, our algorithm replaces the second projection with a subgradient projection that can be computed explicitly. Since only one projection onto the feasible set is required per iteration, the computational load may be reduced. We discuss the asymptotic convergence, the sublinear convergence rate in terms of the mean natural residual function, and the optimal oracle complexity of the proposed algorithm. Furthermore, we establish a linear convergence rate under a finite computational budget under both the strong Minty variational inequality condition and the error bound condition. Preliminary numerical experiments indicate that the proposed algorithm is competitive with some existing methods.
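To illustrate the single-projection structure described in the abstract, the following is a minimal sketch of a deterministic subgradient extragradient iteration, in the spirit of Censor-Gibali-Reich: the first step projects onto the feasible set, while the second step projects only onto an explicitly computable half-space containing the feasible set. This is not the paper's algorithm: the variance-based stochastic version additionally uses mini-batch sampled operators, variance reduction, and a line search for the step size, all omitted here. The operator `F`, the ball-shaped feasible set, the step size `tau`, and the iteration count are illustrative assumptions.

```python
import numpy as np

def project_ball(z, r=1.0):
    """Euclidean projection onto the ball {w : ||w|| <= r}."""
    n = np.linalg.norm(z)
    return z if n <= r else z * (r / n)

def project_halfspace(z, a, b):
    """Euclidean projection onto the half-space {w : a.w <= b} (explicit formula)."""
    viol = a @ z - b
    if viol <= 0:
        return z
    return z - (viol / (a @ a)) * a

def subgrad_extragradient(F, proj_C, x0, tau=0.1, iters=500):
    """Deterministic subgradient extragradient sketch: one projection onto C
    per iteration; the second projection is onto a half-space T_k ⊇ C."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = F(x)
        y = proj_C(x - tau * g)      # the only projection onto the feasible set
        a = (x - tau * g) - y        # normal of the half-space T_k = {w : a.(w - y) <= 0}
        b = a @ y
        x = project_halfspace(x - tau * F(y), a, b)  # explicit, no second projection onto C
    return x
```

For a strongly monotone operator F(x) = x - t with the unit ball as feasible set, the VI solution is the projection of t onto the ball, and the iteration recovers it to high accuracy.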




Updated: 2021-08-19