A Stochastic Variance Reduced Primal Dual Fixed Point Method for Linearly Constrained Separable Optimization
SIAM Journal on Imaging Sciences (IF 2.1). Pub Date: 2021-09-13. DOI: 10.1137/20m1354398
Ya-Nan Zhu , Xiaoqun Zhang

SIAM Journal on Imaging Sciences, Volume 14, Issue 3, pp. 1326-1353, January 2021.
In this paper we combine the stochastic variance reduced gradient (SVRG) method [R. Johnson and T. Zhang, in Advances in Neural Information Processing Systems 26, 2013, pp. 315--323] with the primal dual fixed point (PDFP) method proposed in [P. Chen, J. Huang, and X. Zhang, Inverse Problems, 29 (2013)] to minimize the sum of two convex functions, one of which is composed with a linear operator. Problems of this type arise frequently in sparse signal and image reconstruction. The proposed SVRG-PDFP can be seen as a generalization of Prox-SVRG [L. Xiao and T. Zhang, SIAM J. Optim., 24 (2014), pp. 2057--2075], which was originally designed to minimize a sum of two convex functions. Under standard assumptions, we propose two variants, one for strongly convex objective functions and one for the general convex case. Convergence analysis shows that the convergence rate of SVRG-PDFP is $\mathcal{O}(\frac{1}{k})$ (here $k$ is the iteration number) for general convex objective functions and linear for the strongly convex case. Numerical examples on machine learning and computerized tomography image reconstruction are provided to show the effectiveness of the algorithms.
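To make the combination concrete, below is a minimal Python sketch of an SVRG-style primal-dual fixed-point loop on a toy fused-lasso problem $\min_x \frac{1}{n}\sum_i \frac{1}{2}(a_i^T x - b_i)^2 + \mu\|Bx\|_1$. The update layout (stochastic gradient step, dual prox residual, primal correction) follows the general PDFP pattern, but the function svrg_pdfp_sketch, the step sizes gamma and lam, and the toy data are illustrative assumptions, not the authors' exact iteration or tuning.

```python
import numpy as np

def svrg_pdfp_sketch(A, b, B, mu, gamma, lam, epochs=30, seed=0):
    """Illustrative SVRG + primal-dual fixed-point loop for
        min_x (1/n) * sum_i 0.5*(a_i^T x - b_i)^2 + mu * ||B x||_1.
    The SVRG estimator replaces the full gradient inside a PDFP-style
    update; this is a sketch, not the paper's exact scheme."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    v = np.zeros(B.shape[0])                 # dual variable for the ||Bx||_1 term
    for _ in range(epochs):
        x_tilde = x.copy()                   # SVRG snapshot point
        full_grad = A.T @ (A @ x_tilde - b) / n   # full gradient at the snapshot
        for _ in range(n):                   # inner loop: one pass of stochastic steps
            i = rng.integers(n)
            # variance-reduced estimator: grad_i(x) - grad_i(x_tilde) + full_grad
            g = (A[i] * (A[i] @ x - b[i])
                 - A[i] * (A[i] @ x_tilde - b[i]) + full_grad)
            x_half = x - gamma * g           # stochastic gradient step
            # dual step: residual of soft-thresholding, i.e. y - prox_{t f2}(y)
            y = B @ x_half + v - lam * (B @ (B.T @ v))
            t = gamma * mu / lam
            v = np.clip(y, -t, t)
            x = x_half - lam * (B.T @ v)     # primal correction
    return x

# Toy usage: 1-D fused-lasso style recovery with a forward-difference operator B.
rng = np.random.default_rng(1)
n, d = 200, 50
signal = np.cumsum(rng.standard_normal(d) * (rng.random(d) < 0.1))
A = rng.standard_normal((n, d))
b = A @ signal + 0.01 * rng.standard_normal(n)
B = np.eye(d, k=1)[:-1] - np.eye(d)[:-1]     # forward differences, shape (d-1, d)
L = np.linalg.norm(A, 2) ** 2 / n            # Lipschitz constant of the smooth part
x_hat = svrg_pdfp_sketch(A, b, B, mu=0.1,
                         gamma=1.0 / L,
                         lam=1.0 / np.linalg.norm(B @ B.T, 2))
```

The clip step uses the Moreau decomposition: for $f_2 = \mu\|\cdot\|_1$, the residual $y - \mathrm{prox}_{t f_2}(y)$ of soft-thresholding is a componentwise projection onto $[-t\mu, t\mu]$, which is how the dual variable stays bounded. The step-size choices here only respect the usual conditions $\gamma < 2/L$ and $\lambda \le 1/\rho(BB^T)$ and are not taken from the paper.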

