Variable Smoothing for Convex Optimization Problems Using Stochastic Gradients
Journal of Scientific Computing (IF 2.5), Pub Date: 2020-10-22, DOI: 10.1007/s10915-020-01332-8
Radu Ioan Boţ , Axel Böhm

We aim to solve a structured convex optimization problem in which a nonsmooth function is composed with a linear operator. When full splitting schemes are desired, primal–dual type methods are usually employed, as they are effective and well studied. However, under the additional assumption that the nonsmooth function composed with the linear operator is Lipschitz continuous, we can derive novel algorithms by regularizing it via the Moreau envelope. Furthermore, we tackle large-scale problems by means of stochastic oracle calls, very similar to stochastic gradient techniques. Applications to total variation denoising and deblurring, and to matrix factorization, are provided.
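To make the idea concrete, here is a minimal sketch (not the authors' algorithm) of Moreau-envelope smoothing with a decreasing smoothing parameter combined with stochastic gradient steps, assuming the nonsmooth term is the l1 norm composed with a linear operator A and assuming O(1/sqrt(k)) schedules for both the smoothing parameter and the step size; the names prox_l1, variable_smoothing_sgd, and stoch_grad_f, as well as the schedule constants, are illustrative choices.

```python
import numpy as np

def prox_l1(y, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)

def variable_smoothing_sgd(A, stoch_grad_f, x0, n_iters=1000, c_mu=1.0, c_step=1.0, seed=0):
    """Sketch: minimize E[f(x; xi)] + ||A x||_1 by replacing the nonsmooth term
    with its Moreau envelope, whose parameter mu_k shrinks along the iterations,
    and taking stochastic gradient steps on the resulting smooth surrogate."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for k in range(1, n_iters + 1):
        mu_k = c_mu / np.sqrt(k)       # decreasing smoothing parameter (assumed schedule)
        gamma_k = c_step / np.sqrt(k)  # step size (assumed schedule)
        Ax = A @ x
        # Gradient of the Moreau envelope of ||.||_1 (parameter mu_k) composed with A:
        # grad (g_mu o A)(x) = A^T (Ax - prox_{mu_k g}(Ax)) / mu_k
        grad_smooth = A.T @ ((Ax - prox_l1(Ax, mu_k)) / mu_k)
        # Stochastic oracle for the smooth data term plus smoothed-penalty gradient
        x -= gamma_k * (stoch_grad_f(x, rng) + grad_smooth)
    return x
```

As a usage example under the same assumptions, for 1-D total variation denoising A would be a finite-difference matrix and stoch_grad_f(x, rng) could return a minibatch gradient of the least-squares data term (1/2)||x - b||^2.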




Updated: 2020-10-27