Variable smoothing incremental aggregated gradient method for nonsmooth nonconvex regularized optimization
Optimization Letters (IF 1.3) Pub Date: 2021-04-07, DOI: 10.1007/s11590-021-01723-2
Yuncheng Liu , Fuquan Xia

In this paper, we focus on the problem of minimizing the sum of nonconvex smooth component functions and a nonsmooth weakly convex function composed with a linear operator. One specific application is logistic regression with weakly convex regularizers, which induce better sparsity than the standard convex regularizers. Based on the Moreau envelope with a decreasing sequence of smoothing parameters, combined with the incremental aggregated gradient method, we propose a variable smoothing incremental aggregated gradient (VS-IAG) algorithm. We also prove a complexity bound of \({\mathcal {O}}(\epsilon ^{-3})\) for reaching an \(\epsilon \)-approximate solution.
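To make the two ingredients concrete, the sketch below combines (i) smoothing a nonsmooth regularizer via the gradient of its Moreau envelope, \(\nabla g_\mu(x) = (x - \mathrm{prox}_{\mu g}(x))/\mu\), with a decreasing parameter \(\mu_k\), and (ii) an incremental aggregated gradient update that refreshes one stored component gradient per iteration. This is an illustrative toy, not the paper's algorithm: it uses the convex \(\ell_1\) norm (whose prox is soft-thresholding) instead of a weakly convex regularizer, omits the linear operator, and the step size, schedule \(\mu_k = k^{-1/3}\), and cyclic component order are all assumptions for demonstration.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def moreau_grad(x, mu):
    # Gradient of the Moreau envelope of ||.||_1 with parameter mu:
    # grad g_mu(x) = (x - prox_{mu g}(x)) / mu.
    return (x - soft_threshold(x, mu)) / mu

def vs_iag(grads, x0, n_iter=500, lr=0.01, lam=0.1):
    """Toy variable-smoothing incremental aggregated gradient loop.

    grads : list of gradient oracles for the smooth components f_i
    lam   : weight on the (smoothed) l1 regularizer -- an assumption
            standing in for the paper's weakly convex term.
    """
    n = len(grads)
    x = x0.copy()
    table = [g(x) for g in grads]   # stored gradient of each component
    agg = sum(table)                # running aggregate of the table
    for k in range(1, n_iter + 1):
        mu = k ** (-1.0 / 3.0)      # decreasing smoothing parameter
        i = k % n                   # cyclic component selection
        g_new = grads[i](x)         # refresh only component i
        agg += g_new - table[i]
        table[i] = g_new
        # Step along the aggregated smooth gradient plus the
        # gradient of the smoothed regularizer.
        x -= lr * (agg / n + lam * moreau_grad(x, mu))
    return x

# Usage: least-squares components f_i(x) = 0.5 (a_i^T x - b_i)^2.
rng = np.random.default_rng(0)
A = rng.normal(size=(10, 5))
b = rng.normal(size=10)
grads = [lambda x, a=A[j], bj=b[j]: (a @ x - bj) * a for j in range(10)]
x_star = vs_iag(grads, np.zeros(5))
```

Note that the aggregate `agg` mixes gradients evaluated at past iterates, which is exactly the delayed-information structure the IAG analysis must control; the decreasing `mu` trades smoothness of the surrogate against approximation error, which is what drives the \({\mathcal {O}}(\epsilon^{-3})\)-type rates for variable smoothing schemes.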




Updated: 2021-04-08