Accelerated iterative hard thresholding algorithm for \(l_0\) regularized regression problem
Journal of Global Optimization ( IF 1.3 ) Pub Date : 2019-08-29 , DOI: 10.1007/s10898-019-00826-6
Fan Wu , Wei Bian

In this paper, we propose an accelerated iterative hard thresholding algorithm for solving the \(l_0\) regularized box-constrained regression problem. We show that there exists a threshold such that, if the extrapolation coefficients are chosen below it, then after finitely many iterations the proposed algorithm is equivalent to the accelerated proximal gradient algorithm applied to a corresponding constrained convex problem. Under suitable conditions, we prove that the sequence generated by the proposed algorithm converges to a local minimizer of the \(l_0\) regularized problem that satisfies a desired lower bound. Moreover, when the data fitting function satisfies an error bound condition, we prove that both the iterate sequence and the corresponding sequence of objective function values converge R-linearly. Finally, we present several numerical experiments that verify our theoretical results.
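To make the scheme concrete, the following is a minimal sketch of an iterative hard thresholding iteration with extrapolation, in the spirit of the algorithm described above. It is not the authors' exact method: the fixed extrapolation coefficient `beta`, the least-squares data fitting term, the symmetric box \([-1,1]^n\), and the stopping rule are all illustrative assumptions; the paper instead requires the extrapolation coefficients to stay below a problem-dependent threshold.

```python
import numpy as np

def accelerated_iht(A, b, lam, lower=-1.0, upper=1.0,
                    beta=0.3, max_iter=500, tol=1e-8):
    """Sketch: min 0.5*||Ax - b||^2 + lam*||x||_0  s.t.  lower <= x <= upper.

    beta is a fixed extrapolation coefficient chosen for illustration;
    the paper's analysis requires it below a problem-dependent threshold.
    """
    n = A.shape[1]
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    step = 1.0 / L
    thresh = np.sqrt(2.0 * lam * step)       # hard-threshold level for step 1/L
    x_prev = np.zeros(n)
    x = np.zeros(n)
    for _ in range(max_iter):
        y = x + beta * (x - x_prev)                    # extrapolation step
        z = y - step * (A.T @ (A @ y - b))             # gradient step on the fit term
        z = np.clip(z, lower, upper)                   # projection onto the box
        x_new = np.where(np.abs(z) > thresh, z, 0.0)   # hard thresholding (l_0 prox)
        x_prev, x = x, x_new
        if np.linalg.norm(x - x_prev) <= tol:          # simple stopping rule
            break
    return x
```

For a box containing the origin, projecting onto the box and then hard-thresholding coincides coordinate-wise with the exact proximal step of the \(l_0\) term plus box indicator whenever the gradient-step point already lies in the box; the sketch uses this simplified order for readability.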




Updated: 2020-04-21