Sparse Convex Optimization via Adaptively Regularized Hard Thresholding
arXiv - CS - Data Structures and Algorithms. Pub Date: 2020-06-25, DOI: arXiv:2006.14571
Kyriakos Axiotis and Maxim Sviridenko

The goal of Sparse Convex Optimization is to optimize a convex function $f$ under a sparsity constraint $s\leq s^*\gamma$, where $s^*$ is the target number of non-zero entries in a feasible solution (sparsity) and $\gamma\geq 1$ is an approximation factor. There has been a lot of work to analyze the sparsity guarantees of various algorithms (LASSO, Orthogonal Matching Pursuit (OMP), Iterative Hard Thresholding (IHT)) in terms of the Restricted Condition Number $\kappa$. The best known algorithms guarantee to find an approximate solution of value $f(x^*)+\epsilon$ with the sparsity bound of $\gamma = O\left(\kappa\min\left\{\log \frac{f(x^0)-f(x^*)}{\epsilon}, \kappa\right\}\right)$, where $x^*$ is the target solution. We present a new Adaptively Regularized Hard Thresholding (ARHT) algorithm that makes significant progress on this problem by bringing the bound down to $\gamma=O(\kappa)$, which has been shown to be tight for a general class of algorithms including LASSO, OMP, and IHT. This is achieved without significant sacrifice in the runtime efficiency compared to the fastest known algorithms. We also provide a new analysis of OMP with Replacement (OMPR) for general $f$, under the condition $s > s^* \frac{\kappa^2}{4}$, which yields Compressed Sensing bounds under the Restricted Isometry Property (RIP). When compared to other Compressed Sensing approaches, it has the advantage of providing a strong tradeoff between the RIP condition and the solution sparsity, while working for any general function $f$ that meets the RIP condition.
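To make the setting concrete, here is a minimal sketch of plain Iterative Hard Thresholding (IHT), one of the baseline algorithms the abstract compares against (not the ARHT algorithm itself), applied to the least-squares objective $f(x) = \frac{1}{2}\|Ax-b\|^2$. Each step takes a gradient step and then projects onto the set of $s$-sparse vectors; the problem sizes and step-size rule below are illustrative choices, not from the paper.

```python
import numpy as np

def hard_threshold(x, s):
    # Projection onto s-sparse vectors: keep the s largest-magnitude
    # entries of x and zero out the rest.
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-s:]
    out[idx] = x[idx]
    return out

def iht(A, b, s, step=None, iters=300):
    # Iterative Hard Thresholding for f(x) = 0.5 * ||Ax - b||^2:
    #   x <- H_s(x - step * A^T (A x - b))
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L, a conservative step size
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)
        x = hard_threshold(x - step * grad, s)
    return x

# Recover a 3-sparse vector from noiseless Gaussian measurements
# (a regime where the RIP holds with high probability).
rng = np.random.default_rng(0)
A = rng.standard_normal((80, 200)) / np.sqrt(80)
x_true = np.zeros(200)
x_true[[5, 50, 120]] = [1.0, -2.0, 0.5]
b = A @ x_true
x_hat = iht(A, b, s=3)
print(np.linalg.norm(x_hat - x_true))
```

The restricted condition number $\kappa$ of the abstract governs how hard this projection step can fail: when $\kappa$ is large, IHT needs the relaxed sparsity budget $s \leq s^*\gamma$ to guarantee progress, and the paper's contribution is reducing the required $\gamma$ to $O(\kappa)$.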

Updated: 2020-06-26