Discrete optimization methods for group model selection in compressed sensing
Mathematical Programming ( IF 2.2 ) Pub Date : 2020-06-13 , DOI: 10.1007/s10107-020-01529-7
Bubacarr Bah , Jannis Kurtz , Oliver Schaudt

In this article we study the problem of signal recovery for group models. More precisely, given a set of groups, each containing a small subset of indices, and given linear sketches of a true signal vector that is group-sparse, in the sense that its support is contained in the union of a small number of these groups, we study algorithms that recover the true signal from knowledge of its linear sketches alone. We derive model projection complexity results and algorithms for more general group models than the state of the art. We consider two versions of the classical Iterative Hard Thresholding (IHT) algorithm: the classical version iteratively computes the exact projection of a vector onto the group model, while the approximate version (AM-IHT) iteratively uses a head- and a tail-approximation. We apply both variants to group models and analyse the two cases in which the sensing matrix is a Gaussian matrix or a model expander matrix. To solve the exact projection problem on the group model, which is known to be equivalent to the maximum weight coverage problem, we use discrete optimization methods based on dynamic programming and Benders decomposition. The head- and tail-approximations are derived by a classical greedy method and by LP rounding, respectively.
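To make the IHT scheme concrete, the following is a minimal sketch of group-sparse IHT in Python. For readability, the exact model projection (which the paper solves via dynamic programming and Benders decomposition) is replaced here by a simple greedy coverage heuristic, in the spirit of the greedy head-approximation; the function names and the step-size-free gradient step are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def greedy_group_projection(x, groups, g):
    """Greedy approximation of the projection of x onto the group model:
    repeatedly pick the group covering the most remaining squared mass.
    Illustrative stand-in for the exact (maximum weight coverage) projection."""
    covered = set()
    for _ in range(g):
        best, best_gain = None, 0.0
        for gi, grp in enumerate(groups):
            gain = sum(x[i] ** 2 for i in grp if i not in covered)
            if gain > best_gain:
                best, best_gain = gi, gain
        if best is None:  # no group adds mass; stop early
            break
        covered |= set(groups[best])
    proj = np.zeros_like(x)
    idx = sorted(covered)
    proj[idx] = x[idx]
    return proj

def group_iht(A, y, groups, g, iters=100):
    """IHT for group models: gradient step on ||y - Ax||^2,
    followed by (approximate) projection onto the group model."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = greedy_group_projection(x + A.T @ (y - A @ x), groups, g)
    return x
```

With a sensing matrix such as `A = np.random.randn(m, n) / np.sqrt(m)`, the same loop corresponds to the Gaussian case analysed in the paper, with recovery guarantees holding under the appropriate (model-)RIP conditions.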
