Linear Regression without Correspondences via Concave Minimization
arXiv - CS - Machine Learning Pub Date : 2020-03-17 , DOI: arxiv-2003.07706
Liangzu Peng and Manolis C. Tsakiris

Linear regression without correspondences concerns the recovery of a signal in the linear regression setting, where the correspondences between the observations and the linear functionals are unknown. The associated maximum likelihood objective is NP-hard to compute when the signal has dimension larger than one. To optimize this objective, we reformulate it as a concave minimization problem, which we solve via branch-and-bound. This approach is supported by a computable search space to branch on, an effective lower-bounding scheme via convex envelope minimization, and a refined upper bound, all arising naturally from the concave minimization reformulation. The resulting algorithm outperforms state-of-the-art methods for fully shuffled data and remains tractable for signals of dimension up to $8$, a regime untouched in prior work.
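To make the objective concrete: the maximum-likelihood estimate seeks the permutation $\Pi$ and signal $x$ minimizing $\|\Pi y - A x\|^2$. The sketch below (an illustration of the problem setup, not the paper's branch-and-bound algorithm) brute-forces all $n!$ permutations for a tiny noiseless instance, solving a least-squares problem for each; it is this exponential search that the paper's concave minimization reformulation avoids.

```python
# Shuffled linear regression: min over permutations P and signals x
# of || P y - A x ||^2.  Brute-force sketch for tiny n (illustrative
# only; the paper's method scales via branch-and-bound instead).
import itertools
import numpy as np

rng = np.random.default_rng(0)
n, d = 6, 2                       # 6 observations, 2-dimensional signal
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
y = rng.permutation(A @ x_true)   # observations with correspondences lost

best_cost, best_x = np.inf, None
for perm in itertools.permutations(range(n)):
    # Least-squares fit for this candidate unshuffling of y.
    x_hat = np.linalg.lstsq(A, y[list(perm)], rcond=None)[0]
    cost = np.sum((A @ x_hat - y[list(perm)]) ** 2)
    if cost < best_cost:
        best_cost, best_x = cost, x_hat

print(best_cost)  # essentially 0 in this noiseless instance
```

In the noiseless, generic case the zero-cost permutation recovers `x_true` exactly; the cost of the brute force is $O(n!)$ least-squares solves, which is why even $d = 8$ was out of reach for exhaustive approaches.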

Updated: 2020-09-15