Sparse linear regression -- CLuP achieves the ideal \emph{exact} ML
arXiv - CS - Information Theory. Pub Date: 2020-11-23, DOI: arXiv:2011.11550
Mihailo Stojnic

In this paper we revisit one of the classical statistical problems, the so-called sparse maximum-likelihood (ML) linear regression. As a way of attacking this type of regression, we present a novel CLuP mechanism that relies in part on the \bl{\textbf{Random Duality Theory (RDT)}} based algorithmic machinery that we recently introduced in \cite{Stojnicclupint19,Stojnicclupcmpl19,Stojnicclupplt19,Stojniccluplargesc20,Stojniccluprephased20}. After the initial success that CLuP exhibited in MIMO ML detection \cite{Stojnicclupint19,Stojnicclupcmpl19,Stojnicclupplt19}, where it achieved the exact ML performance while maintaining excellent computational-complexity properties, one would naturally expect that a similar type of success can be achieved in other ML considerations. The results that we present here confirm that such an expectation is indeed reasonable. In particular, within the sparse regression context, the introduced CLuP mechanism indeed turns out to be able to \bl{\textbf{\emph{achieve the ideal ML performance}}}. Moreover, it can substantially outperform some of the most prominent earlier state-of-the-art algorithmic concepts, among them even the variants of the famous LASSO and SOCP from \cite{StojnicPrDepSocp10,StojnicGenLasso10,StojnicGenSocp10}. Also, our recent results in \cite{Stojniccluplargesc20,Stojniccluprephased20} showed that CLuP has excellent \bl{\textbf{\emph{large-scale}}} and so-called \bl{\textbf{\emph{rephasing}}} abilities. Since such large-scale algorithmic features are possibly even more desirable within the sparse regression context, we here also demonstrate that the basic CLuP ideas can be reformulated to enable solving, with relative ease, regression problems with \bl{\textbf{\emph{several thousands}}} of unknowns.
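The abstract does not spell out the CLuP iteration itself, so no CLuP implementation is attempted here. As a point of reference, the following is a minimal sketch of the sparse linear regression setup the paper considers (a noisy linear model $y = Ax_0 + v$ with $x_0$ sparse), together with one of the baselines it compares against: a standard LASSO solved by iterative soft-thresholding (ISTA). All dimensions, the noise level, and the regularization weight `lam` are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sparse regression instance: y = A x0 + v, with x0 k-sparse.
n, m, k = 200, 80, 10                      # unknowns, measurements, sparsity (assumed)
A = rng.standard_normal((m, n)) / np.sqrt(m)
x0 = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x0[support] = rng.standard_normal(k)
y = A @ x0 + 0.01 * rng.standard_normal(m)  # small Gaussian noise (assumed level)

def lasso_ista(A, y, lam=0.01, iters=500):
    """LASSO baseline via iterative soft-thresholding (ISTA), not CLuP.

    Minimizes 0.5*||A x - y||^2 + lam*||x||_1 by gradient steps on the
    smooth term followed by the soft-thresholding proximal step.
    """
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = A.T @ (A @ x - y)              # gradient of the quadratic term
        z = x - g / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return x

x_hat = lasso_ista(A, y)
err = np.linalg.norm(x_hat - x0) / np.linalg.norm(x0)
```

With far fewer measurements than unknowns ($m < n$), recovery is possible only because $x_0$ is sparse; the paper's claim is that CLuP pushes performance on such instances beyond what LASSO/SOCP-type relaxations achieve, all the way to the ideal exact ML.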

Updated: 2020-11-25