Sparse Group Lasso: Optimal Sample Complexity, Convergence Rate, and Statistical Inference
IEEE Transactions on Information Theory (IF 2.2), Pub Date: 2022-05-16, DOI: 10.1109/tit.2022.3175455
T. Tony Cai, Anru R. Zhang, Yuchen Zhou
We study sparse group Lasso for high-dimensional double sparse linear regression, where the parameter of interest is simultaneously element-wise and group-wise sparse. This problem is an important instance of the simultaneously structured model – an actively studied topic in statistics and machine learning. In the noiseless case, matching upper and lower bounds on sample complexity are established for the exact recovery of sparse vectors and for stable estimation of approximately sparse vectors, respectively. In the noisy case, upper and matching minimax lower bounds for estimation error are obtained. We also consider the debiased sparse group Lasso and investigate its asymptotic property for the purpose of statistical inference. Finally, numerical studies are provided to support the theoretical results.
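The estimator studied above minimizes a least-squares loss plus the sparse group Lasso penalty, $\lambda_1\|\beta\|_1 + \lambda_2\sum_g\|\beta_g\|_2$, which induces both element-wise and group-wise sparsity. The following is a minimal proximal-gradient sketch of this estimator, not the authors' implementation; the solver, step size, and penalty levels `lam1`, `lam2` are illustrative choices (the composition of the element-wise and group-wise shrinkages is the exact proximal operator of this penalty).

```python
import numpy as np

def soft_threshold(x, t):
    """Element-wise soft-thresholding operator."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def sgl_prox(beta, groups, lam1, lam2, step):
    """Proximal operator of lam1*||b||_1 + lam2*sum_g ||b_g||_2:
    element-wise soft-threshold, then group-wise shrinkage."""
    out = soft_threshold(beta, step * lam1)
    for g in groups:
        norm = np.linalg.norm(out[g])
        if norm > 0:
            out[g] *= max(0.0, 1.0 - step * lam2 / norm)
    return out

def sparse_group_lasso(X, y, groups, lam1, lam2, n_iter=500):
    """Proximal gradient descent for
    0.5/n * ||y - X b||_2^2 + lam1*||b||_1 + lam2*sum_g ||b_g||_2."""
    n, p = X.shape
    # Step size = 1 / Lipschitz constant of the smooth part.
    step = n / (np.linalg.norm(X, 2) ** 2)
    beta = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y) / n
        beta = sgl_prox(beta - step * grad, groups, lam1, lam2, step)
    return beta
```

On simulated data with a double sparse coefficient vector (a few nonzero entries concentrated in one group), the estimate recovers both the active group and the active coordinates within it, mirroring the simultaneous sparsity structure discussed in the abstract.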

Updated: 2024-08-26