Adaptive sparse group LASSO in quantile regression
Advances in Data Analysis and Classification (IF 1.6) Pub Date: 2020-07-29, DOI: 10.1007/s11634-020-00413-8
Alvaro Mendez-Civieta, M. Carmen Aguilera-Morillo, Rosa E. Lillo

This paper studies the introduction of the sparse group LASSO (SGL) to the quantile regression framework. Additionally, a more flexible version, an adaptive SGL, is proposed based on the adaptive idea, that is, the use of adaptive weights in the penalization. Work on adaptive estimators usually focuses on establishing the oracle property under asymptotic and double-asymptotic frameworks. A key step in the proof of this property is to base the adaptive weights on an initial \(\sqrt{n}\)-consistent estimator. In practice this requires a non-penalized estimator, which restricts adaptive solutions to low-dimensional scenarios. In this work, several solutions based on the dimension-reduction techniques PCA and PLS are studied for computing these weights in high-dimensional frameworks. The benefits of this proposal are demonstrated on both synthetic and real datasets.
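The PCA-based weight construction the abstract describes can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the helper name `pca_adaptive_weights` and all parameter choices are hypothetical, and an ordinary least-squares fit on the principal-component scores stands in for the quantile-regression initial fit used in the paper.

```python
import numpy as np

def pca_adaptive_weights(X, y, n_components=5, gamma=1.0, eps=1e-8):
    """Illustrative PCA-based adaptive weights (hypothetical helper).

    Idea from the abstract: in high dimensions a non-penalized initial
    estimator is unavailable, so fit an initial model on a few
    principal-component scores, map the coefficients back to the
    original predictors, and use their inverse magnitudes as adaptive
    penalty weights. Here OLS on the scores is an illustrative
    stand-in for the paper's initial quantile-regression fit.
    """
    Xc = X - X.mean(axis=0)
    # PCA via SVD: rows of Vt are the principal-axis loadings.
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:n_components].T              # (p, k) loading matrix
    T = Xc @ V                           # (n, k) component scores
    # Initial fit in the low-dimensional score space.
    beta_scores, *_ = np.linalg.lstsq(T, y - y.mean(), rcond=None)
    # Map the fit back to the original p-dimensional coefficient space.
    beta_init = V @ beta_scores          # (p,)
    # Adaptive weights: small initial coefficients get a large penalty.
    return 1.0 / (np.abs(beta_init) + eps) ** gamma
```

The resulting weight vector would then multiply the SGL penalty terms coordinate-wise (and, analogously, group-wise) inside the penalized quantile-regression objective.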




Updated: 2020-07-30