ConvexLAR: An Extension of Least Angle Regression
Journal of Computational and Graphical Statistics (IF 2.4), Pub Date: 2015-07-03, DOI: 10.1080/10618600.2014.962700
Wei Xiao, Yichao Wu, Hua Zhou

Least angle regression (LAR) was proposed by Efron, Hastie, Johnstone, and Tibshirani in 2004 for continuous model selection in linear regression. It is motivated by a geometric argument and traces a path along which the predictors enter the model successively while the active predictors always maintain the same absolute correlation (angle) with the residual vector. Although LAR quickly gained popularity, its extensions are rare compared with penalty methods. In this expository article, we show that the powerful geometric idea of LAR can be generalized in a fruitful way. We propose a ConvexLAR algorithm that works for any convex loss function and naturally extends to group selection and data-adaptive variable selection. After a simple modification, it also yields new exact path algorithms for certain penalty methods, such as a convex loss function with a lasso or group lasso penalty. Variable selection in recurrent event and panel count data analysis, AdaBoost, and the Gaussian graphical model is reconsidered from the ConvexLAR angle. Supplementary materials for this article are available online.
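
To make the "same absolute correlation with the residual" property concrete, the following is a minimal illustrative sketch, not taken from the paper: it traces the ordinary LAR path with scikit-learn's lars_path on simulated data and checks that, at each knot, the active predictors tie in |X'r|. The simulated data, variable names, and the use of scikit-learn are assumptions made purely for demonstration.

    # Illustrative sketch only: the data and the choice of scikit-learn are assumptions.
    import numpy as np
    from sklearn.linear_model import lars_path

    rng = np.random.default_rng(0)
    n, p = 200, 6
    X = rng.standard_normal((n, p))
    X = (X - X.mean(axis=0)) / X.std(axis=0)      # standardize predictors
    beta_true = np.array([3.0, -2.0, 1.5, 0.0, 0.0, 0.0])
    y = X @ beta_true + rng.standard_normal(n)
    y = y - y.mean()                              # center the response

    # method="lar" traces the least angle regression path; each column of
    # coefs is the coefficient vector at a knot where a new predictor enters.
    alphas, active, coefs = lars_path(X, y, method="lar")

    for k in range(1, coefs.shape[1]):
        resid = y - X @ coefs[:, k]
        corr = np.abs(X.T @ resid)                # |correlation| up to a common scale
        act = active[:k]                          # predictors that have entered so far
        print(f"step {k}: active {act}, |X'r| on active = {np.round(corr[act], 4)}")
        # At each knot the active predictors share the same |X'r|, i.e. the same
        # angle with the residual vector, which is the geometric property above.

Running the sketch prints, at every step, identical absolute correlations across the active set; ConvexLAR generalizes exactly this equiangular condition from squared-error loss to an arbitrary convex loss.
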

Updated: 2015-07-03