Group descent algorithms for nonconvex penalized linear and logistic regression models with grouped predictors.
Statistics and Computing (IF 1.6), Pub Date: 2013-11-05, DOI: 10.1007/s11222-013-9424-2
Patrick Breheny, Jian Huang

Penalized regression is an attractive framework for variable selection problems. Often, variables possess a grouping structure, and the relevant selection problem is that of selecting groups, not individual variables. The group lasso has been proposed as a way of extending the ideas of the lasso to the problem of group selection. Nonconvex penalties such as SCAD and MCP have been proposed and shown to have several advantages over the lasso; these penalties may also be extended to the group selection problem, giving rise to group SCAD and group MCP methods. Here, we describe algorithms for fitting these models stably and efficiently. In addition, we present simulation results and real data examples comparing and contrasting the statistical properties of these methods.
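To make the group descent idea concrete, the sketch below illustrates a single cycle of groupwise updates for linear regression with the group MCP penalty, assuming each group's columns have been orthonormalized so that the groupwise update has a closed form (apply firm thresholding to the norm of the unpenalized group solution and rescale its direction). This is a minimal NumPy illustration under those assumptions, not the authors' grpreg implementation; the function names, the fixed iteration count, and the default gamma = 3 are illustrative choices.

```python
import numpy as np

def firm_threshold(z, lam, gamma):
    """Scalar firm-thresholding operator associated with the MCP penalty.

    Returns gamma/(gamma-1) * soft(z, lam) when |z| <= gamma*lam, and z
    unchanged otherwise (MCP applies no shrinkage beyond gamma*lam).
    Requires gamma > 1.
    """
    if abs(z) <= gamma * lam:
        soft = np.sign(z) * max(abs(z) - lam, 0.0)
        return gamma / (gamma - 1.0) * soft
    return z

def group_mcp_descent(X, y, groups, lam, gamma=3.0, n_iter=100):
    """Illustrative group descent cycle for linear regression with group MCP.

    Assumes each group's block of X satisfies X_j' X_j / n = I, so the
    groupwise update is firm thresholding of the norm of the unpenalized
    group solution, keeping its direction. `groups` is a length-p array of
    group labels (a hypothetical interface for this sketch).
    """
    n, p = X.shape
    beta = np.zeros(p)
    r = y - X @ beta                      # current residuals
    for _ in range(n_iter):
        for g in np.unique(groups):
            idx = np.where(groups == g)[0]
            Xg = X[:, idx]
            # Unpenalized solution for group g, holding other groups fixed
            z = Xg.T @ r / n + beta[idx]
            z_norm = np.linalg.norm(z)
            if z_norm > 0:
                new_beta = firm_threshold(z_norm, lam, gamma) * z / z_norm
            else:
                new_beta = np.zeros_like(z)
            # Update residuals to reflect the change in this group
            r -= Xg @ (new_beta - beta[idx])
            beta[idx] = new_beta
    return beta
```

In this sketch the entire group is either zeroed out or retained, which is what makes the update a group selection step; a convergence check on successive coefficient changes would replace the fixed iteration count in practice.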
