Nonparametric regression with adaptive truncation via a convex hierarchical penalty
Biometrika (IF 2.4) Pub Date: 2018-12-13, DOI: 10.1093/biomet/asy056
Asad Haris, Ali Shojaie, Noah Simon

We consider the problem of nonparametric regression with a potentially large number of covariates. We propose a convex, penalized estimation framework that is particularly well suited to high-dimensional sparse additive models and combines the appealing features of finite basis representation and smoothing penalties. In the case of additive models, a finite basis representation provides a parsimonious representation for fitted functions but is not adaptive when component functions possess different levels of complexity. In contrast, a smoothing spline-type penalty on the component functions is adaptive but does not provide a parsimonious representation. Our proposal simultaneously achieves parsimony and adaptivity in a computationally efficient way. We demonstrate these properties through empirical studies and show that our estimator converges at the minimax rate for functions within a hierarchical class. We further establish minimax rates for a large class of sparse additive models. We also develop an efficient algorithm that scales, like the lasso, in both the number of covariates and the sample size.
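The convex hierarchical penalty described in the abstract can be read as a group lasso over nested "tail" groups of basis coefficients, which zeroes trailing coefficients and thereby truncates each basis expansion adaptively. The following is a minimal single-covariate sketch under that assumption; the function names (hier_prox, fit_hier_basis), the weights, and the polynomial basis are illustrative choices, not the authors' implementation, whose exact penalty weights and algorithm may differ.

import numpy as np

def hier_prox(beta, thresholds):
    # Proximal operator of the assumed nested tail-group penalty
    #   Omega(beta) = sum_k w_k * ||beta[k:]||_2.
    # For chain-nested groups, the prox is the composition of group
    # soft-thresholding steps applied from the innermost tail outward
    # (Jenatton et al., 2011).
    b = beta.copy()
    for k in range(len(b) - 1, -1, -1):
        tail_norm = np.linalg.norm(b[k:])
        if tail_norm <= thresholds[k]:
            b[k:] = 0.0
        else:
            b[k:] *= 1.0 - thresholds[k] / tail_norm
    return b

def fit_hier_basis(X, y, lam, n_iter=500):
    # Proximal gradient descent on
    #   (1/2n) * ||y - X @ beta||^2 + lam * Omega(beta).
    n, K = X.shape
    beta = np.zeros(K)
    step = n / np.linalg.norm(X, 2) ** 2   # 1 / Lipschitz constant of the gradient
    w = np.sqrt(np.arange(1, K + 1))       # illustrative weights only
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y) / n
        beta = hier_prox(beta - step * grad, step * lam * w)
    return beta

# Toy usage: trailing polynomial coefficients tend to be zeroed, i.e. the
# basis is truncated by the data rather than at a fixed order.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 200)
y = np.sin(2.0 * x) + 0.1 * rng.standard_normal(200)
K = 10
X = np.column_stack([x ** k for k in range(1, K + 1)])
beta_hat = fit_hier_basis(X, y, lam=0.05)
print("nonzero coefficients:", np.flatnonzero(beta_hat))

Because the groups are nested, the proximal operator has this closed-form sequential structure, which is consistent with the abstract's claim that the algorithm's per-iteration cost scales like the lasso's.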
