Sparse and efficient estimation for partial spline models with increasing dimension
Annals of the Institute of Statistical Mathematics (IF 1) | Pub Date: 2013-12-15 | DOI: 10.1007/s10463-013-0440-y
Guang Cheng, Hao Helen Zhang, Zuofeng Shang

We consider model selection and estimation for partial spline models and propose a new regularization method in the context of smoothing splines. The regularization method has a simple yet elegant form, consisting of a roughness penalty on the nonparametric component and a shrinkage penalty on the parametric components, which achieves function smoothing and sparse estimation simultaneously. We establish the convergence rate and oracle properties of the estimator under weak regularity conditions. Remarkably, the estimated parametric components are sparse and efficient, and the nonparametric component is estimated at the optimal rate. The procedure also has attractive computational properties. Using the representer theorem for smoothing splines, we reformulate the objective function as a LASSO-type problem, which allows the LARS algorithm to compute the entire solution path. We then extend the procedure to settings where the number of predictors grows with the sample size and investigate its asymptotic properties in that regime. Finite-sample performance is illustrated by simulations.
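The doubly penalized estimator described above can be sketched numerically. The snippet below is a minimal illustration, not the authors' procedure: it simulates a partially linear model y = Xβ + f(t) + ε with a sparse β, then backfits between a lasso coordinate-descent step for β (the shrinkage penalty) and a ridge-penalized spline fit for f (standing in for the roughness penalty). The truncated-power basis, penalty levels, and backfitting loop are all illustrative assumptions; the paper's actual formulation uses the smoothing-spline representer basis and solves one LASSO-type problem via LARS.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated partially linear data: y = X @ beta + f(t) + noise,
# with a sparse beta (only the first two coefficients are nonzero).
n, p = 200, 6
X = rng.standard_normal((n, p))
t = np.sort(rng.uniform(0.0, 1.0, n))
beta_true = np.array([2.0, -1.5, 0.0, 0.0, 0.0, 0.0])
f_true = np.sin(2 * np.pi * t)
y = X @ beta_true + f_true + 0.1 * rng.standard_normal(n)

# Illustrative basis for the nonparametric part: a truncated-power
# cubic spline basis stands in for the representer basis of the paper.
knots = np.linspace(0.1, 0.9, 8)
B = np.column_stack([np.ones(n), t, t**2, t**3]
                    + [np.maximum(t - k, 0.0) ** 3 for k in knots])

def smooth(r, lam):
    """Ridge-penalized least-squares fit of residuals r on basis B
    (a crude proxy for the roughness-penalized smoothing step)."""
    G = B.T @ B + lam * np.eye(B.shape[1])
    return B @ np.linalg.solve(G, B.T @ r)

def soft(z, g):
    """Soft-thresholding operator used by lasso coordinate descent."""
    return np.sign(z) * np.maximum(np.abs(z) - g, 0.0)

# Backfitting: alternate a lasso step for beta with a smoothing step for f.
beta = np.zeros(p)
f_hat = np.zeros(n)
lam_f, lam_b = 1e-3, 0.05          # assumed penalty levels
col_ss = (X ** 2).sum(axis=0)
for _ in range(50):
    r = y - f_hat
    for j in range(p):             # coordinate descent for the lasso part
        r_j = r - X @ beta + X[:, j] * beta[j]
        beta[j] = soft(X[:, j] @ r_j, n * lam_b) / col_ss[j]
    f_hat = smooth(y - X @ beta, lam_f)

print(np.round(beta, 2))           # sparse: trailing coefficients shrunk to ~0
```

In line with the abstract, the shrinkage penalty zeroes out the irrelevant coefficients while the smoothing step recovers f; in the paper the two penalties are combined in one objective and the LASSO reformulation makes the whole path computable by LARS rather than by backfitting.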
