Penalized Nonparametric Scalar-on-Function Regression via Principal Coordinates
Journal of Computational and Graphical Statistics (IF 2.4). Pub Date: 2017-04-11. DOI: 10.1080/10618600.2016.1217227
Philip T. Reiss, David L. Miller, Pei-Shien Wu, Wen-Yu Hua

Abstract: A number of classical approaches to nonparametric regression have recently been extended to the case of functional predictors. This article introduces a new method of this type, which extends intermediate-rank penalized smoothing to scalar-on-function regression. In the proposed method, which we call principal coordinate ridge regression, one regresses the response on leading principal coordinates defined by a relevant distance among the functional predictors, while applying a ridge penalty. Our publicly available implementation, based on generalized additive modeling software, allows for fast optimal tuning parameter selection and for extensions to multiple functional predictors, exponential family-valued responses, and mixed-effects models. In an application to signature verification data, principal coordinate ridge regression, with dynamic time warping distance used to define the principal coordinates, is shown to outperform a functional generalized linear model. Supplementary materials for this article are available online.
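As a rough illustration of the idea described in the abstract, the sketch below computes principal coordinates from a pairwise distance matrix between curves (classical multidimensional scaling) and then ridge-regresses a scalar response on the leading coordinates. It is not the authors' publicly available implementation, which builds on generalized additive modeling software for tuning-parameter selection; the toy data, the plain L2 distance between curves, the helper `principal_coordinates`, the choice k = 10, and the cross-validated penalty grid are all illustrative assumptions.

```python
# Minimal sketch of principal coordinate ridge regression (illustrative only).
import numpy as np
from sklearn.linear_model import RidgeCV

def principal_coordinates(D, k):
    """Leading k principal coordinates (classical MDS) from an n x n distance matrix D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                # double-centered squared distances
    eigvals, eigvecs = np.linalg.eigh(B)
    order = np.argsort(eigvals)[::-1][:k]      # largest eigenvalues first
    lam = np.clip(eigvals[order], 0.0, None)   # guard against small negative eigenvalues
    return eigvecs[:, order] * np.sqrt(lam)    # n x k matrix of principal coordinates

# Toy functional predictors: n curves observed on a common grid (assumed data).
rng = np.random.default_rng(0)
n, grid = 100, 50
X = np.cumsum(rng.normal(size=(n, grid)), axis=1)   # random walks standing in for curves
y = X[:, -1] + rng.normal(scale=0.5, size=n)        # scalar response

# Any distance between curves can be plugged in here (the paper's application
# uses dynamic time warping); plain L2 distance keeps the sketch simple.
D = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2))

Z = principal_coordinates(D, k=10)                       # leading principal coordinates
model = RidgeCV(alphas=np.logspace(-3, 3, 25)).fit(Z, y) # ridge-penalized regression on them
print("selected penalty:", model.alpha_)
```

Note that this sketch covers only the in-sample fit: predicting for new curves would require mapping their distances to the training curves into the same coordinate space, and the authors' implementation selects the ridge tuning parameter through the generalized additive modeling machinery rather than the cross-validated grid used here.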

Updated: 2017-04-11