Model averaging-based sufficient dimension reduction
Stat ( IF 1.7 ) Pub Date : 2022-01-14 , DOI: 10.1002/sta4.458
Min Cai, Ruige Zhuang, Zhou Yu, Ping Wu
Sufficient dimension reduction aims to project high-dimensional predictors onto a low-dimensional space without loss of information about the responses. Classical methods, such as sliced inverse regression, sliced average variance estimation and directional regression, are the backbone of many modern sufficient dimension reduction methods and have attracted considerable research interest. However, the efficiency of such methods deteriorates when dealing with sparse models. Existing sparse sufficient dimension reduction methods in the literature rely on given models or strict sparsity assumptions. To relax these model and sparsity assumptions, in this paper we define a general least squares objective function that is applicable to the kernel matrices of all classical sufficient dimension reduction methods, and propose a Mallows model averaging-based sufficient dimension reduction method. Furthermore, an iterative least squares algorithm is used to obtain the sample estimates. Simulation results demonstrate the excellent performance of our method.
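As a point of reference for the classical estimators mentioned above, below is a minimal sketch of sliced inverse regression (SIR) in Python/NumPy. It only illustrates the standard construction of the SIR kernel matrix and its leading eigenvectors; the function name sir_directions and its parameters are illustrative, and this sketch does not implement the least squares formulation or the Mallows model averaging procedure proposed in the paper.

import numpy as np

def sir_directions(X, y, n_slices=10, n_dirs=2):
    """Minimal sliced inverse regression (SIR) sketch.

    Estimates central-subspace directions by eigen-decomposing the
    SIR kernel matrix M = Cov(E[Z | Y]) of the standardized predictors.
    """
    n, p = X.shape
    # Standardize predictors: Z = Sigma^{-1/2} (X - mean)
    Xc = X - X.mean(axis=0)
    Sigma = np.cov(Xc, rowvar=False)
    eigval, eigvec = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = eigvec @ np.diag(eigval ** -0.5) @ eigvec.T
    Z = Xc @ Sigma_inv_sqrt

    # Slice the response and average Z within each slice
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)   # SIR kernel matrix

    # Leading eigenvectors of M, mapped back to the original predictor scale
    w, v = np.linalg.eigh(M)
    B = Sigma_inv_sqrt @ v[:, ::-1][:, :n_dirs]
    return B

A simple usage example on simulated data (again purely illustrative):

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 6))
y = X[:, 0] + 0.5 * X[:, 1] ** 2 + 0.1 * rng.standard_normal(500)
B_hat = sir_directions(X, y, n_slices=10, n_dirs=2)   # estimated 6 x 2 basis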

Updated: 2022-01-14