Generalized kernel-based inverse regression methods for sufficient dimension reduction
Computational Statistics & Data Analysis ( IF 1.8 ) Pub Date : 2020-10-01 , DOI: 10.1016/j.csda.2020.106995
Chuanlong Xie , Lixing Zhu

Abstract: The linearity condition and the constant conditional variance assumption, popularly used in sufficient dimension reduction, are close to elliptical symmetry and normality, respectively. Their restrictiveness, however, has long been a concern. In this article, we give a systematic study to provide insight into why the widely used sliced inverse regression and sliced average variance estimation require these conditions. We then propose a new framework that relaxes these conditions and suggest generalized kernel-based inverse regression methods to handle a class of mixture multivariate unified skew-elliptical distributions.
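For readers unfamiliar with the baseline method the paper builds on, the following is a minimal sketch of classical sliced inverse regression (Li, 1991), not the authors' generalized kernel-based estimator. It illustrates the role of the linearity condition: with an elliptical (here Gaussian) design, the slice-wise means of the standardized predictors concentrate in the central subspace, so the leading eigenvector of their covariance recovers the dimension-reduction direction. All function and variable names are illustrative.

```python
import numpy as np

def sir_directions(X, y, n_slices=5, n_dirs=1):
    """Classical sliced inverse regression: estimate e.d.r. directions from
    the covariance of slice-wise means of the standardized predictors."""
    n, p = X.shape
    # Standardize predictors: Z = (X - mean) @ Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ inv_sqrt
    # Slice observations by y and average Z within each slice
    order = np.argsort(y)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original X scale
    w, v = np.linalg.eigh(M)
    B = inv_sqrt @ v[:, ::-1][:, :n_dirs]
    return B / np.linalg.norm(B, axis=0)

# Toy example: y depends on X only through the single index X @ beta
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 5))  # Gaussian design satisfies the linearity condition
beta = np.array([1.0, 1.0, 0.0, 0.0, 0.0]) / np.sqrt(2)
y = (X @ beta) ** 3 + 0.1 * rng.normal(size=2000)
b_hat = sir_directions(X, y, n_slices=10)[:, 0]
print(abs(float(b_hat @ beta)))  # near 1: the direction is recovered up to sign
```

When the design departs from elliptical symmetry (e.g. skewed mixtures), the slice means no longer stay in the central subspace, which is exactly the failure mode the paper's relaxed framework addresses.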

Updated: 2020-10-01