A brief review of linear sufficient dimension reduction through optimization
Journal of Statistical Planning and Inference (IF 0.8). Pub Date: 2021-03-01. DOI: 10.1016/j.jspi.2020.06.006
Yuexiao Dong

Abstract: In this paper, we review three families of methods in linear sufficient dimension reduction through optimization. Through minimization of general loss functions, we cast classical methods, such as ordinary least squares and sliced inverse regression, and modern methods, such as principal support vector machines and principal quantile regression, under a unified framework. We then review sufficient dimension reduction methods based on maximizing dependence measures, including the distance covariance, the Hilbert–Schmidt independence criterion, the martingale difference divergence, and the expected conditional difference. Finally, we provide an information-theoretic perspective for the third family of sufficient dimension reduction methods.
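To make the dependence-maximization family concrete: linear sufficient dimension reduction seeks a matrix B such that the response Y is independent of the predictor X given the projection B'X, and dependence-based methods estimate B by maximizing a measure of dependence between B'X and Y. The sketch below is a minimal illustration of this idea with the distance covariance, not the algorithm of the paper under review; the simulated single-index model, the sample size, and the Nelder-Mead optimizer are all illustrative assumptions.

```python
# A minimal sketch of dependence-maximization SDR (illustrative, not the
# paper's algorithm): recover one sufficient direction beta by maximizing
# the squared sample distance covariance between X @ beta and y.
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import pdist, squareform

def dcov2(u, v):
    """Biased (V-statistic) squared sample distance covariance of 1-d samples."""
    def doubly_centered(z):
        d = squareform(pdist(z.reshape(-1, 1)))  # pairwise |z_i - z_j|
        return d - d.mean(axis=0) - d.mean(axis=1)[:, None] + d.mean()
    return (doubly_centered(u) * doubly_centered(v)).mean()

# Illustrative single-index data: y depends on X only through X @ beta_true.
rng = np.random.default_rng(0)
n, p = 200, 5
X = rng.standard_normal((n, p))
beta_true = np.array([1.0, 1.0, 0.0, 0.0, 0.0]) / np.sqrt(2.0)
y = np.sin(X @ beta_true) + 0.1 * rng.standard_normal(n)

def neg_dcov(b):
    b = b / np.linalg.norm(b)  # restrict the search to the unit sphere
    return -dcov2(X @ b, y)

res = minimize(neg_dcov, rng.standard_normal(p), method="Nelder-Mead")
beta_hat = res.x / np.linalg.norm(res.x)
print("estimated direction (up to sign):", np.round(beta_hat, 2))
```

In practice the structural dimension may exceed one, so the methods surveyed in the review optimize over projection matrices rather than a single direction, and they replace the generic optimizer used here with estimators tailored to each dependence measure.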
