Computation of projection regression depth and its induced median
Computational Statistics & Data Analysis ( IF 1.5 ) Pub Date : 2021-01-27 , DOI: 10.1016/j.csda.2021.107184
Yijun Zuo

Notions of depth in regression have been introduced and studied in the literature. The most famous example is Regression Depth (RD), which is a direct extension of location depth to regression. The projection regression depth (PRD) is the extension of another prevailing location depth, the projection depth, to regression. The computation issues of the RD have been discussed in the literature. The computation issues of the PRD and its induced median (maximum depth estimator) in a regression setting, never considered before, are addressed here. For a given β ∈ R^p, exact algorithms for the PRD with cost O(n^2 log n) (p = 2) and O(N(n,p)(p^3 + n log n + n p^{1.5} + n p N_Iter)) (p > 2), and approximate algorithms for the PRD and its induced median with cost O(N_v n p) and O(R p N_β (p^2 + n N_v N_Iter)), respectively, are proposed. Here N(n,p) is a number defined based on the total number of (p−1)-dimensional hyperplanes formed by points induced from the sample points and the given β; N_v is the total number of unit directions v utilized; N_β is the total number of candidate regression parameters β employed; N_Iter is the total number of iterations carried out in an optimization algorithm; R is the total number of replications. Furthermore, as the second major contribution, three PRD-induced estimators, which can be computed up to 30 times faster than the PRD-induced median while maintaining a similar level of accuracy, are introduced. Examples and simulation studies reveal that the depth median induced from the PRD is favorable in terms of robustness and efficiency, compared to the maximum depth estimator induced from the RD, which is the current leading regression median.
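The O(N_v n p) cost of the approximate PRD algorithm reflects its structure: evaluate a projected unfitness along N_v sampled unit directions, each evaluation touching all n observations in p dimensions, take the supremum, and map it to a depth in (0, 1]. The sketch below illustrates that sample-and-maximize pattern only; the function name approx_prd, the median-of-ratios unfitness, and the crude MAD-based scale are illustrative assumptions and not necessarily the exact functional used in the paper.

```python
import numpy as np

def approx_prd(beta, X, y, n_dirs=500, rng=None):
    """Approximate projection regression depth of a candidate beta.

    Toy sketch of the direction-sampling idea behind the O(N_v * n * p)
    approximate algorithm: sample n_dirs unit directions v, evaluate a
    projected "unfitness" along each, take the supremum, and return
    1 / (1 + unfitness).  The unfitness below (a scaled median of
    residual-to-projection ratios) is a simplified stand-in.
    """
    rng = np.random.default_rng(rng)
    n, p = X.shape
    resid = y - X @ beta                                 # residuals for this beta
    scale = np.median(np.abs(y - np.median(y))) or 1.0   # crude scale of y

    worst = 0.0
    for _ in range(n_dirs):
        v = rng.standard_normal(p)
        v /= np.linalg.norm(v)                # random unit direction on the sphere
        proj = X @ v
        keep = np.abs(proj) > 1e-12           # avoid division by (near) zero
        if not keep.any():
            continue
        # univariate "unfitness" of beta along direction v
        uf_v = abs(np.median(resid[keep] / proj[keep])) / scale
        worst = max(worst, uf_v)

    return 1.0 / (1.0 + worst)                # larger depth = better fit


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 2))
    beta_true = np.array([1.0, -2.0])
    y = X @ beta_true + 0.1 * rng.standard_normal(200)
    print(approx_prd(beta_true, X, y))        # near-true beta: depth close to 1
    print(approx_prd(np.zeros(2), X, y))      # poor beta: noticeably lower depth
```

The PRD-induced median would then be obtained by maximizing such a depth value over N_β candidate β's (with an inner optimization of N_Iter steps), which is where the O(R p N_β (p^2 + n N_v N_Iter)) cost in the abstract comes from.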



Updated: 2021-02-08