Limiting behavior of derivative approximation techniques as the number of points tends to infinity on a fixed interval in R
Journal of Computational and Applied Mathematics (IF 2.4), Pub Date: 2020-10-02, DOI: 10.1016/j.cam.2020.113218
Phillip Braun, Warren Hare, Gabriel Jarry-Bolduc

We consider the question of numerically approximating the derivative of a smooth function using only function evaluations. In particular, we examine the regression gradient, the generalized simplex gradient, and the generalized centered simplex gradient: three numerical techniques based on using function values at a collection of sample points to construct 'best-fit' linear models. Under some conditions, these gradient approximations have error bounds dependent on the number of sample points used, the Lipschitz constant of the true gradient, and the geometry of the sample set. Perhaps counter-intuitively, as the number of sample points increases (to infinity) on a fixed domain, the error bounds can increase (to infinity). In this work, we first explore the behavior of the error bound for generalized simplex gradients of a single-variable function (f : ℝ → ℝ). Thereafter, we investigate the behavior of the absolute error for these three gradient approximation techniques as the number of sample points tends to infinity. Under reasonable assumptions, we prove that the absolute errors remain bounded as the number of sample points increases to infinity on a fixed interval.
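To make the three techniques concrete, the following is a minimal sketch of how such 'best-fit' linear-model gradients can be computed with NumPy. The function names and the exact sample-set conventions (rows of `S` as direction vectors, rows of `X` as sample points) are illustrative assumptions, not the paper's notation; each approximation reduces to a least-squares solve.

```python
import numpy as np

def simplex_gradient(f, x0, S):
    """Generalized simplex gradient (illustrative sketch): solve the
    best-fit linear model grad . s_i ≈ f(x0 + s_i) - f(x0) in the
    least-squares sense. S holds one direction vector s_i per row."""
    df = np.array([f(x0 + s) - f(x0) for s in S])
    g, *_ = np.linalg.lstsq(S, df, rcond=None)
    return g

def centered_simplex_gradient(f, x0, S):
    """Generalized centered simplex gradient: symmetric differences
    cancel the second-order terms of f, so the result is exact for
    quadratics (up to the least-squares solve)."""
    df = np.array([(f(x0 + s) - f(x0 - s)) / 2.0 for s in S])
    g, *_ = np.linalg.lstsq(S, df, rcond=None)
    return g

def regression_gradient(f, X):
    """Regression gradient: slope of the least-squares affine model
    a + grad . x fitted to the sample points (one point per row of X)."""
    A = np.hstack([np.ones((len(X), 1)), X])  # design matrix [1, x_i]
    y = np.array([f(x) for x in X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef[1:]  # drop the intercept a
```

For example, with `f(x) = x[0]**2 + 3*x[1]` at `x0 = [1, 2]` and coordinate directions scaled by a small step, `simplex_gradient` returns a forward-difference-style estimate of the true gradient `[2, 3]`, while `centered_simplex_gradient` recovers it essentially exactly, since f is quadratic.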




Updated: 2020-10-16