Stochastic algorithmic differentiation of (expectations of) discontinuous functions (indicator functions)
International Journal of Computer Mathematics (IF 1.8), Pub Date: 2021-02-12, DOI: 10.1080/00207160.2021.1883593
Christian P. Fries

In this paper, we present a method for the accurate estimation of the derivative (a.k.a. sensitivity) of expectations of functions involving an indicator function. The method modifies a (stochastic) algorithmic differentiation by replacing the derivative of the indicator function with a suitable operator. We show that this operator can be split into a conditional expectation operator and a density. This allows different or improved numerical approximation methods, e.g. regression, to be used for each of these operators. The method improves on the approach presented in C.P. Fries [Automatic backward differentiation for American Monte-Carlo algorithms (conditional expectation), Risk, April 2018; Stochastic automatic differentiation: Automatic differentiation for Monte-Carlo simulations, Quant. Finance 19(6) (2019), pp. 1043–1059]. The finite-difference approximation of a partial derivative of a Monte-Carlo integral of a discontinuous function is known to exhibit a high Monte-Carlo error. The issue is evident, since the Monte-Carlo approximation of a discontinuous function is just a finite sum of discontinuous functions and, as such, not even differentiable; hence the algorithmic differentiation of a discontinuous function is problematic. A natural approach is to replace the discontinuity by a continuous function; this is equivalent to replacing a path-wise automatic differentiation by a (local) finite-difference approximation. The decoupling, introduced here, of the integration of the Dirac delta from the remaining conditional expectation yields an improvement in terms of variance reduction and implementation design.
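The three ideas in the abstract can be illustrated with a minimal Monte-Carlo sketch. The lognormal model, the parameter values, and the box-kernel regularization below are our own toy choices for illustration, not the paper's actual operator construction inside an AAD framework: (a) naive pathwise differentiation of an indicator gives zero almost surely, (b) smoothing the indicator's derivative with a mollified delta is a local finite-difference approximation, and (c) the smoothed expectation can instead be split into a conditional expectation (here estimated by a trivial regression) times a density.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (our illustration, not the paper's numerical example):
# X(theta) = exp(theta + sigma*Z), digital payoff 1_{X > K}, so that
# V(theta) = E[1_{X > K}] = Phi((theta - ln K)/sigma) has a closed-form derivative.
theta, sigma, K = 0.1, 0.5, 1.0
n = 400_000
Z = rng.standard_normal(n)
X = np.exp(theta + sigma * Z)
dX_dtheta = X  # pathwise derivative dX/dtheta of the lognormal model

phi = lambda x: np.exp(-x**2 / 2) / np.sqrt(2.0 * np.pi)
reference = phi((theta - np.log(K)) / sigma) / sigma  # analytic dV/dtheta

# (a) Naive pathwise differentiation: the indicator's derivative is zero
# almost everywhere, so the estimator is identically zero (badly biased).
naive = np.mean(0.0 * dX_dtheta)

# (b) Regularized estimator: replace the indicator's derivative by a box
# kernel delta_eps -- equivalent to a local finite-difference smoothing.
eps = 0.05
delta_eps = (np.abs(X - K) < eps) / (2.0 * eps)
smoothed = np.mean(delta_eps * dX_dtheta)

# (c) Decoupled estimator:
#     E[delta(X - K) * dX/dtheta] = E[dX/dtheta | X = K] * f_X(K).
# Approximate the conditional expectation by a (linear) regression of
# dX/dtheta on X, evaluated at K, and the density f_X(K) by the box kernel.
slope, intercept = np.polyfit(X, dX_dtheta, 1)
cond_exp = slope * K + intercept                       # ~ E[dX/dtheta | X = K]
density = np.mean(np.abs(X - K) < eps) / (2.0 * eps)   # ~ f_X(K)
decoupled = cond_exp * density

print(reference, naive, smoothed, decoupled)
```

In this toy model the regression is exact (dX/dtheta equals X), so (b) and (c) agree; the point of the decoupling is that, in general, the conditional expectation and the density can each be approximated by whichever method suits them best, e.g. regression for the former and an analytic or kernel density for the latter.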




Updated: 2021-02-12