Stochastic Algorithmic Differentiation of (Expectations of) Discontinuous Functions (Indicator Functions)
International Journal of Computer Mathematics (IF 1.8), Pub Date: 2021-02-08
Christian P. Fries

In this paper, we present a method for the accurate estimation of the derivative (a.k.a. sensitivity) of expectations of functions involving an indicator function. The method modifies a (stochastic) algorithmic differentiation by replacing the derivative of the indicator function with a suitable operator. We show that this operator can be split into a conditional expectation operator and a density. The splitting allows different or improved numerical approximation methods, e.g., regression, to be used for each part. The method improves on the approach presented in [11,13].

The finite-difference approximation of a partial derivative of a Monte-Carlo integral of a discontinuous function is known to exhibit a high Monte-Carlo error. The issue is evident, since the Monte-Carlo approximation of a discontinuous function is just a finite sum of discontinuous functions and, as such, not even differentiable. The algorithmic differentiation of a discontinuous function is likewise problematic. A natural approach is to replace the discontinuity with a continuous function, which is equivalent to replacing a path-wise automatic differentiation by a (local) finite-difference approximation. The decoupling, introduced here, of the integration of the Dirac delta from the remaining conditional expectation results in an improvement in terms of variance reduction. The method can be implemented by a local modification of the algorithmic differentiation.
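The variance issue and the splitting idea can be illustrated with a minimal sketch. This is not the paper's implementation: we assume a toy model X(θ) = θ + Z with Z standard normal, so that E[1{X > 0}] = Φ(θ) and the exact sensitivity is the normal density φ(θ). The derivative of the indicator produces a Dirac delta, E[δ(X) ∂X/∂θ], which the splitting rewrites as E[∂X/∂θ | X = 0] · f_X(0); below, the conditional expectation is approximated by a local average and the density by a simple histogram-style kernel estimate (bandwidth choice is illustrative).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
theta = 0.3
z = rng.standard_normal(n)
x = theta + z  # toy model X(theta) = theta + Z (assumption, not from the paper)

# Exact sensitivity of E[1{X > 0}] = Phi(theta) is phi(theta).
exact = np.exp(-theta**2 / 2) / np.sqrt(2 * np.pi)

# (a) Finite difference of the Monte-Carlo integral: only the few paths that
#     cross the discontinuity contribute, each with weight 1/h -> high variance.
h = 1e-3
fd = np.mean((((theta + h + z) > 0).astype(float) - (x > 0).astype(float)) / h)

# (b) Split estimator: E[dX/dtheta | X = 0] * f_X(0).
#     The path-wise derivative dX/dtheta = 1 here; the density at 0 is
#     estimated from the samples, the conditional expectation by a local
#     average of the path-wise derivatives near the discontinuity.
w = 0.1                               # bandwidth (illustrative choice)
near = np.abs(x) < w
density_at_0 = near.mean() / (2 * w)  # kernel (histogram) density estimate
dxdtheta = np.ones(n)                 # path-wise derivative of X w.r.t. theta
cond_exp = dxdtheta[near].mean()      # ~ E[dX/dtheta | X = 0]
split = cond_exp * density_at_0
```

In this sketch the conditional expectation is trivially 1; in a general algorithmic-differentiation pass it would carry the accumulated path-wise derivative, and, as the abstract notes, the local average could be replaced by a regression for a better approximation.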
