Rate-optimal refinement strategies for local approximation MCMC
Statistics and Computing ( IF 2.2 ) Pub Date : 2022-08-09 , DOI: 10.1007/s11222-022-10123-0
Andrew D. Davis , Youssef Marzouk , Aaron Smith , Natesh Pillai

Many Bayesian inference problems involve target distributions whose density functions are computationally expensive to evaluate. Replacing the target density with a local approximation based on a small number of carefully chosen density evaluations can significantly reduce the computational expense of Markov chain Monte Carlo (MCMC) sampling. Moreover, continual refinement of the local approximation can guarantee asymptotically exact sampling. We devise a new strategy for balancing the decay rate of the bias due to the approximation with that of the MCMC variance. We prove that the error of the resulting local approximation MCMC (LA-MCMC) algorithm decays at roughly the expected \(1/\sqrt{T}\) rate, and we demonstrate this rate numerically. We also introduce an algorithmic parameter that guarantees convergence given very weak tail bounds, significantly strengthening previous convergence results. Finally, we apply LA-MCMC to a computationally intensive Bayesian inverse problem arising in groundwater hydrology.
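The core loop described above can be illustrated with a toy sketch: a random-walk Metropolis sampler that queries a cheap surrogate instead of the expensive target, and refines the surrogate with a new true evaluation whenever the query point is far from all cached evaluations. This is a hypothetical 1-D illustration only, with a nearest-neighbor surrogate and a simple distance-based refinement trigger; the paper's LA-MCMC uses local polynomial approximations and a rate-optimal refinement schedule, and all names here (`LocalSurrogate`, `la_mcmc`, `radius`) are invented for the sketch.

```python
import math
import random

random.seed(0)

def true_log_density(x):
    """Stand-in for an expensive model: standard normal log-density."""
    return -0.5 * x * x

class LocalSurrogate:
    """Cache of true evaluations, queried by nearest neighbor (1-D toy)."""
    def __init__(self, radius=0.25):
        self.radius = radius   # refinement trigger: max distance to a cached point
        self.points = []       # (x, true_log_density(x)) pairs

    def refine(self, x):
        f = true_log_density(x)        # the only place the true density is called
        self.points.append((x, f))
        return f

    def evaluate(self, x):
        if not self.points:
            return self.refine(x)
        xi, fi = min(self.points, key=lambda p: abs(p[0] - x))
        if abs(xi - x) > self.radius:  # local error likely large: refine here
            return self.refine(x)
        return fi                      # otherwise use the cheap approximation

def la_mcmc(n_steps, step=1.0):
    """Random-walk Metropolis driven by the surrogate, not the true density."""
    surrogate = LocalSurrogate()
    x, chain = 0.0, []
    for _ in range(n_steps):
        y = x + random.gauss(0.0, step)
        log_ratio = surrogate.evaluate(y) - surrogate.evaluate(x)
        if log_ratio >= 0 or random.random() < math.exp(log_ratio):
            x = y
        chain.append(x)
    return chain, len(surrogate.points)

chain, n_true_evals = la_mcmc(5000)
print(n_true_evals, sum(chain) / len(chain))
```

Running this, the number of true density evaluations stays far below the number of MCMC steps, since cached points cover the visited region; the paper's contribution is to schedule such refinements so the approximation bias decays at the same rate as the MCMC variance.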




Updated: 2022-08-09