Particle-based energetic variational inference
Statistics and Computing (IF 2.2), Pub Date: 2021-04-17, DOI: 10.1007/s11222-021-10009-7
Yiwei Wang, Jiuhai Chen, Chun Liu, Lulu Kang

We introduce a new variational inference (VI) framework, called energetic variational inference (EVI), which minimizes the VI objective function by following a prescribed energy-dissipation law. Using the EVI framework, we can derive many existing particle-based variational inference (ParVI) methods, including the popular Stein variational gradient descent (SVGD). More importantly, many new ParVI schemes can be created under this framework. For illustration, we propose a new particle-based EVI scheme that first approximates the density with particles and then carries out the variational procedure on the approximated density ("Approximation-then-Variation" for short). Thanks to this order of approximation and variation, the new scheme can maintain the variational structure at the particle level and can significantly decrease the KL-divergence in each iteration. Numerical experiments show that the proposed method outperforms some existing ParVI methods in terms of fidelity to the target distribution.
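To make the mechanics concrete, the sketch below implements in NumPy the one ParVI method the abstract names explicitly, SVGD, viewed as a particle system whose continuous-time limit dissipates the KL-divergence to the target p. In that limit such flows satisfy an energy-dissipation law of the form d/dt KL(rho_t || p) = -∫ rho_t |u_t|^2 dx for the transporting velocity field u_t; the exact functional prescribed in the paper may differ. The Gaussian target, fixed RBF kernel bandwidth, step size, and particle count below are illustrative assumptions, not the paper's experimental settings.

    import numpy as np

    def score(x):
        # Score of a standard 2-D Gaussian target p, i.e. grad log p(x) = -x.
        return -x

    def svgd_step(X, step=0.05, h=1.0):
        # One SVGD update with the RBF kernel k(x, y) = exp(-||x - y||^2 / h).
        # Each particle follows the kernelized Stein direction: the kernel-weighted
        # score pulls particles toward high density, while the kernel gradient
        # pushes them apart so they do not collapse onto the mode.
        diff = X[:, None, :] - X[None, :, :]           # diff[i, j] = x_i - x_j
        K = np.exp(-np.sum(diff ** 2, axis=-1) / h)    # K[i, j] = k(x_i, x_j)
        grad_K = -2.0 / h * diff * K[:, :, None]       # gradient w.r.t. first argument
        phi = (K @ score(X) + grad_K.sum(axis=0)) / X.shape[0]
        return X + step * phi

    # Drive 50 poorly initialized particles toward the target; their empirical
    # distribution should approach N(0, I) as the KL-divergence decreases.
    rng = np.random.default_rng(0)
    X = rng.normal(loc=5.0, scale=0.5, size=(50, 2))
    for _ in range(1000):
        X = svgd_step(X)
    print("particle mean ~", X.mean(axis=0))
    print("particle cov ~", np.cov(X.T))

The paper's "Approximation-then-Variation" scheme differs in where the particle approximation enters: the density is approximated by particles before the variational step is taken, which changes the resulting update direction from the SVGD one above.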


