Conditionally structured variational Gaussian approximation with importance weights
Statistics and Computing (IF 1.6), Pub Date: 2020-04-28, DOI: 10.1007/s11222-020-09944-8
Linda S. L. Tan , Aishwarya Bhaskaran , David J. Nott

We develop flexible methods of deriving variational inference for models with complex latent variable structure. By splitting the variables in these models into “global” parameters and “local” latent variables, we define a class of variational approximations that exploit this partitioning and go beyond Gaussian variational approximation. This approximation is motivated by the fact that in many hierarchical models, there are global variance parameters which determine the scale of local latent variables in their posterior conditional on the global parameters. We also consider parsimonious parametrizations by using conditional independence structure and improved estimation of the log marginal likelihood and variational density using importance weights. These methods are shown to improve significantly on Gaussian variational approximation methods for a similar computational cost. Application of the methodology is illustrated using generalized linear mixed models and state space models.
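
To make the structural idea concrete, here is a minimal NumPy sketch of a conditionally structured Gaussian variational family and an importance-weighted estimate of the log marginal likelihood. It is not the authors' implementation: the affine dependence of the local mean and scale on the global parameters, the diagonal conditional covariance, and the toy `log_joint` model are all illustrative assumptions made only to show the mechanics.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: "global" parameters theta_G and "local" latent variables theta_L.
dim_g, dim_l = 2, 3

# --- Variational parameters (all illustrative) ------------------------------
mu_g = np.zeros(dim_g)                          # mean of q(theta_G)
L_g = np.eye(dim_g)                             # Cholesky factor of Cov(theta_G)
d = np.zeros(dim_l)                             # base mean of q(theta_L | theta_G)
D = 0.1 * rng.standard_normal((dim_l, dim_g))   # affine dependence of the mean on theta_G
log_s = np.zeros(dim_l)                         # base log-scales of theta_L
W = 0.1 * rng.standard_normal((dim_l, dim_g))   # scales also depend on theta_G


def sample_q():
    """Draw (theta_G, theta_L) from q(theta_G) q(theta_L | theta_G) and return log q."""
    eps_g = rng.standard_normal(dim_g)
    theta_g = mu_g + L_g @ eps_g
    log_q_g = (-0.5 * eps_g @ eps_g
               - np.log(np.abs(np.diag(L_g))).sum()
               - 0.5 * dim_g * np.log(2 * np.pi))

    # Conditional mean and (diagonal) scale of the local block depend on the
    # sampled global parameters -- the key structural assumption.
    mu_l = d + D @ theta_g
    scale_l = np.exp(log_s + W @ theta_g)
    eps_l = rng.standard_normal(dim_l)
    theta_l = mu_l + scale_l * eps_l
    log_q_l = (-0.5 * eps_l @ eps_l
               - np.log(scale_l).sum()
               - 0.5 * dim_l * np.log(2 * np.pi))
    return theta_g, theta_l, log_q_g + log_q_l


def log_joint(theta_g, theta_l):
    """Hypothetical log p(y, theta_L, theta_G) for a toy hierarchical model in
    which a global parameter sets the scale of the local latent variables."""
    y = np.array([0.5, -1.2, 0.3])                  # fake observations
    log_prior_g = -0.5 * theta_g @ theta_g
    tau = np.exp(theta_g[0])                        # global scale parameter
    log_prior_l = -0.5 * (theta_l / tau) @ (theta_l / tau) - dim_l * np.log(tau)
    log_lik = -0.5 * ((y - theta_l) ** 2).sum()
    return log_prior_g + log_prior_l + log_lik


def iw_lower_bound(K):
    """Importance-weighted estimate log (1/K) sum_k p(y, theta_k)/q(theta_k),
    which tightens toward log p(y) as K grows."""
    log_w = np.empty(K)
    for k in range(K):
        theta_g, theta_l, log_q = sample_q()
        log_w[k] = log_joint(theta_g, theta_l) - log_q
    m = log_w.max()
    return m + np.log(np.mean(np.exp(log_w - m)))   # log-sum-exp for stability

print("IW lower bound (K=1):  %.3f" % iw_lower_bound(1))
print("IW lower bound (K=50): %.3f" % iw_lower_bound(50))
```

Because the conditional mean and scale of the local block move with the sampled global parameters, this family can capture the kind of scale dependence described in the abstract that a single joint Gaussian misses, and increasing K tightens the importance-weighted bound on the log marginal likelihood.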
