Distributed, partially collapsed MCMC for Bayesian Nonparametrics
arXiv - CS - Machine Learning Pub Date : 2020-01-15 , DOI: arxiv-2001.05591
Avinava Dubey, Michael Minyi Zhang, Eric P. Xing, Sinead A. Williamson

Bayesian nonparametric (BNP) models provide elegant methods for discovering underlying latent features within a data set, but inference in such models can be slow. We exploit the fact that completely random measures, in terms of which commonly used models such as the Dirichlet process and the beta-Bernoulli process can be expressed, are decomposable into independent sub-measures. We use this decomposition to partition the latent measure into a finite measure containing only instantiated components, and an infinite measure containing all other components. We then select different inference algorithms for the two components: uncollapsed samplers mix well on the finite measure, while collapsed samplers mix well on the infinite, sparsely occupied tail. The resulting hybrid algorithm can be applied to a wide class of models, and can be easily distributed to allow scalable inference without sacrificing asymptotic convergence guarantees.
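The finite/infinite split described above can be illustrated with a minimal, hypothetical sketch: a partially collapsed Gibbs sampler for a 1-D Dirichlet process mixture of Gaussians with known observation variance. Instantiated components keep explicit means that are resampled from their conjugate posterior (the uncollapsed, finite part), while assignment to a brand-new component uses the collapsed marginal predictive with the mean integrated out (the infinite tail). This is a toy single-machine sketch, not the paper's distributed algorithm; the model, hyperparameters, and helper names are all illustrative assumptions.

```python
import math
import random

random.seed(0)

# Toy 1-D data from two well-separated groups (illustrative only).
data = [random.gauss(-5, 1) for _ in range(30)] + [random.gauss(5, 1) for _ in range(30)]

ALPHA = 1.0    # DP concentration parameter
SIGMA2 = 1.0   # known observation variance
TAU2 = 25.0    # prior variance of cluster means; base measure is N(0, TAU2)

def norm_pdf(x, mu, var):
    """Density of N(mu, var) at x."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def sample_mean(points):
    """Posterior draw of a cluster mean given its points (conjugate normal update)."""
    n = len(points)
    post_var = 1.0 / (n / SIGMA2 + 1.0 / TAU2)
    post_mu = post_var * sum(points) / SIGMA2
    return random.gauss(post_mu, math.sqrt(post_var))

# State: cluster assignment for each point, plus explicit means for the
# instantiated components (the "finite measure" part of the decomposition).
z = [0] * len(data)
means = {0: sample_mean(data)}

for sweep in range(50):
    for i, x in enumerate(data):
        # Remove point i from its current cluster; drop the cluster if empty.
        old = z[i]
        if not any(z[j] == old for j in range(len(data)) if j != i):
            del means[old]
        # Uncollapsed weights for instantiated clusters (explicit means).
        weights, labels = [], []
        for k, mu in means.items():
            n_k = sum(1 for j in range(len(data)) if z[j] == k and j != i)
            weights.append(n_k * norm_pdf(x, mu, SIGMA2))
            labels.append(k)
        # Collapsed weight for the infinite tail: marginal predictive under
        # the base measure, with the component mean integrated out.
        weights.append(ALPHA * norm_pdf(x, 0.0, SIGMA2 + TAU2))
        labels.append(max(means, default=-1) + 1)
        # Sample a new assignment proportionally to the weights.
        r = random.uniform(0, sum(weights))
        acc = 0.0
        for w, k in zip(weights, labels):
            acc += w
            if r <= acc:
                z[i] = k
                break
        if z[i] not in means:
            means[z[i]] = sample_mean([x])  # instantiate the new component
    # Uncollapsed step: refresh all instantiated means from their posteriors.
    for k in list(means):
        means[k] = sample_mean([data[j] for j in range(len(data)) if z[j] == k])

print(sorted(round(m, 1) for m in means.values()))
```

The division of labor matches the abstract's argument: explicit parameters make the finite part embarrassingly parallel across workers, while the collapsed predictive handles the sparsely occupied tail, where uncollapsed samplers would mix poorly.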

Updated: 2020-07-17