Extended variational inference for gamma mixture model in positive vectors modeling
Neurocomputing (IF 6) Pub Date: 2021-01-11, DOI: 10.1016/j.neucom.2020.12.042
Yuping Lai , Huirui Cao , Lijuan Luo , Yongmei Zhang , Fukun Bi , Xiaolin Gui , Yuan Ping

Bayesian estimation of the finite gamma mixture model (GaMM) has attracted considerable attention recently due to its capability of modeling positive data. Under conventional variational inference (VI) frameworks, an analytically tractable solution for the variational posterior cannot be derived, because the expectation of the joint distribution of all the random variables cannot be computed in closed form. Numerical techniques are therefore commonly used to simulate the posterior distribution, but their optimization process can be prohibitively slow for practical applications. To obtain closed-form solutions, lower-bound approximations are introduced into the evidence lower bound (ELBO), following the recently proposed extended variational inference (EVI) framework; this avoids the need for numerical simulation. In this paper, we address the Bayesian estimation of the finite GaMM under the EVI framework in a flexible way. Moreover, the optimal number of mixture components is determined automatically from the observed data, which overcomes the over-fitting problem associated with conventional expectation-maximization (EM). We demonstrate the excellent performance of the proposed method in evaluations on synthesized and real data. In the real-data evaluation, we compare the proposed method against reference methods on object detection and image categorization tasks and find statistically significant improvements in accuracy and runtime.
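The abstract does not reproduce the EVI derivation itself, but the modeling task it targets can be illustrated concretely. Below is a minimal sketch of fitting a finite gamma mixture to positive data with the conventional EM baseline the paper compares against, using moment-matched M-step updates for the gamma shape and scale (a common closed-form approximation; the paper's variational posterior updates are different). All component counts and parameter values here are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.stats import gamma as gamma_dist

rng = np.random.default_rng(0)

# Synthetic positive data from two gamma components (illustrative parameters)
x = np.concatenate([
    rng.gamma(shape=2.0, scale=1.0, size=300),
    rng.gamma(shape=9.0, scale=0.5, size=300),
])

K = 2  # number of mixture components (fixed here; the paper infers it)
shapes = np.array([1.0, 5.0])
scales = np.array([1.0, 1.0])
weights = np.full(K, 1.0 / K)

for _ in range(200):
    # E-step: responsibility of each component for each data point
    dens = np.stack([w * gamma_dist.pdf(x, a, scale=s)
                     for w, a, s in zip(weights, shapes, scales)])
    dens = np.clip(dens, 1e-300, None)   # guard against underflow
    resp = dens / dens.sum(axis=0)

    # M-step: mixing weights, then moment-matched gamma parameters
    nk = resp.sum(axis=1)
    weights = nk / nk.sum()
    means = (resp * x).sum(axis=1) / nk
    variances = (resp * (x - means[:, None]) ** 2).sum(axis=1) / nk
    shapes = means ** 2 / variances      # shape = mean^2 / variance
    scales = variances / means           # scale = variance / mean

print("weights:", np.round(weights, 2))
print("shapes:", np.round(shapes, 2))
print("scales:", np.round(scales, 2))
```

Note that this EM baseline requires the component number K to be fixed in advance and can over-fit when K is misspecified, which is exactly the limitation the paper's EVI-based approach is designed to remove.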



Updated: 2021-01-12