Properties of the stochastic approximation EM algorithm with mini-batch sampling
Statistics and Computing (IF 2.2). Pub Date: 2020-09-05. DOI: 10.1007/s11222-020-09968-0
Estelle Kuhn , Catherine Matias , Tabea Rebafka

To deal with very large datasets, a mini-batch version of the Markov chain Monte Carlo Stochastic Approximation Expectation–Maximization (MCMC-SAEM) algorithm for general latent variable models is proposed. For exponential family models, the algorithm is shown to be convergent under classical conditions as the number of iterations increases. Numerical experiments illustrate the performance of the mini-batch algorithm in various models. In particular, we highlight that mini-batch sampling yields a substantial speed-up in the convergence of the sequence of estimators generated by the algorithm. Moreover, insights on the effect of the mini-batch size on the limit distribution are presented. Finally, we illustrate how to use mini-batch sampling in practice to improve results under a constraint on computing time.
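The core idea described above can be sketched on a toy problem. The following is a minimal illustration, not the paper's exact algorithm: at each iteration the latent variables of a randomly drawn mini-batch are resimulated from their conditional distribution, the stochastic approximation of the complete-data sufficient statistics is updated, and the parameter is recomputed in closed form. The model (a two-component Gaussian mixture with unit variances and equal weights), the step-size schedule, and the batch fraction are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate data from a two-component Gaussian mixture (unit variances,
# equal weights); the goal is to recover the component means.
n = 5000
true_means = np.array([-2.0, 2.0])
z_true = rng.integers(0, 2, size=n)
y = true_means[z_true] + rng.normal(size=n)

def minibatch_saem(y, n_iter=300, batch_frac=0.1, seed=1):
    """Illustrative mini-batch SAEM for a two-component Gaussian mixture."""
    rng = np.random.default_rng(seed)
    n = len(y)
    mu = np.array([-1.0, 1.0])         # initial component means
    z = rng.integers(0, 2, size=n)     # initial latent labels, kept for ALL points
    # Stochastic approximation of the complete-data sufficient statistics:
    # per component, the label count and the sum of observations.
    s_count = np.bincount(z, minlength=2).astype(float)
    s_sum = np.array([y[z == k].sum() for k in range(2)])
    m = int(batch_frac * n)
    for it in range(n_iter):
        gamma = 1.0 / (it + 1) ** 0.6  # decreasing step size, exponent in (1/2, 1]
        idx = rng.choice(n, size=m, replace=False)
        # Simulation step on the mini-batch only: here the conditional
        # distribution of the labels is tractable, so we sample it exactly.
        logit = -0.5 * ((y[idx] - mu[1]) ** 2 - (y[idx] - mu[0]) ** 2)
        p1 = 1.0 / (1.0 + np.exp(-logit))
        z[idx] = (rng.random(m) < p1).astype(int)
        # Stochastic approximation step on the full sufficient statistics.
        new_count = np.bincount(z, minlength=2).astype(float)
        new_sum = np.array([y[z == k].sum() for k in range(2)])
        s_count += gamma * (new_count - s_count)
        s_sum += gamma * (new_sum - s_sum)
        # Maximization step: closed-form update of the means.
        mu = s_sum / np.maximum(s_count, 1e-12)
    return mu

mu_hat = minibatch_saem(y)
```

Only the mini-batch's latent variables are refreshed per iteration, so each step costs a fraction of a full simulation sweep, which is the source of the speed-up discussed in the abstract.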




Updated: 2020-09-07