Linearly preconditioned nonlinear conjugate gradient acceleration of the PX-EM algorithm
Computational Statistics & Data Analysis (IF 1.5) | Pub Date: 2021-03-01 | DOI: 10.1016/j.csda.2020.107056
Lin Zhou , Yayong Tang

Abstract The EM algorithm is widely applicable for modal estimation but is often criticized for its slow convergence. A new hybrid accelerator named APX-EM is proposed for speeding up the convergence of the EM algorithm; it combines the Linearly Preconditioned Nonlinear Conjugate Gradient (PNCG) method with the PX-EM algorithm. The intuitive idea is that each step of the PX-EM algorithm, like a step of the EM algorithm, can be viewed approximately as a generalized gradient, so the linearly preconditioned NCG method can be used to accelerate it. Essentially, this method is an adjustment of the AEM algorithm, and it usually achieves a faster convergence rate than AEM at the cost of a little simplicity. The convergence of the APX-EM algorithm, including a global convergence result under suitable conditions, is discussed. The method is illustrated on factor analysis and a random-effects model.
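The core idea above — treating the increment of an EM-type map as a generalized gradient and feeding it into a nonlinear conjugate gradient scheme — can be sketched in code. The following is a minimal illustration, not the paper's APX-EM implementation: it uses a plain EM map for a toy two-component Gaussian mixture (known weights and unit variances, unknown means) rather than PX-EM, applies Polak–Ribière NCG without the linear preconditioner, and all function names and data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy data: two-component Gaussian mixture with known weights and unit variances
x = np.concatenate([rng.normal(-2, 1, 200), rng.normal(3, 1, 300)])
w = np.array([0.4, 0.6])  # known mixing weights

def responsibilities(theta):
    # E-step: posterior component probabilities given current means theta
    dens = np.exp(-0.5 * (x[:, None] - theta[None, :]) ** 2) * w
    return dens / dens.sum(axis=1, keepdims=True)

def em_map(theta):
    # One full EM step M(theta): responsibility-weighted means (M-step)
    r = responsibilities(theta)
    return (r * x[:, None]).sum(axis=0) / r.sum(axis=0)

def loglik(theta):
    dens = np.exp(-0.5 * (x[:, None] - theta[None, :]) ** 2) * w
    return np.log(dens.sum(axis=1)).sum()

def accelerated_em(theta, tol=1e-8, max_iter=200):
    # Generalized gradient: the EM increment M(theta) - theta
    g_old = em_map(theta) - theta
    d = g_old.copy()
    for _ in range(max_iter):
        # Backtracking line search on the log-likelihood along direction d
        step, ll0 = 1.0, loglik(theta)
        while loglik(theta + step * d) < ll0 and step > 1e-4:
            step *= 0.5
        theta = theta + step * d
        g = em_map(theta) - theta
        if np.linalg.norm(g) < tol:
            break
        # Polak-Ribiere coefficient; reset to steepest ascent if negative
        beta = max(0.0, g @ (g - g_old) / (g_old @ g_old))
        d = g + beta * d
        g_old = g
    return theta

theta_hat = accelerated_em(np.array([0.0, 1.0]))
```

The only ingredients the acceleration needs are the EM map `M` and the observed-data log-likelihood for the line search; APX-EM replaces `M` with a PX-EM step and adds a linear preconditioner to the conjugate directions.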

Updated: 2021-03-01