An adaptively weighted stochastic gradient MCMC algorithm for Monte Carlo simulation and global optimization
Statistics and Computing (IF 2.2) Pub Date: 2022-07-09, DOI: 10.1007/s11222-022-10120-3
Wei Deng, Guang Lin, Faming Liang

We propose an adaptively weighted stochastic gradient Langevin dynamics (AWSGLD) algorithm for Bayesian learning on big-data problems. The proposed algorithm is scalable and possesses a self-adjusting mechanism: it adaptively flattens high-energy regions and protrudes low-energy regions of the energy landscape during simulation, so that both Monte Carlo simulation and global optimization can be greatly facilitated in a single run. This self-adjusting mechanism renders the algorithm essentially immune to local traps. Theoretically, by establishing the stability of the mean-field system and verifying the existence and regularity properties of the solution of the Poisson equation, we prove the convergence of the AWSGLD algorithm, covering both the convergence of the self-adapting parameters and the convergence of the weighted averaging estimators. Empirically, the AWSGLD algorithm is tested on multiple benchmark datasets, including CIFAR100 and SVHN, for both optimization and uncertainty-estimation tasks. The numerical results indicate its great potential for Monte Carlo simulation and global optimization in modern machine learning tasks.
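The adaptive weighting scheme of AWSGLD is specified in the paper itself; for readers unfamiliar with the baseline dynamics it builds on, the following is a minimal, hypothetical sketch of plain (unweighted) stochastic gradient Langevin dynamics on a toy double-well energy, where local traps are easy to visualize. The energy function, step size, and iteration count are illustrative choices, not values from the paper.

```python
import numpy as np

def grad_U(theta):
    """Gradient of the toy double-well energy U(theta) = (theta^2 - 1)^2."""
    return 4.0 * theta * (theta**2 - 1.0)

def sgld(theta0, step=1e-3, n_iters=5000, seed=0):
    """Plain SGLD update: theta <- theta - step * grad_U(theta) + sqrt(2*step) * N(0, 1).

    AWSGLD augments this update with adaptively learned importance weights
    that flatten high-energy regions; that weighting is omitted here.
    """
    rng = np.random.default_rng(seed)
    theta = theta0
    samples = np.empty(n_iters)
    for i in range(n_iters):
        noise = rng.standard_normal()
        theta = theta - step * grad_U(theta) + np.sqrt(2.0 * step) * noise
        samples[i] = theta
    return samples

# Start the chain in the left well at theta = -1.
samples = sgld(theta0=-1.0)
```

In this sketch the chain can remain stuck near one well for long stretches; the paper's adaptive weighting is precisely what mitigates such local trapping.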




Updated: 2022-07-10