Marvels and pitfalls of the Langevin algorithm in noisy high-dimensional inference
Physical Review X (IF 11.6) Pub Date:
Stefano Sarao Mannelli, Giulio Biroli, Chiara Cammarota, Florent Krzakala, Pierfrancesco Urbani, and Lenka Zdeborová

Gradient-descent-based algorithms and their stochastic versions have widespread applications in machine learning and statistical inference. In this work we perform an analytic study of the performance of the algorithm most commonly considered in physics, the Langevin algorithm, in the context of noisy high-dimensional inference. We employ the Langevin algorithm to sample the posterior probability measure for the spiked matrix-tensor model. The typical behaviour of this algorithm is described by a system of integro-differential equations that we call the Langevin state evolution, whose solution we compare with that of the state evolution of approximate message passing (AMP). Our results show that, remarkably, the algorithmic threshold of the Langevin algorithm is sub-optimal with respect to that of AMP. This phenomenon is due to the residual glassiness present in that region of parameters. We also present a simple heuristic expression for the transition line, which appears to be in agreement with the numerical results.
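
To make the setup concrete, the sketch below runs a discretized Langevin dynamics on the posterior of a small spiked matrix-tensor instance (p = 3) and tracks the overlap m(t) with the planted signal. It is a minimal illustration, not the paper's implementation: the scaling of the observations, the noise variances delta2 and delta3, the step size, and the enforcement of the spherical constraint by renormalization are assumptions made for readability (the analytic treatment in the paper works in the large-N limit and handles the constraint with a Lagrange multiplier).

import numpy as np

rng = np.random.default_rng(0)

# Dimension kept small so the N x N x N tensor fits comfortably in memory.
N = 100
delta2, delta3 = 0.5, 1.0          # assumed noise variances of the matrix and tensor channels

# Planted signal on the sphere ||x||^2 = N.
x_star = rng.standard_normal(N)
x_star *= np.sqrt(N) / np.linalg.norm(x_star)

# Noisy rank-one observations (one common scaling convention; the paper's prefactors may differ).
Y = np.outer(x_star, x_star) / np.sqrt(N) + np.sqrt(delta2) * rng.standard_normal((N, N))
Y = (Y + Y.T) / 2.0                # symmetrize the matrix channel
T = (np.sqrt(2.0) / N) * np.einsum("i,j,k->ijk", x_star, x_star, x_star) \
    + np.sqrt(delta3) * rng.standard_normal((N, N, N))   # tensor noise left unsymmetrized for brevity

def grad_H(x):
    """Gradient of a Hamiltonian whose Gibbs measure on the sphere is the posterior (up to constants)."""
    g2 = -(Y @ x) / (np.sqrt(N) * delta2)
    g3 = -(3.0 * np.sqrt(2.0) / (2.0 * N * delta3)) * np.einsum("ijk,j,k->i", T, x, x)
    return g2 + g3

# Discretized Langevin dynamics at temperature 1, started from a random (uninformed) configuration.
# The spherical constraint is enforced here by renormalization after each step.
dt, n_steps = 1e-3, 2000
x = rng.standard_normal(N)
x *= np.sqrt(N) / np.linalg.norm(x)
for step in range(n_steps + 1):
    if step % 500 == 0:
        print(f"step {step:5d}   overlap m = {abs(x @ x_star) / N:.3f}")
    x = x - dt * grad_H(x) + np.sqrt(2.0 * dt) * rng.standard_normal(N)
    x *= np.sqrt(N) / np.linalg.norm(x)

In this language, the thresholds compared in the abstract refer to the noise levels below which such a run, started from a random initialization, develops a macroscopic overlap with the planted signal; the paper's point is that the Langevin dynamics can fail to do so in a region of parameters where AMP succeeds.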

Updated: 2020-01-06