Variational neural annealing
Nature Machine Intelligence (IF 23.8), Pub Date: 2021-10-25, DOI: 10.1038/s42256-021-00401-3
Mohamed Hibat-Allah, Estelle M. Inack, Roeland Wiersema, Juan Carrasquilla, Roger G. Melko
Many important challenges in science and technology can be cast as optimization problems. When viewed in a statistical physics framework, these can be tackled by simulated annealing, where a gradual cooling procedure helps search for ground-state solutions of a target Hamiltonian. Although powerful, simulated annealing is known to have prohibitively slow sampling dynamics when the optimization landscape is rough or glassy. Here we show that, by generalizing the target distribution with a parameterized model, an analogous annealing framework based on the variational principle can be used to search for ground-state solutions. Modern autoregressive models such as recurrent neural networks provide ideal parameterizations because they can be sampled exactly without slow dynamics, even when the model encodes a rough landscape. We implement this procedure in the classical and quantum settings on several prototypical spin glass Hamiltonians and find that, on average, it substantially outperforms traditional simulated annealing in the asymptotic limit, illustrating the potential power of this as-yet-unexplored route to optimization.
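To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of variational classical annealing on a toy problem: a small ferromagnetic Ising chain rather than a spin glass, and a two-parameter-per-site autoregressive model standing in for the recurrent neural network. The model is sampled exactly site by site, and the variational free energy F(θ; T) = E_q[H(s)] − T·S(q) = E_q[H(s) + T log q(s)] is minimized with a REINFORCE-style gradient estimator while the temperature T is annealed toward zero. All names, schedules, and hyperparameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target: open ferromagnetic Ising chain on N spins,
# H(s) = -sum_i s_i s_{i+1}, s_i in {-1, +1}. Ground-state energy is -(N - 1).
N = 6

def energy(s):
    # s: (batch, N) array of +/-1 spins
    return -np.sum(s[:, :-1] * s[:, 1:], axis=1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Minimal autoregressive model: p(s_i = +1 | s_{i-1}) = sigmoid(a_i + b_i * s_{i-1}).
# The parameters a, b play the role of the RNN weights in the paper.
a = np.zeros(N)
b = np.zeros(N)

def sample(batch):
    """Draw exact samples, their log-probabilities, and log-prob gradients."""
    s = np.zeros((batch, N))
    logq = np.zeros(batch)
    grad_z = np.zeros((batch, N))   # d log q / d z_i, with z_i = a_i + b_i * s_{i-1}
    prevs = np.zeros((batch, N))    # s_{i-1} seen at each site (for the b gradient)
    prev = np.zeros(batch)          # no spin precedes site 0
    for i in range(N):
        z = np.clip(a[i] + b[i] * prev, -30.0, 30.0)
        p_up = sigmoid(z)
        up = rng.random(batch) < p_up
        s[:, i] = np.where(up, 1.0, -1.0)
        logq += np.where(up, np.log(p_up), np.log1p(-p_up))
        grad_z[:, i] = np.where(up, 1.0, 0.0) - p_up
        prevs[:, i] = prev
        prev = s[:, i]
    return s, logq, grad_z, prevs

# Variational classical annealing loop: lower T linearly while taking
# REINFORCE gradient steps on F = E_q[H + T log q]. A mean baseline
# reduces the variance of the estimator.
lr, batch, steps = 0.1, 64, 300
best = float("inf")
for step in range(steps):
    T = 2.0 * (1.0 - step / steps)          # simple linear schedule, T: 2 -> 0
    s, logq, grad_z, prevs = sample(batch)
    E = energy(s)
    best = min(best, E.min())
    reward = E + T * logq
    reward -= reward.mean()                 # baseline
    a -= lr * np.mean(reward[:, None] * grad_z, axis=0)
    b -= lr * np.mean(reward[:, None] * grad_z * prevs, axis=0)

print("best energy found:", best, "ground state:", -(N - 1))
```

Because the model is autoregressive, each sample is drawn exactly in a single sweep with its log-probability in hand; no Markov-chain mixing is needed, which is precisely the property the abstract highlights over simulated annealing on rough landscapes.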




Updated: 2021-10-25