On the descriptive power of Neural-Networks as constrained Tensor Networks with exponentially large bond dimension
SciPost Physics (IF 5.5), Pub Date: 2021-02-02, DOI: 10.21468/scipostphyscore.4.1.001
Mario Collura, Luca Dell'Anna, Timo Felser, Simone Montangero
In many cases, neural networks can be mapped into tensor networks with an exponentially large bond dimension. Here, we compare different sub-classes of neural-network states with their mapped tensor-network counterparts for studying the ground state of short-range Hamiltonians. We show that, when a neural network is mapped, the resulting tensor network is highly constrained; thus, neural-network states do not, in general, deliver the naively expected drastic improvement over state-of-the-art tensor-network methods. We demonstrate this result explicitly in two paradigmatic examples, the 1D ferromagnetic Ising model and the 2D antiferromagnetic Heisenberg model, addressing the lack of a detailed comparison of the expressiveness of these increasingly popular variational ansätze.
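The abstract's central point, that a neural-network state corresponds to a constrained tensor network whose bond dimension grows exponentially in the number of hidden units, can be checked numerically for the restricted-Boltzmann-machine (RBM) case. The sketch below (all sizes and parameters are illustrative choices, not from the paper) builds a small random RBM wavefunction over ±1 spins and verifies that the Schmidt rank across a half-chain cut is bounded by 2^M, with M the number of hidden units, far below the maximal rank 2^(N/2) of a generic state:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 6, 2                        # visible spins, hidden units (illustrative sizes)
a = rng.normal(size=N)             # visible biases
b = rng.normal(size=M)             # hidden biases
W = rng.normal(size=(M, N))        # visible-hidden couplings

def psi(v):
    """RBM amplitude psi(v) = exp(a.v) * prod_j 2 cosh(b_j + W_j.v), v_i in {-1,+1}."""
    return np.exp(a @ v) * np.prod(2 * np.cosh(b + W @ v))

# Enumerate all 2^N configurations, then reshape the amplitude vector into a
# matrix across the half-chain bipartition (first N/2 spins vs. the rest).
amps = np.array([psi(2 * np.array(bits) - 1) for bits in np.ndindex(*(2,) * N)])
mat = amps.reshape(2 ** (N // 2), 2 ** (N // 2))

# Each hidden-unit factor 2 cosh(.) splits into a sum of two product terms
# across any cut, so the Schmidt rank is at most 2^M -- exponential in M,
# but constrained well below the generic maximum 2^(N//2).
rank = np.linalg.matrix_rank(mat)
print(rank, "<=", 2 ** M, "<", 2 ** (N // 2))
```

This is exactly the sense in which the mapped tensor network is "exponentially large but constrained": doubling M doubles the exponent of the bond-dimension bound, yet the resulting tensors span only a restricted subfamily of states at that bond dimension.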
