Open Access

Replacing Neural Networks by Optimal Analytical Predictors for the Detection of Phase Transitions

Julian Arnold and Frank Schäfer
Phys. Rev. X 12, 031044 – Published 28 September 2022

Abstract

Identifying phase transitions and classifying phases of matter is central to understanding the properties and behavior of a broad range of material systems. In recent years, machine-learning (ML) techniques have been successfully applied to perform such tasks in a data-driven manner. However, the success of this approach notwithstanding, we still lack a clear understanding of ML methods for detecting phase transitions, particularly of those that utilize neural networks (NNs). In this work, we derive analytical expressions for the optimal output of three widely used NN-based methods for detecting phase transitions. These optimal predictions correspond to the results obtained in the limit of high model capacity. Therefore, in practice, they can, for example, be recovered using sufficiently large, well-trained NNs. The inner workings of the considered methods are revealed through the explicit dependence of the optimal output on the input data. By evaluating the analytical expressions, we can identify phase transitions directly from experimentally accessible data without training NNs, which makes this procedure favorable in terms of computation time. Our theoretical results are supported by extensive numerical simulations covering, e.g., topological, quantum, and many-body localization phase transitions. We expect similar analyses to provide a deeper understanding of other classification tasks in condensed matter physics.

  • Received 21 February 2022
  • Revised 2 July 2022
  • Accepted 3 August 2022

DOI: https://doi.org/10.1103/PhysRevX.12.031044

Published by the American Physical Society under the terms of the Creative Commons Attribution 4.0 International license. Further distribution of this work must maintain attribution to the author(s) and the published article’s title, journal citation, and DOI.


Physics Subject Headings (PhySH)

Condensed Matter, Materials & Applied Physics; Interdisciplinary Physics

Authors & Affiliations

Julian Arnold1,* and Frank Schäfer1,2,†

  • 1Department of Physics, University of Basel, Klingelbergstrasse 82, 4056 Basel, Switzerland
  • 2CSAIL, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139, USA

  • *julian.arnold@unibas.ch
  • †franksch@mit.edu

Popular Summary

Neural networks have been used successfully to identify phase transitions from data and to classify data into distinct phases in an automated fashion. The power and success of such approaches can be attributed to the ability of large neural networks to learn arbitrary functions. However, the larger a neural network, the more computational resources are needed to train it, and the more difficult it is to understand its decision making. Here, we establish a solid theoretical foundation for three popular machine-learning methods that rely on neural networks for detecting phase transitions.

We derive analytical expressions for the optimal output of these methods, which corresponds to the output when using sufficiently large neural networks after ideal training. The explicit dependence of the optimal output on the input data provides us with a thorough understanding of the inner workings of the machine-learning methods. Most notably, we find that the methods inherently rely on detecting changes in the probability distributions governing the input data, and they are not explicitly based on learning order parameters—that is, recognizing prevalent patterns or orderings—as is commonly believed.

Ultimately, our conceptual advances also result in a more efficient numerical routine to probe the limit of large neural networks. This routine is based on evaluating the analytical expressions to construct optimal outputs and not on the training of neural networks. We demonstrate the power of this approach by detecting topological, quantum, and many-body localization phase transitions from data.
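The idea of evaluating an optimal predictor directly from data can be illustrated with a minimal sketch. The toy data, parameter grid, and bias values below are hypothetical (not from the article); the sketch assumes the standard Bayes-optimal form for a binary classifier trained on two labeled parameter regions, y*(x) = P(x | region II) / (P(x | region I) + P(x | region II)), evaluated from empirical histograms rather than obtained by training a network:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (hypothetical): binary configurations whose underlying
# distribution shifts abruptly at a critical parameter p_c = 0.5.
params = np.linspace(0.0, 1.0, 21)

def draw(p, n=2000, length=8):
    bias = 0.25 if p < 0.5 else 0.75  # distribution change at the transition
    return rng.binomial(1, bias, size=(n, length))

samples = {p: draw(p) for p in params}

def dist(data):
    # Empirical probability of each distinct configuration.
    keys, counts = np.unique(data, axis=0, return_counts=True)
    return {tuple(k): c / len(data) for k, c in zip(keys, counts)}

# Pool the samples of the two labeled parameter regions.
left = dist(np.concatenate([samples[p] for p in params if p < 0.5]))
right = dist(np.concatenate([samples[p] for p in params if p >= 0.5]))

# Mean Bayes-optimal classifier output at parameter p, computed
# analytically from the empirical distributions:
#   y*(x) = P(x | right) / (P(x | left) + P(x | right))
def mean_output(p):
    return sum(q * right.get(x, 0.0) / (left.get(x, 0.0) + right.get(x, 0.0))
               for x, q in dist(samples[p]).items()
               if left.get(x, 0.0) + right.get(x, 0.0) > 0)

outputs = [mean_output(p) for p in params]
# The mean output rises sharply near p_c = 0.5, signaling the transition.
```

In this sketch the change in the underlying probability distribution, not any hand-crafted order parameter, is what the optimal output responds to, mirroring the mechanism described above.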

Our work paves the way to understanding and solving other classification tasks in condensed matter physics that previously relied on training neural networks.


Issue

Vol. 12, Iss. 3 — July - September 2022

Reuse & Permissions

It is not necessary to obtain permission to reuse this article or its components as it is available under the terms of the Creative Commons Attribution 4.0 International license. This license permits unrestricted use, distribution, and reproduction in any medium, provided attribution to the author(s) and the published article's title, journal citation, and DOI are maintained. Please note that some figures may have been included with permission from other third parties. It is your responsibility to obtain the proper permission from the rights holder directly for these figures.
