Space of Functions Computed by Deep-Layered Machines

Alexander Mozeika, Bo Li, and David Saad
Phys. Rev. Lett. 125, 168301 – Published 12 October 2020

Abstract

We study the space of functions computed by random-layered machines, including deep neural networks and Boolean circuits. Investigating the distribution of Boolean functions computed by recurrent and layer-dependent architectures, we find that it is the same in both models. Depending on the initial conditions and the computing elements used, we characterize the space of functions computed in the large-depth limit and show that the macroscopic entropy of Boolean functions either monotonically increases or decreases with growing depth.
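
The "entropy of Boolean functions" here is the Shannon entropy H = -Σ_f P(f) ln P(f) of the distribution P(f) over truth tables that the random networks induce at a given depth. The sketch below is a rough numerical illustration of this setup only, not the authors' analytic method: it samples random layer-dependent networks of sign neurons and tracks the empirical entropy of the computed functions as depth grows. All parameter choices (number of inputs n, layer width, probed depth, sample count) are illustrative assumptions.

```python
import itertools
import math
from collections import Counter

import numpy as np

rng = np.random.default_rng(0)

n = 3            # number of Boolean inputs (2**n input patterns)
width = 5        # neurons per hidden layer (assumed value)
max_depth = 8    # deepest layer probed (assumed value)
samples = 2000   # random network realizations (assumed value)

# All 2^n input patterns as +/-1 vectors, one row per pattern.
inputs = np.array(list(itertools.product([-1, 1], repeat=n)), dtype=float)

# truth_tables[d] counts, over realizations, the Boolean function computed
# at depth d, encoded as the tuple of outputs over all input patterns.
truth_tables = {d: Counter() for d in range(1, max_depth + 1)}

for _ in range(samples):
    x = inputs                       # layer-0 activity, shape (2**n, n)
    for d in range(1, max_depth + 1):
        fan_in = x.shape[1]
        # Fresh Gaussian weights at every layer: the layer-dependent case.
        w = rng.normal(size=(fan_in, width)) / math.sqrt(fan_in)
        x = np.sign(x @ w)           # sign (threshold) neurons
        # Read out the first neuron as the scalar Boolean function at depth d.
        truth_tables[d][tuple(x[:, 0].astype(int))] += 1

for d in range(1, max_depth + 1):
    counts = np.array(list(truth_tables[d].values()), dtype=float)
    p = counts / counts.sum()
    entropy = -np.sum(p * np.log(p))
    print(f"depth {d}: empirical function entropy = {entropy:.3f} nats")
```

Drawing a single weight matrix once and reusing it at every layer (which requires the fan-in to equal the width) would give the recurrent counterpart; per the abstract, the distribution of computed functions is the same in both cases.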

  • Received 21 April 2020
  • Revised 18 August 2020
  • Accepted 9 September 2020

DOI: https://doi.org/10.1103/PhysRevLett.125.168301

© 2020 American Physical Society

Physics Subject Headings (PhySH)

Interdisciplinary Physics, Networks, Statistical Physics & Thermodynamics

Authors & Affiliations

Alexander Mozeika1,*, Bo Li2,†, and David Saad2,‡

  • 1London Institute for Mathematical Sciences, London W1K 2XF, United Kingdom
  • 2Nonlinearity and Complexity Research Group, Aston University, Birmingham B4 7ET, United Kingdom

  • *Corresponding author. alexander.mozeika@kcl.ac.uk
  • †Corresponding author. b.li10@aston.ac.uk
  • ‡Corresponding author. d.saad@aston.ac.uk

Issue

Vol. 125, Iss. 16 — 16 October 2020

Reuse & Permissions
Access Options
Author publication services for translation and copyediting assistance advertisement

Authorization Required


×
×

Images

×

Sign up to receive regular email alerts from Physical Review Letters

Log In

Cancel
×

Search


Article Lookup

Paste a citation or DOI

Enter a citation
×