Generalized Guerra's interpolation schemes for dense associative neural networks
Neural Networks (IF 7.8). Pub Date: 2020-05-20. DOI: 10.1016/j.neunet.2020.05.009
Elena Agliari, Francesco Alemanno, Adriano Barra, Alberto Fachechi

In this work we develop analytical techniques to investigate a broad class of associative neural networks in the high-storage regime. These techniques translate the original statistical-mechanical problem into an analytical-mechanical one, which amounts to solving a set of partial differential equations rather than following the canonical probabilistic route. We test the method on the classical Hopfield model, whose cost function includes only two-body interactions (i.e., quadratic terms), and on the "relativistic" Hopfield model, whose (expanded) cost function includes p-body (i.e., degree-p) contributions. Under the replica-symmetric assumption, we draw the phase diagrams of these models by obtaining explicit expressions for their free energies as functions of the model parameters (i.e., noise level and memory storage). Further, since for non-pairwise models ergodicity breaking is not necessarily a critical phenomenon, we develop a fluctuation analysis and find that criticality is preserved in the relativistic model.
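To make the classical (two-body) Hopfield setting concrete, the following is a minimal sketch of Hebbian storage and zero-temperature retrieval; all parameter values (N, P, noise level) are illustrative choices and not taken from the paper, which analyzes the thermodynamic limit rather than a finite simulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions): N neurons, P stored binary patterns.
N, P = 200, 5
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian couplings J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, no self-coupling.
J = (patterns.T @ patterns) / N
np.fill_diagonal(J, 0.0)

def energy(s):
    """Two-body (quadratic) Hopfield cost function H(s) = -1/2 s^T J s."""
    return -0.5 * s @ J @ s

def recall(s, sweeps=10):
    """Zero-temperature asynchronous dynamics: align each spin with its local field."""
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s[i] = 1 if J[i] @ s >= 0 else -1
    return s

# Corrupt a stored pattern with 10% spin flips, then retrieve it.
noisy = patterns[0] * rng.choice([1, -1], size=N, p=[0.9, 0.1])
retrieved = recall(noisy)

# Overlap (Mattis magnetization) with the stored pattern; near 1 on success.
m = retrieved @ patterns[0] / N
```

Here the storage load alpha = P/N = 0.025 sits well below the classical critical capacity (approximately 0.138 in the replica-symmetric analysis), so retrieval succeeds; the paper's interest is precisely the high-storage regime where alpha stays finite as N grows.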



