Power law decay of stored pattern stability in sparse Hopfield neural networks
Communications in Theoretical Physics (IF 3.1), Pub Date: 2021-01-16, DOI: 10.1088/1572-9494/abcfb0
Fei Fang , Zhou Yang , Sheng-Jun Wang

Hopfield neural networks on scale-free networks display a power-law relation between the stability of stored patterns and the number of patterns. Stability is measured by the overlap between the output state and the stored pattern presented to the network. In simulations, the overlap decays to a constant following a power law. Here we explain this power-law behavior through a signal-to-noise ratio analysis. We show that on sparse networks storing many patterns, the stability of stored patterns can be approximated by a power-law function with exponent −0.5. The analytic and simulation results differ in that the analytic overlap decays to 0; this difference arises because the signal and noise terms of the nodes deviate from the mean-field approximation in sparse, finite-size networks.
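A minimal sketch of the quantity the abstract describes: a Hopfield network with Hebbian couplings on a sparse graph, where each stored pattern is presented as input and the overlap with the one-step output state is measured. This is an illustration only, not the paper's setup: an Erdős–Rényi graph stands in for the scale-free topology studied in the paper, and the sizes, degree, and normalization are chosen arbitrarily. The standard signal-to-noise intuition is that the local field splits into a signal term of order k (the mean degree) and a crosstalk term of standard deviation of order sqrt(kP), so the ratio scales as P^(-1/2).

```python
import numpy as np

rng = np.random.default_rng(0)

N = 500   # number of neurons (arbitrary choice for illustration)
k = 10    # mean degree; sparse means k << N
P = 20    # number of stored patterns

# Sparse symmetric adjacency matrix. An Erdos-Renyi graph is used here
# for brevity; the paper studies scale-free networks.
A = (rng.random((N, N)) < k / N).astype(float)
A = np.triu(A, 1)
A = A + A.T

# P random binary patterns with entries in {-1, +1}
xi = rng.choice([-1, 1], size=(P, N))

# Hebbian couplings restricted to existing links
J = A * (xi.T @ xi) / k

def overlap_after_one_step(mu):
    """Present pattern mu, do one synchronous update, return the overlap m."""
    s = np.sign(J @ xi[mu])
    s[s == 0] = 1  # break ties toward +1
    return np.mean(s * xi[mu])

# Average overlap over all stored patterns
m = np.mean([overlap_after_one_step(mu) for mu in range(P)])
print(f"mean overlap after one update: {m:.3f}")
```

Increasing P while holding k fixed shrinks the signal-to-noise ratio, which is the regime in which the abstract's power-law decay of the overlap is observed.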


