Robust Exponential Memory in Hopfield Networks.
The Journal of Mathematical Neuroscience Pub Date : 2018-01-16 , DOI: 10.1186/s13408-017-0056-2
Christopher J Hillar, Ngoc M Tran

The Hopfield recurrent neural network is a classical auto-associative model of memory, in which collections of symmetrically coupled McCulloch–Pitts binary neurons interact to perform emergent computation. Although previous researchers have explored the potential of this network to solve combinatorial optimization problems or store recurring activity patterns as attractors of its deterministic dynamics, a basic open problem is to design a family of Hopfield networks with a number of noise-tolerant memories that grows exponentially with neural population size. Here, we discover such networks by minimizing probability flow, a recently proposed objective for estimating parameters in discrete maximum entropy models. By descending the gradient of the convex probability flow, our networks adapt synaptic weights to achieve robust exponential storage, even when presented with vanishingly small numbers of training patterns. In addition to providing a new set of low-density error-correcting codes that achieve Shannon’s noisy channel bound, these networks also efficiently solve a variant of the hidden clique problem in computer science, opening new avenues for real-world applications of computational models originating from biology.
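The two ingredients of the abstract, the Hopfield energy with its single-bit-flip dynamics and the convex probability-flow objective, can be sketched as a toy experiment. This is an illustrative implementation only, not the authors' code: the network size, learning rate, iteration count, and all variable names are our own choices, and we use the standard single-bit-flip neighborhood for probability flow.

```python
import numpy as np

def energy(J, theta, x):
    # Hopfield energy E(x) = -1/2 x^T J x + theta^T x, for binary x in {0,1}^n,
    # symmetric coupling matrix J with zero diagonal, and thresholds theta.
    return -0.5 * x @ J @ x + theta @ x

def flip_deltas(J, theta, x):
    # Energy change E(x') - E(x) for flipping each single bit i of x.
    # With J_ii = 0 this reduces to d_i * (theta_i - (J x)_i),
    # where d_i = +1 for a 0 -> 1 flip and -1 for a 1 -> 0 flip.
    d = 1 - 2 * x
    return d * (theta - J @ x)

def probability_flow(J, theta, X):
    # Single-bit-flip probability flow: sum over training patterns x and
    # flip neighbors x' of exp((E(x) - E(x')) / 2). Convex in (J, theta);
    # small values mean every flip away from a pattern raises the energy.
    return sum(np.exp(-flip_deltas(J, theta, x) / 2).sum() for x in X)

def pf_gradients(J, theta, X):
    # Analytic gradients of the probability flow, with the J-gradient
    # projected back onto symmetric zero-diagonal matrices.
    n = J.shape[0]
    gJ, gt = np.zeros((n, n)), np.zeros(n)
    for x in X:
        d = 1 - 2 * x
        w = np.exp(-flip_deltas(J, theta, x) / 2)  # one weight per flip
        gJ += 0.5 * np.outer(w * d, x)
        gt += -0.5 * w * d
    gJ = 0.5 * (gJ + gJ.T)    # keep J symmetric
    np.fill_diagonal(gJ, 0.0) # no self-coupling
    return gJ, gt

def is_fixed_point(J, theta, x):
    # x is a robust stored memory if every single-bit flip raises the energy.
    return np.all(flip_deltas(J, theta, x) > 0)

# Toy setup: a few random binary patterns on a small network.
rng = np.random.default_rng(0)
n, m = 16, 4
X = rng.integers(0, 2, size=(m, n)).astype(float)
J, theta = np.zeros((n, n)), np.zeros(n)

# Descend the gradient of the convex probability flow.
for _ in range(1000):
    gJ, gt = pf_gradients(J, theta, X)
    J -= 0.05 * gJ
    theta -= 0.05 * gt

# Check whether every training pattern is now a strict energy minimum.
print(all(is_fixed_point(J, theta, x) for x in X))
```

Minimizing the flow pushes the energy change of every bit flip away from a training pattern to be positive, which is exactly the condition for the pattern to be an attractor of the deterministic dynamics; the exponential-storage construction in the paper itself concerns much more structured pattern families than the random ones used here.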

Updated: 2018-01-16