Resonator Networks, 2: Factorization Performance and Capacity Compared to Optimization-Based Methods
Neural Computation (IF 2.9), Pub Date: 2020-12-01, DOI: 10.1162/neco_a_01329
Spencer J. Kent, E. Paxon Frady, Friedrich T. Sommer, Bruno A. Olshausen

We develop theoretical foundations of resonator networks, a new type of recurrent neural network introduced in Frady, Kent, Olshausen, and Sommer (2020), a companion article in this issue, to solve a high-dimensional vector factorization problem arising in Vector Symbolic Architectures. Given a composite vector formed by the Hadamard product between a discrete set of high-dimensional vectors, a resonator network can efficiently decompose the composite into these factors. We compare the performance of resonator networks against optimization-based methods, including Alternating Least Squares and several gradient-based algorithms, showing that resonator networks are superior in several important ways. This advantage is achieved by leveraging a combination of nonlinear dynamics and searching in superposition, by which estimates of the correct solution are formed from a weighted superposition of all possible solutions. While the alternative methods also search in superposition, the dynamics of resonator networks allow them to strike a more effective balance between exploring the solution space and exploiting local information to drive the network toward probable solutions. Resonator networks are not guaranteed to converge, but within a particular regime they almost always do. In exchange for relaxing the guarantee of global convergence, resonator networks are dramatically more effective at finding factorizations than all alternative approaches considered.
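To make the factorization problem concrete, here is a minimal NumPy sketch, not taken from the article itself: it assumes bipolar (+1/-1) codevectors and the sign-thresholded, search-in-superposition update described in the companion article, and the dimensions, codebook sizes, and names (codebooks, composite, binarize) are illustrative. It binds one codevector from each of three codebooks with the Hadamard product and then recovers the factors with a resonator-style iteration.

import numpy as np

rng = np.random.default_rng(0)
N, D, F = 1500, 30, 3   # vector dimension, codevectors per factor, number of factors

def binarize(v):
    # Map to +1/-1, breaking ties toward +1 so zeros never propagate.
    return np.where(v >= 0, 1.0, -1.0)

# One random bipolar codebook per factor, shape (D, N).
codebooks = [rng.choice([-1.0, 1.0], size=(D, N)) for _ in range(F)]

# Form the composite: Hadamard product of one codevector drawn from each codebook.
true_idx = rng.integers(0, D, size=F)
composite = np.prod([codebooks[f][true_idx[f]] for f in range(F)], axis=0)

# Initialize each factor estimate as the superposition of its entire codebook.
estimates = [binarize(cb.sum(axis=0)) for cb in codebooks]

for step in range(200):
    prev = [e.copy() for e in estimates]
    for f in range(F):
        # Unbind the other factors' current estimates (Hadamard binding is its own
        # inverse for +/-1 vectors), project the result onto the span of this
        # factor's codebook, and re-binarize.
        others = np.prod([estimates[g] for g in range(F) if g != f], axis=0)
        estimates[f] = binarize(codebooks[f].T @ (codebooks[f] @ (composite * others)))
    if all(np.array_equal(e, p) for e, p in zip(estimates, prev)):
        break

# Read out each factor as the nearest codevector by inner product.
decoded = [int(np.argmax(cb @ e)) for cb, e in zip(codebooks, estimates)]
print("true:", true_idx.tolist(), "decoded:", decoded, "steps:", step + 1)

The point of the sketch is the superposition search described above: each estimate starts as a blend of all candidate codevectors, and the nonlinear (sign) update together with the codebook projection sharpens each estimate toward a single codevector as the other factors' estimates improve.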

Updated: 2020-12-01