Synthesis of recurrent neural dynamics for monotone inclusion with application to Bayesian inference.
Neural Networks ( IF 6.0 ) Pub Date : 2020-08-12 , DOI: 10.1016/j.neunet.2020.07.037
Peng Yi 1 , ShiNung Ching 2

We propose a top-down approach to construct recurrent neural circuit dynamics for the mathematical problem of monotone inclusion (MoI). MoI is a general optimization framework that encompasses a wide range of contemporary problems, including Bayesian inference and Markov decision-making. We show that in a recurrent neural circuit/network with Poisson neurons, each neuron’s firing curve can be understood as a proximal operator of a local objective function, while the overall circuit dynamics constitutes an operator-splitting system of ordinary differential equations whose equilibrium point corresponds to the solution of the MoI problem. Our analysis thus establishes that neural circuits are a substrate for solving a broad class of computational tasks. In this regard, we provide an explicit synthesis procedure for building neural circuits for specific MoI problems, and demonstrate it for Bayesian inference and sparse neural coding.
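To make the abstract's central idea concrete, the following is a minimal sketch (not the paper's actual construction) of a well-known circuit of this type: LCA-style sparse-coding dynamics, in which each neuron's soft-threshold "firing curve" is the proximal operator of an ℓ1 penalty and the equilibrium of the ODE solves a LASSO problem. The function names, parameters, and Euler discretization here are illustrative assumptions.

```python
import numpy as np

def soft_threshold(u, lam):
    # Proximal operator of lam*||.||_1 -- plays the role of a neuron's firing curve.
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

def lca_sparse_code(Phi, x, lam=0.05, tau=0.1, dt=0.01, steps=2000):
    """Euler-integrate circuit dynamics whose equilibrium solves the
    sparse-coding problem  min_a 0.5*||x - Phi a||^2 + lam*||a||_1.

    Dynamics (LCA form):  tau * du/dt = b - u - G a,   a = soft_threshold(u, lam)
    with feedforward drive b = Phi^T x and lateral inhibition G = Phi^T Phi - I.
    """
    n = Phi.shape[1]
    u = np.zeros(n)                      # internal (membrane-like) state
    G = Phi.T @ Phi - np.eye(n)          # recurrent inhibition weights
    b = Phi.T @ x                        # feedforward input
    for _ in range(steps):
        a = soft_threshold(u, lam)       # firing rates via the proximal operator
        u += (dt / tau) * (b - u - G @ a)
    return soft_threshold(u, lam)
```

At an equilibrium, u* = b - G a* = a* - Phi^T(Phi a* - x), so a* = soft_threshold(a* - Phi^T(Phi a* - x), lam): exactly the fixed-point condition of forward-backward (proximal-gradient) splitting, i.e. the circuit's rest state is the optimizer.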




Updated: 2020-08-18