Global-in-time solutions and qualitative properties for the NNLIF neuron model with synaptic delay
Communications in Partial Differential Equations (IF 2.1). Pub Date: 2019-07-15. DOI: 10.1080/03605302.2019.1639732
María J. Cáceres, Pierre Roux, Delphine Salort, Ricarda Schneider
Abstract: The Nonlinear Noisy Leaky Integrate-and-Fire (NNLIF) model is widely used to describe the dynamics of neural networks, arising as the diffusive approximation of the mean-field limit of a system of stochastic differential equations. When the total activity of the network has an instantaneous effect on the network, a blow-up phenomenon occurs in the average-excitatory case. This article is devoted to the theoretical study of the NNLIF model when a delay is added to the effect of the total activity on the neurons. We first prove global-in-time existence and uniqueness of “strong” solutions, independently of the sign of the connectivity parameter, that is, in both the excitatory and the inhibitory case. Second, we prove qualitative properties of solutions: asymptotic convergence to the stationary state for weak interconnections, and non-existence of periodic solutions when the connectivity parameter is large enough. The proofs rest mainly on an appropriate change of variables that rewrites the NNLIF equation as a Stefan-like free boundary problem, on the construction of universal super-solutions, on the entropy dissipation method, and on Poincaré’s inequality.
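For readers unfamiliar with the model, the NNLIF system with synaptic delay is usually stated as the following Fokker–Planck-type equation for the density p(v, t) of neurons at membrane potential v. This is a standard form from the NNLIF literature, written here with a constant diffusion coefficient a > 0 and a fixed delay d ≥ 0; the exact coefficients used in the paper may differ in detail:

```latex
\begin{aligned}
&\partial_t p(v,t) + \partial_v\!\big[\big(-v + b\,N(t-d)\big)\,p(v,t)\big]
  - a\,\partial_{vv} p(v,t) = N(t)\,\delta_{V_R}(v),
  && v \in (-\infty, V_F],\ t > 0,\\[2pt]
&N(t) := -a\,\partial_v p(V_F, t) \;\ge\; 0,\\[2pt]
&p(V_F, t) = 0, \qquad p(-\infty, t) = 0, \qquad p(v, 0) = p_0(v).
\end{aligned}
```

Here V_F is the firing threshold, V_R < V_F the reset potential, N(t) the network firing rate, and b the connectivity parameter: b > 0 corresponds to the excitatory case and b < 0 to the inhibitory case. With d = 0 (instantaneous interaction) the drift feels N(t) itself, which is the regime where blow-up can occur for b > 0; the delay d > 0 is what restores global-in-time existence in the article’s main result.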

Updated: 2019-07-15