Novel extended dissipativity criteria for generalized neural networks with interval discrete and distributed time-varying delays
Advances in Difference Equations (IF 3.1), Pub Date: 2021-01-09, DOI: 10.1186/s13662-020-03210-x
Sunisa Luemsai, Thongchai Botmart, Wajaree Weera

The problem of asymptotic stability and extended dissipativity analysis for generalized neural networks with interval discrete and distributed time-varying delays is investigated. Based on a suitable Lyapunov–Krasovskii functional (LKF), an improved Wirtinger single integral inequality, a novel triple integral inequality, and a convex combination technique, new asymptotic stability and extended dissipativity criteria are derived for the generalized neural networks with interval discrete and distributed time-varying delays. Using the same methods, less conservative asymptotic stability criteria are obtained for a special case of the generalized neural networks. The derived asymptotic stability and extended dissipativity criteria are expressed in terms of linear matrix inequalities (LMIs), which are solved with the Matlab LMI toolbox and cover \(H_{\infty }\), \(L_{2}\)–\(L_{\infty }\), passivity, and dissipativity performance by setting parameters in a general performance index. Finally, numerical examples are presented to show that the obtained criteria are less conservative than existing results in the literature, and to illustrate the asymptotic stability and extended dissipativity performance of the generalized neural networks, including a special case.
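For context, the "general performance index" mentioned above usually refers to the extended dissipativity index; the formulation below is the commonly used one, and the paper's exact notation for the weighting matrices may differ. With output \(z(t)\), disturbance \(\omega(t)\), and weighting matrices \(\Psi_{1}\le 0\), \(\Psi_{2}\), \(\Psi_{3}=\Psi_{3}^{T}\), \(\Psi_{4}\ge 0\), the system is called extended dissipative if, for every \(T\ge 0\),

\[
\int_{0}^{T}\bigl[z^{T}(s)\Psi_{1}z(s)+2z^{T}(s)\Psi_{2}\,\omega(s)+\omega^{T}(s)\Psi_{3}\,\omega(s)\bigr]\,\mathrm{d}s \;\ge\; \sup_{0\le t\le T} z^{T}(t)\Psi_{4}z(t).
\]

Typical parameter choices recover the individual performances: \(\Psi_{1}=-I\), \(\Psi_{2}=0\), \(\Psi_{3}=\gamma^{2}I\), \(\Psi_{4}=0\) gives \(H_{\infty}\) performance; \(\Psi_{1}=0\), \(\Psi_{2}=0\), \(\Psi_{3}=\gamma^{2}I\), \(\Psi_{4}=I\) gives \(L_{2}\)–\(L_{\infty}\) performance; \(\Psi_{1}=0\), \(\Psi_{2}=I\), \(\Psi_{3}=\gamma I\), \(\Psi_{4}=0\) gives passivity; and \(\Psi_{1}=Q\), \(\Psi_{2}=S\), \(\Psi_{3}=R-\alpha I\), \(\Psi_{4}=0\) gives \((Q,S,R)\)-dissipativity.

The criteria themselves are LMI feasibility tests. As a minimal sketch of how such a test is run numerically, the code below checks a much simpler, delay-independent stability LMI for a hypothetical linear delayed system \(\dot{x}(t)=Ax(t)+A_{d}x(t-\tau)\) with constant delay; it is not the paper's criterion, the matrices A and Ad are invented for illustration, and Python with CVXPY stands in for the Matlab LMI toolbox mentioned above.

```python
import numpy as np
import cvxpy as cp

# Hypothetical system data for x'(t) = A x(t) + A_d x(t - tau), constant delay.
A = np.array([[-2.0, 0.5],
              [0.3, -1.5]])
Ad = np.array([[0.2, -0.1],
               [0.1,  0.2]])
n = A.shape[0]

# LMI decision variables: P > 0 and Q > 0 from the Lyapunov-Krasovskii
# functional V(x_t) = x(t)' P x(t) + int_{t-tau}^{t} x(s)' Q x(s) ds.
P = cp.Variable((n, n), symmetric=True)
Q = cp.Variable((n, n), symmetric=True)

# Delay-independent condition: [[A'P + P A + Q, P Ad], [Ad' P, -Q]] < 0.
# The block matrix is symmetric in value, which is what CVXPY's PSD/NSD
# constraint (applied to the symmetric part) requires here.
lmi = cp.bmat([[A.T @ P + P @ A + Q, P @ Ad],
               [Ad.T @ P,            -Q    ]])

eps = 1e-6  # small margin to enforce strict definiteness numerically
constraints = [P >> eps * np.eye(n),
               Q >> eps * np.eye(n),
               lmi << -eps * np.eye(2 * n)]

# Pure feasibility problem: any feasible (P, Q) certifies asymptotic stability.
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()

print("LMI feasible:", prob.status == cp.OPTIMAL)
```

The stability and extended dissipativity LMIs in the paper have the same flavor but involve many more decision matrices, arising from the augmented LKF, the Wirtinger and triple integral inequalities, and the convex combination over the delay interval.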

Updated: 2021-01-10