Discontinuous convex contractions and their applications in neural networks
Computational and Applied Mathematics (IF 2.5) Pub Date: 2021-01-05, DOI: 10.1007/s40314-020-01390-6
Ravindra K. Bisht , Nihal Özgür

In this paper, we show that the class of convex contractions of order \(m\in \mathbb {N}\) is strong enough to generate a fixed point, yet does not force the mapping to be continuous at that fixed point. As a by-product, we provide a new setting in which to answer an open question posed by Rhoades (Contemp Math 72:233–245, 1988). In recent years, neural network systems with discontinuous activation functions have received intensive research interest, and several fixed point theorems (Brouwer's fixed point theorem, the Banach fixed point theorem, Kakutani's fixed point theorem, the Krasnoselskii fixed point theorem, etc.) have been used in the theoretical study of such networks. Possible applications of our theoretical results can therefore contribute to the study of neural networks, both in terms of fixed point theory and of discontinuity at the fixed point.
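The phenomenon described above can be illustrated with a toy example of our own construction (not taken from the paper): a map \(T\) on \([0,2]\) that is discontinuous at its unique fixed point \(x^*=1\), yet whose second iterate \(T^2\) is constant, so the order-2 convex contraction inequality \(|T^2x - T^2y| \le a\,|Tx - Ty| + b\,|x - y|\) with \(a + b < 1\) holds trivially. The sketch below checks the inequality numerically on sample pairs and runs Picard iteration to the fixed point:

```python
# Toy illustration (our own construction, in the spirit of the paper):
# T(x) = 1 for x <= 1, T(x) = 0 for x > 1 on [0, 2].
# T is discontinuous at its unique fixed point x* = 1, but T^2 == 1
# everywhere, so |T^2 x - T^2 y| = 0 and the order-2 convex
# contraction inequality holds for any a, b with a + b < 1.

def T(x: float) -> float:
    return 1.0 if x <= 1.0 else 0.0

def picard(x0: float, n: int = 10) -> float:
    """Picard iteration x_{k+1} = T(x_k), starting from x0."""
    x = x0
    for _ in range(n):
        x = T(x)
    return x

# Numerical spot-check of the inequality with a = b = 0.25.
a = b = 0.25
samples = [0.0, 0.3, 0.9, 1.0, 1.1, 1.7, 2.0]
ok = all(
    abs(T(T(x)) - T(T(y))) <= a * abs(T(x) - T(y)) + b * abs(x - y) + 1e-12
    for x in samples
    for y in samples
)
print(picard(1.7), ok)  # converges to the fixed point 1.0; inequality holds
```

Note that \(T\) itself is not a Banach contraction (points just on either side of 1 are mapped a distance 1 apart), which is exactly why the convex contraction condition, rather than the classical one, is the right lens here.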


