Abstract
In this paper, we show that the class of convex contractions of order \(m\in \mathbb {N}\) is strong enough to generate a fixed point but does not force the mapping to be continuous at the fixed point. As a by-product, we provide a new setting to answer an open question posed by Rhoades (Contemp Math 72:233–245, 1988). In recent years, neural network systems with discontinuous activation functions have received intensive research interest, and theoretical fixed point results (Brouwer’s fixed point theorem, the Banach fixed point theorem, Kakutani’s fixed point theorem, the Krasnoselskii fixed point theorem, etc.) have been used in the theoretical studies of neural networks. Therefore, possible applications of our theoretical results can contribute to the study of neural networks, both in terms of fixed point theory and of discontinuity at the fixed point.
1 Introduction
The well-known Banach–Picard–Caccioppoli contraction principle states that:
Theorem 1.1
Let T be a self-mapping of a complete metric space (X, d) such that \( d(Tx, Ty) \le ad(x, y), 0\le a<1,\) for each \(x, y \in X\). Then, T has a unique fixed point. Moreover, the Picard iteration \(\{x_n\}\) defined by \( x_{n+1} = Tx_n, (n = 0, 1, 2, \ldots )\) converges to the unique fixed point \( x_* \in X\) for any initial value \(x_0 \in X\).
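As an illustrative sketch (not from the paper), the Picard iteration of Theorem 1.1 can be simulated for a simple Banach contraction on the real line; the map \(T(x)=0.5x+1\), its contraction constant \(a=0.5\) and the starting points are our own choices for the example:

```python
# Illustrative sketch (not from the paper): Picard iteration for a Banach
# contraction on the reals. T(x) = 0.5*x + 1 satisfies |Tx - Ty| <= 0.5*|x - y|,
# so the iterates x_{n+1} = T(x_n) converge to the unique fixed point x* = 2.

def picard(T, x0, n_steps=60):
    """Run n_steps of the Picard iteration x_{n+1} = T(x_n)."""
    x = x0
    for _ in range(n_steps):
        x = T(x)
    return x

T = lambda x: 0.5 * x + 1.0        # contraction constant a = 0.5 < 1
print(picard(T, x0=100.0))         # converges to 2.0 from any starting point
```

Any constant \(0\le a<1\) yields the same qualitative behavior; the geometric factor \(a\) controls the convergence rate.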
The mapping T of the Banach–Picard–Caccioppoli contraction is continuous on the entire domain X.
In 1981, Istrǎtescu (1981b) extended the well-known Banach–Picard–Caccioppoli contraction principle by introducing a convexity condition, namely, the convex contraction mapping of order m, where \(m\ge 2\). Meanwhile, a more complete study (data dependence, well-posedness, Ulam–Hyers stability, the limit shadowing property and the Ostrowski property) of convex contraction mappings of order 2 was recently given in Mureşan and Mureşan (2015) (see also Rus 2016). In Georgescu (2017), Georgescu initiated the study of iterated function systems consisting of generalized convex contractions.
Definition 1.1
A continuous function \(T:X\rightarrow X\), where (X, d) is a complete metric space, is called a convex contraction of order m (also called a generalized convex contraction, Georgescu 2017) if there exist \(m\in \mathbb {N}\) and \( a_{0},a_{1},\ldots ,a_{m-1}\ge 0\) such that \(\sum _{j=0}^{m-1}a_{j}<1\) and
$$\begin{aligned} d(T^{m}x,T^{m}y)\le \sum _{j=0}^{m-1}a_{j}d(T^{j}x,T^{j}y), \end{aligned}$$
(1.1)
for each \(x,y\in X\), where by \(T^{j}\) we mean the composition of T with itself j times.
Theorem 1.2
Let T be a continuous self-mapping of a complete metric space (X, d) satisfying (1.1). Then, T has a unique fixed point.
It is easy to observe that a convex contraction of order \(m-1\) is a convex contraction of order m, but the converse need not be true.
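This implication can be checked numerically on a sample map. The map below is a hypothetical Banach (order-1) contraction of our own choosing; replacing \((x,y)\) by \((Tx,Ty)\) in its order-1 inequality shows it also satisfies an order-2 condition with shifted coefficients \(a_{0}=0\), \(a_{1}=0.4\):

```python
# Numerical sanity check with a hypothetical map (our choice, not from the
# paper): a convex contraction of lower order remains one of higher order,
# since replacing (x, y) by (Tx, Ty) shifts the coefficients by one index.
import random

T = lambda x: 0.4 * x + 1.0        # a Banach (order-1) contraction, a0 = 0.4
d = lambda x, y: abs(x - y)

def iterate(x, j):                 # the j-fold composition T^j
    for _ in range(j):
        x = T(x)
    return x

random.seed(0)
for _ in range(1000):
    x, y = random.uniform(-50, 50), random.uniform(-50, 50)
    # order-2 condition with shifted coefficients a0 = 0, a1 = 0.4
    assert d(iterate(x, 2), iterate(y, 2)) <= 0.4 * d(T(x), T(y)) + 1e-12
print("order-2 convex contraction condition verified on 1000 random pairs")
```

Example 1.1 shows that the reverse implication fails: an order-3 convex contraction need not be an order-2 one.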
Example 1.1
Let \(X=\mathbb {N}\cup \{0\}\) and d be the usual metric on X. Define \( T:X\rightarrow X\) by
Then, T satisfies the condition \(d(T^{3}x,T^{3}y)\le \sum _{j=0}^{2}a_{j}d(T^{j}x,T^{j}y)\) for all \(x,y\in X\), but \( d(T^{2}x,T^{2}y)>\sum _{j=0}^{1}a_{j}d(T^{j}x,T^{j}y)\) (take \(x=4, y=5\); see Sastry et al. 2012).
Recall that the set \(O(x;T) = \{T^{n}x : n = 0, 1, 2,\ldots \}\) is called the orbit of the self-mapping T at the point \(x \in X\).
Definition 1.2
A self-mapping T of a metric space (X, d) is called orbitally continuous at a point \(z\in X\) if, for any sequence \(\{x_{n}\}\subset O(x;T)\) (for some \(x\in X\)), \(x_{n}\rightarrow z\) implies \(Tx_{n}\rightarrow Tz\) as \( n\rightarrow \infty \).
Every continuous self-mapping of a metric space is orbitally continuous, but the converse need not be true. The following example illustrates this fact.
Example 1.2
Let \(X=[0,1]\) and d be the usual metric on X. Define \(T:X\rightarrow X\) by
Then, T is orbitally continuous. However, T is not continuous at \(x=1/3\).
Definition 1.3
A self-mapping T of a metric space (X, d) is called k-continuous (Pant and Pant 2017), \(k = 1, 2, 3,\ldots \), if \(T^{k}x_n \rightarrow Tz\) whenever \(\{x_n\}\) is a sequence in X such that \(T^{k-1}x_n \rightarrow z\).
Remark 1.1
It is important to note that for a self-mapping T of a metric space (X, d), the notion of 1-continuity coincides with continuity. However,
1-continuity \(\Rightarrow \) 2-continuity \(\Rightarrow \) 3-continuity \( \Rightarrow \cdots ,\)
but not conversely. In Example 1.2, one may observe that \(Tx_n \rightarrow t \Rightarrow T^{2}x_n \rightarrow Tt \), since \(Tx_n \rightarrow t\) implies \( t=0 \) or \(t=1/3\), and \(T^{2}x_{n} = 1/3 = T(1/3)\) for all n. Hence, T is 2-continuous. However, T is not 1-continuous at \(x=1/3\).
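The mechanism behind 2-continuity can be sketched in code. The paper's exact formula for T in Example 1.2 is not reproduced above, so the step map below is a hypothetical stand-in chosen only to match the stated properties (discontinuous at \(x=1/3\), with \(T^{2}\) constantly equal to 1/3):

```python
# Hypothetical step map illustrating 2-continuity (Definition 1.3); this map
# is an assumption chosen to match the properties stated for Example 1.2:
# T is discontinuous at x = 1/3, yet T^2 is constant, hence T is 2-continuous.

def T(x):
    return 1/3 if x <= 1/3 else 0.0        # range {0, 1/3}

xs = [1/3 + 1/n for n in range(3, 100)]    # a sequence x_n -> 1/3 from the right
print(sorted({T(x) for x in xs}))          # T(x_n) = 0, while T(1/3) = 1/3: T not continuous
print(sorted({T(T(x)) for x in xs}))       # T^2(x_n) = 1/3 = T(1/3): the 2-continuity condition holds
```

Whenever \(Tx_n\) converges at all, its limit lies in \(\{0, 1/3\}\), and \(T^{2}x_n\) is identically \(1/3\), which is the image under T of either possible limit.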
The study of contractive conditions which do not force the mapping to be continuous at the fixed point is presently an active area of research. The question of whether there exists a contractive definition which is strong enough to generate a fixed point but which does not force the mapping to be continuous at the fixed point was reiterated by Rhoades (1988) as an open problem. Some answers to this open question have been given in Bisht and Pant (2017), Bisht (2017), Bisht and Rakočević (2018), Pant (1999), Pant et al. (2019, 2020), and Pant and Pant (2017).
In this paper, we present new fixed point theorems using the notions of orbital continuity and k-continuity of self-mappings on a complete metric space. Our results provide new solutions to Rhoades’ open question and are significant in both theory and application. It is well known that fixed point results such as Brouwer’s fixed point theorem, the Banach fixed point theorem, Kakutani’s fixed point theorem and the Krasnoselskii fixed point theorem have been used in the theoretical studies of neural networks. Activation functions are used in a neural network to transform an input signal into an output signal. A suitable and more generalized activation function can greatly improve the performance of neural networks (Zhang et al. 2014). For example, the structure of the activation functions has an important influence on the multistability of neural networks, as different structures of the activation function affect the number of equilibria and the number of stable equilibria of the network (Huang et al. 2019b). Continuity or discontinuity of an activation function is also an important aspect of the stability issue of neural networks (for more details, see Huang et al. 2012, 2019a; Nie and Zheng 2014; Nie et al. 2019a, b; Zhang et al. 2014 and the references therein). In particular, network models with discontinuous activation functions have a wide range of applications in real-life problems, ranging from image compression to character recognition and stock market prediction (for some examples, see Huang et al. 2019b, 2012; Nie and Cao 2012; Nie and Zheng 2014). Forti and Nistri (2003) initiated the study of discontinuous-type neural networks, which have been used, for example, to solve non-negative sparse approximation problems and to model the mammalian olfactory system. Therefore, designing a new neural network with a more generalized activation function is very significant.
We think that new answers to Rhoades’ open question and theoretical fixed point results can contribute to the design of several kinds of neural networks under suitable conditions. In a recent work, possible applications of the notion of k-continuity of self-mappings to the theory of neural networks were stated as an open problem in Bisht and Özgür (2020).
2 Main results
Our first main result is the following:
Theorem 2.1
Let T be a self-mapping of a complete metric space (X, d) such that for each \(x,y\in X\),
$$\begin{aligned} d(T^{m}x,T^{m}y)\le \sum _{j=0}^{m-1}a_{j}d(T^{j}x,T^{j}y), \end{aligned}$$
(2.1)
where \(a_{0},a_{1},\ldots ,a_{m-1}\ge 0\) are such that \(\sum _{j=0}^{m-1}a_{j}<1\). Suppose that T is orbitally continuous or T is k-continuous. Then, T has a unique fixed point.
Proof
Let \(x_{0}\) be any point in X. If \(Tx_{0}=x_{0}\), then we are done; hence, assume \(Tx_{0}\ne x_{0}\). Define a sequence \(\{x_{n}\}\) in X by \(x_{n+1}=T^{n+1}x_{0}=Tx_{n}\) for all \(n\in \mathbb {N}\cup \{0\}\). Set
$$\begin{aligned} \alpha =\sum _{j=0}^{m-1}a_{j}\quad \text {and}\quad M=\max \{d(x_{j},x_{j+1}):0\le j\le m-1\}. \end{aligned}$$
(2.2)
Now we claim that, for all \(n\ge m\),
$$\begin{aligned} d(x_{n},x_{n+1})\le \alpha ^{[\frac{n}{m}]}M, \end{aligned}$$
(2.3)
where \([\frac{n}{m}]\) is the integral part of \(\frac{n}{m}\).
Obviously, (2.3) holds for \(n=m\). Assume that (2.3) holds for all k with \(m\le k\le n\), that is,
$$\begin{aligned} d(x_{k},x_{k+1})\le \alpha ^{[\frac{k}{m}]}M. \end{aligned}$$
Consider
$$\begin{aligned} d(x_{n+1},x_{n+2})=d(T^{m}x_{n+1-m},T^{m}x_{n+2-m})\le \sum _{j=0}^{m-1}a_{j}\,d(x_{n+1-m+j},x_{n+2-m+j})\le \alpha \,\alpha ^{[\frac{n+1-m}{m}]}M=\alpha ^{[\frac{n+1}{m}]}M. \end{aligned}$$
Thus, the induction argument shows that (2.3) holds for all \(n\ge m\).
We shall now show that \(\{x_{n}\}\) is a Cauchy sequence. For \(n>p\ge m\), we have
$$\begin{aligned} d(x_{p},x_{n})\le \sum _{i=p}^{n-1}d(x_{i},x_{i+1})\le M\sum _{i=p}^{n-1}\alpha ^{[\frac{i}{m}]}\le mM\sum _{i=[\frac{p}{m}]}^{\infty }\alpha ^{i}=\frac{mM\alpha ^{[\frac{p}{m}]}}{1-\alpha } \end{aligned}$$
(since the first m terms in the sum are equal, the next m terms are equal, and so on).
From these estimates, we conclude that \(\{x_{n}\}\) is a Cauchy sequence. Since X is complete, there exists a point \(z\in X\) such that \( x_{n}\rightarrow z\) as \(n\rightarrow \infty \). Orbital continuity of T implies that \(\lim _{n\rightarrow \infty }Tx_{n}=Tz\). This yields \(Tz=z\), that is, z is a fixed point of T. The uniqueness of the fixed point follows easily.
Now suppose that T is k-continuous. Since \(T^{k-1}x_{n}\rightarrow z\), k-continuity of T implies that \(\lim _{n\rightarrow \infty }T^{k}x_{n}=Tz\). This yields \(z=Tz\), that is, z is a fixed point of T. \(\square \)
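The decay estimate driving the proof can be observed numerically. The sketch below uses a sample order-2 convex contraction of our own choosing (\(T(x)=0.7x\), with assumed coefficients \(a_{0}=0\), \(a_{1}=0.7\), so \(\alpha =0.7\) and \(m=2\)) and checks an estimate of the form \(d(x_{n},x_{n+1})\le \alpha ^{[n/m]}M\) along the Picard orbit:

```python
# Sketch of the key estimate in the proof, using a sample order-2 convex
# contraction chosen for illustration: T(x) = 0.7x with assumed coefficients
# a0 = 0, a1 = 0.7, so alpha = 0.7 and m = 2. Successive Picard distances
# d(x_n, x_{n+1}) stay below alpha**(n // m) * M, which makes the orbit Cauchy.
T = lambda x: 0.7 * x
m, alpha = 2, 0.7

orbit = [10.0]
for _ in range(40):
    orbit.append(T(orbit[-1]))

M = max(abs(orbit[j] - orbit[j + 1]) for j in range(m))   # max of the first m gaps
for n in range(m, 40):
    assert abs(orbit[n] - orbit[n + 1]) <= alpha ** (n // m) * M + 1e-12
print("the claimed estimate holds along the whole orbit")
```

Since the bound is geometric in \([n/m]\), the tail sums of successive distances shrink to zero, which is exactly the Cauchy property used in the proof.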
We now give an example to show that a convex contraction mapping of order 3 is strong enough to generate a fixed point but may fail to be continuous at the fixed point.
Example 2.1
Let \(X=[-2,4]\) and d be the usual metric on X. Define \(T:X\rightarrow X\) by
Then, T satisfies all the conditions of Theorem 2.1 \((m\ge 3)\) and has a unique fixed point \(x=1\), at which T is discontinuous. It may be noted that T is orbitally continuous.
The following example shows that Theorem 2.1 is no longer valid if we replace the convex coefficients \(a_{0},a_{1},\ldots ,a_{m-1}\) by their sum \( \alpha =\sum _{j=0}^{m-1}a_{j}\).
Example 2.2
Let \(X=[0,\infty )\) and d be the usual metric on X. Define \( T:X\rightarrow X\) by
Then, T satisfies
but T is fixed point free.
Putting \(m=2\) in Theorem 2.1, we get the following corollary, which is an extended version of Istrǎtescu’s result for convex contraction mappings of order 2.
Corollary 2.1
Let T be a self-mapping of a complete metric space (X, d) such that for each \(x, y \in X\),
$$\begin{aligned} d(T^{2}x,T^{2}y)\le a_{0}d(x,y)+a_{1}d(Tx,Ty), \end{aligned}$$
where \(a_{0}, a_{1} \ge 0\) are such that \(a_{0}+ a_{1} <1\). Suppose that T is orbitally continuous or T is k-continuous. Then, T has a unique fixed point.
In the next theorem, we replace condition (2.1) of Theorem 2.1 by a more general condition, which does not imply continuity at the fixed point.
Definition 2.1
A function \(\varphi : [0,\infty ) \rightarrow [0,\infty )\) is called a comparison function if:

- (i) \(\varphi \) is increasing;
- (ii) \(\lim _{n\rightarrow \infty }\varphi ^{n}(t)=0\) for every \( t\in [0,\infty )\).
It is well known that a comparison function \(\varphi \) has the property that \( \varphi (0) = 0\) and \(\varphi (t) < t\) for every \(t \in (0,\infty )\).
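For a concrete illustration (our own example, not from the paper), \(\varphi (t)=t/(1+t)\) is a comparison function: it is increasing, and its n-th iterate is exactly \(\varphi ^{n}(t)=t/(1+nt)\), which tends to 0 for every \(t\ge 0\):

```python
# A concrete comparison function (our own example): phi(t) = t/(1 + t) is
# increasing on [0, inf), and its n-th iterate is exactly t/(1 + n*t), which
# tends to 0 for every t >= 0; moreover phi(0) = 0 and phi(t) < t for t > 0.
phi = lambda t: t / (1.0 + t)

def phi_iter(t, n):
    """Apply phi n times."""
    for _ in range(n):
        t = phi(t)
    return t

assert phi(0.0) == 0.0
assert all(phi(t) < t for t in (0.1, 1.0, 10.0))
print(phi_iter(10.0, 1000))    # close to 10 / (1 + 1000*10), i.e., nearly 0
```

Note that this \(\varphi \) is not of the linear form \(\varphi (t)=at\), so comparison functions strictly enlarge the class of admissible contraction moduli.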
Theorem 2.2
Let T be a self-mapping of a complete metric space (X, d) such that for each \(x,y\in X\),
$$\begin{aligned} d(T^{m}x,T^{m}y)\le \varphi \left( \max \{d(T^{j}x,T^{j}y):0\le j\le m-1\}\right) , \end{aligned}$$
(2.7)
where \(\varphi \) is a comparison function and \(m\in \mathbb {N}\). Suppose that T is k-continuous. Then, T has a unique fixed point.
Proof
Let \(x_{0}\) be any point in X. If \(Tx_{0}=x_{0}\), then we are done; hence, assume \(Tx_{0}\ne x_{0}\). Define a sequence \(\{x_{n}\}\) in X by \(x_{n+1}=T^{n+1}x_{0}=Tx_{n}\) for all \(n\in \mathbb {N}\cup \{0\}\). If \( x_{n}=x_{n+1}\) for some n, then \(x_{n}=x_{n+1}=x_{n+2}=\cdots \), i.e., \( \{x_{n}\} = \{T^{n}x_{0}\}\) is a Cauchy sequence and \(x_{n}\) is a fixed point of T. We can, therefore, assume that \(x_{n} \ne x_{n+1}\) for each n. Then, using the argument employed by Miculescu and Mihail (2017) (for the constant \(s=1\)), it follows that \(\{x_{n}\}\) is a Cauchy sequence. The rest of the proof follows as in the proof of Theorem 2.1. \(\square \)
The following example illustrates Theorem 2.2.
Example 2.3
Let \(X=[0,2]\) and d be the usual metric on X. Define \(T:X\rightarrow X\) by
Then, T satisfies all the conditions of Theorem 2.2 and has a unique fixed point \(x=1\). The mapping T satisfies the condition (2.7) for \(m\ge 2\) with
and \(\lim _{n\rightarrow \infty }\varphi ^{n}(t)=0\) for every \(t\in [0,\infty )\). It can also be easily seen that T is discontinuous at the fixed point \(x=1\).
Remark 2.1
Theorem 2.2 is an effective generalization of Theorem 2.1 if we put \(\varphi (t)=(\sum _{j=0}^{m-1}a_{j})t\).
In view of a result of Pant and Pant (2017), we now state a new fixed point theorem by requiring \(\lim _{n\rightarrow \infty }\varphi ^{n}(t)=0\) only on the open interval (0, d(T(X))) instead of on the whole of \([0,\infty )\), where d(T(X)) denotes the diameter of the range of T.
Theorem 2.3
Let T be a self-mapping of a complete metric space (X, d) satisfying (2.7) for each \(x,y\in X\), where \(\varphi \) is a comparison function with \(\lim _{n\rightarrow \infty }\varphi ^{n}(t)=0\) for each t in the open interval (0, d(T(X))), and \(m\in \mathbb {N}\). Suppose that T is k-continuous. Then, T has a unique fixed point.
Remark 2.2
Using the notion of k-continuity, a host of theorems proved in Alghamdi et al. (2011), Ghorbanian et al. (2012), Hussain and Salimi (2015), Latif et al. (2015), Latif et al. (2016), Miandaragh et al. (2013), Miculescu and Mihail (2017) and Sastry et al. (2012) can also be extended to a wider class of mappings which need not be continuous at the fixed point.
Remark 2.3
The above theorems generalize and subsume various results due to Istrǎtescu (1981b, 1982), Boyd and Wong (1969), Matkowski (1975), Miculescu and Mihail (2017), Rakotch (1962), Rus (2016, 2001) and Sastry et al. (2012). In addition, we provide more answers to the open problem regarding the existence of contractive mappings which have fixed points but are discontinuous at the fixed point (Rhoades 1988).
3 Examples of discontinuous activation functions
In the previous section, we proved new fixed point results that guarantee the uniqueness of the fixed point but do not require the self-mapping to be continuous at the fixed point. Such mappings frequently appear in the study of neural networks as activation functions. In Sharma et al. (2020), a brief description of various activation functions used in the study of artificial neural networks was given, and the need for activation functions was explained. Mainly, activation functions are important because of their role in learning and making sense of non-linear and complicated mappings between the inputs and corresponding outputs of a neural network (for more details and examples of activation functions, see Calin 2020; Sharma et al. 2020 and the references therein). One of the main types of activation functions is the unit step function (or binary step function, or Heaviside function). The Heaviside function H is defined (in the convention used here, with \(H(0)=1\)) by
$$\begin{aligned} H(x)=\left\{ \begin{array}{ll} 1, &{} x\ge 0, \\ 0, &{} x<0. \end{array}\right. \end{aligned}$$
Notice that \(Hx_{n}\rightarrow t\) implies \(H^{2}x_{n}\rightarrow Ht\), since \( Hx_{n}\rightarrow t\) implies \(t=0\) or \(t=1\), and \(H^{2}x_{n} = 1 = H1\) for all n. Then, H is 2-continuous. It is easy to verify that the unit step function H satisfies the conditions of Theorem 2.1 for \(m=2\) and any numbers \(a_{0},a_{1}\ge 0\) with \(a_{0}+a_{1}<1\). H has a unique fixed point \(x=1\) and is continuous at its fixed point (see Fig. 1, which was drawn using Mathematica, Wolfram Research 2019). This example shows the effectiveness of our results.
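A short numerical check of these observations about H, under the assumed convention \(H(x)=1\) for \(x\ge 0\) (consistent with the unique fixed point \(x=1\) stated above):

```python
# Numerical check of the observations about the unit step activation, under
# the assumed convention H(x) = 1 for x >= 0 (consistent with the unique
# fixed point x = 1 stated in the text): H(H(x)) is constantly 1, so H is
# 2-continuous, and every Picard orbit reaches the fixed point in two steps.
H = lambda x: 1.0 if x >= 0 else 0.0

assert H(1.0) == 1.0                                   # x = 1 is a fixed point
assert all(H(H(x)) == 1.0 for x in (-5.0, -0.1, 0.0, 0.3, 2.0))
print(H(H(-5.0)))    # 1.0: even from x < 0 the orbit lands on the fixed point
```

With a different convention (say \(H(0)=0\)), the point 0 would also be fixed, so the convention matters for the uniqueness claim.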
Another example of a discontinuous activation function is the function G defined by
(see Huang et al. 2013 for more details). G has a unique fixed point \(x=0\) and is discontinuous at this fixed point (see Fig. 2). In Huang et al. (2013), the existence of positive periodic solutions for a class of delayed neural networks was established by a new method involving a set-valued version of Krasnoselskii’s fixed point theorem in a cone, within the framework of Filippov differential inclusions.
In recent years, the dynamical behavior of neural networks with discontinuous activation functions has been extensively analyzed (see, for example, Du and Xu 2015; Du et al. 2013; Forti and Nistri 2003; Huang et al. 2019a, b, 2012; Kong et al. 2019; Liu et al. 2020, 2014; Nie and Cao 2012; Nie and Zheng 2014; Wang et al. 2010). In Huang et al. (2019a), it was stated that one of the main advantages of incorporating discontinuous activation functions is that such neural networks have a higher storage capacity than those with continuous ones. We now consider two more examples of discontinuous activation functions; one of them is continuous at its unique fixed point, while the other is discontinuous.
In Liu et al. (2014), finite-time stabilization of neural networks with discontinuous activations via discontinuous controllers was discussed. In numerical examples, the discontinuous activation function S defined by
was used. Clearly, the point \(x=0\) is the unique fixed point of S, and S is discontinuous at its fixed point (see Fig. 3).
In Huang et al. (2012), the multistability and multiperiodicity issues of 2n-dimensional delayed bidirectional associative memory (BAM) neural networks with r-level discontinuous activation functions were discussed. Sufficient conditions were established, by means of Brouwer’s fixed point theorem and stability analysis, to ensure the existence of \(r^{n}\) locally exponentially stable equilibria for 2n-dimensional BAM neural networks with discontinuous activation functions. In the numerical simulations, the following discontinuous activation function T was used:
Notice that this discontinuous activation function T has the unique fixed point \(x=-2\) and is continuous at this fixed point (see Fig. 4).
For more examples, one can see Bisht and Özgür (2020), Cai et al. (2018), Calin (2020), Du and Xu (2015), Du et al. (2013), Forti and Nistri (2003), Huang et al. (2019a, 2019b, 2012), Kong et al. (2019), Liu et al. (2020), Nie and Cao (2012), Nie and Zheng (2014), Nie et al. (2019a, 2019b), Pant et al. (2019, 2020), Wang et al. (2010), Zhang et al. (2014) and the references therein.
References
Alghamdi MA, Alnafei SH, Radenovic S, Shahzad N (2011) Fixed point theorems for convex contraction mappings on cone metric spaces. Math Comput Model 54:2020–2026
Bisht RK, Özgür N (2020) Geometric properties of discontinuous fixed point set of (\(\epsilon -\delta \)) contractions and applications to neural networks. Aequationes Math 94(5):847–863
Bisht RK, Pant RP (2017) A remark on discontinuity at fixed point. J Math Anal Appl 445:1239–1241
Bisht RK (2017) A remark on the result of Radu Miculescu and Alexandru Mihail. J Fixed Point Theory Appl 19(4):2437–2439
Bisht RK, Rakočević V (2018) Generalized Meir–Keeler type contractions and discontinuity at fixed point. Fixed Point Theory 19:57–64
Boyd DW, Wong JS (1969) On nonlinear contractions. Proc Am Math Soc 20:458–464
Cai X, Huang J, Huang L (2018) Periodic orbit analysis for the delayed Filippov system. Proc Am Math Soc 146(11):4667–4682
Calin O (2020) Activation functions. In: Deep learning architectures. Springer Series in the Data Sciences. Springer, Cham. https://doi.org/10.1007/978-3-030-36721-3_2
Du Y, Xu R (2015) Multistability and multiperiodicity for a class of Cohen–Grossberg BAM neural networks with discontinuous activation functions and time delays. Neural Process Lett 42(2):417–435
Du Y, Li Y, Xu R (2013) Multistability and multiperiodicity for a general class of delayed Cohen–Grossberg neural networks with discontinuous activation functions. Discrete Dyn Nat Soc. https://doi.org/10.1155/2013/917835
Forti M, Nistri P (2003) Global convergence of neural networks with discontinuous neuron activations. IEEE Trans Circuits Syst I Fundam Theory Appl 50(11):1421–1435
Georgescu F (2017) IFSs consisting of generalized convex contractions. An St Univ Ovidius Constanta 25(1):77–86
Ghorbanian V, Rezapour S, Shahzad N (2012) Some ordered fixed point results and the property (P). Comput Math Appl 63:1361–1368
Huang YJ, Chen SJ, Yang XH, Xiao J (2019a) Coexistence and local Mittag–Leffler stability of fractional-order recurrent neural networks with discontinuous activation functions. Chin Phys B 28(4):040701
Huang Y, Yuan X, Long H, Fan X, Cai T (2019b) Multistability of fractional-order recurrent neural networks with discontinuous and nonmonotonic activation functions. IEEE Access 7:116430–116437
Huang L, Cai Z, Zhang L, Duan L (2013) Dynamical behaviors for discontinuous and delayed neural networks in the framework of Filippov differential inclusions. Neural Netw 48:180–194
Huang Y, Zhang H, Wang Z (2012) Multistability and multiperiodicity of delayed bidirectional associative memory neural networks with discontinuous activation functions. Appl Math Comput 219(3):899–910
Hussain N, Salimi P (2015) Fixed points for generalized \(\psi \)-contraction with application to integral equations. J Nonlinear Convex Anal 16(4):711–729
Istrǎtescu VI (1981a) Fixed point theory: an introduction. Mathematics and its applications, vol 7. D. Reidel Publishing Company, Dordrecht, xv+466 pp
Istrǎtescu VI (1981b) Some fixed point theorems for convex contraction mappings and convex nonexpansive mappings I. Lib Math 1:151–163
Istrǎtescu VI (1982) Some fixed point theorems for convex contraction mappings and mappings with convex diminishing diameters—I. Ann di Mat 130(1):89–104
Kong F, Zhu Q, Liang F, Nieto JJ (2019) Robust fixed-time synchronization of discontinuous Cohen–Grossberg neural networks with mixed time delays. Nonlinear Anal Modell Control 24(4):603–625
Latif A, Sintunavarat W, Ninsri A (2015) Approximate fixed point theorems for partial generalized convex contraction in \(\alpha \)-complete metric spaces. Taiwan J Math 19(1):315–333
Latif A, Ninsri A, Sintunavarat W (2016) The \((\alpha,\beta )\)-generalized convex contractive condition with approximate fixed point results and some consequence. Fixed Point Theory Appl 2016:58. https://doi.org/10.1186/s13663-016-0546-z
Liu M, Wu H, Zhao W (2020) Event-triggered stochastic synchronization in finite time for delayed semi-Markovian jump neural networks with discontinuous activations. Comput Appl Math 39:1–47
Liu X, Park JH, Jiang N, Cao J (2014) Nonsmooth finite-time stabilization of neural networks with discontinuous activations. Neural Netw 52:25–32
Matkowski J (1975) Integrable solutions of functional equations. Diss Math 127:1–68
Miandaragh MA, Postolache M, Rezapour S (2013) Approximate fixed points of generalized convex contractions. Fixed Point Theory Appl 2013:255 8 pp
Miculescu R, Mihail A (2017) A generalization of Matkowski’s fixed point theorem and Istrǎtescu’s fixed point theorem concerning convex contractions. J Fixed Point Theory Appl 19(2):1525–1533
Mureşan V, Mureşan AS (2015) On the theory of fixed point theorems for convex contraction mappings. Carpath J Math 31(3):365–371
Nie X, Cao J (2012) Existence and global stability of equilibrium point for delayed competitive neural networks with discontinuous activation functions. Int J Syst Sci 43(3):459–474
Nie X, Zheng W X (2014) On multistability of competitive neural networks with discontinuous activation functions. In: Proceedings of the 4th Australian Control Conference (Aucc2014), 17th-18th November, 2014, Canberra, Australia, pp 245–250. https://doi.org/10.1109/AUCC.2014.7358690
Nie X, Cao J, Fei S (2019a) Multistability and instability of competitive neural networks with non-monotonic piecewise linear activation functions. Nonlinear Anal Real World Appl 45:799–821
Nie X, Liang J, Cao J (2019b) Multistability analysis of competitive neural networks with Gaussian-wavelet-type activation functions and unbounded time-varying delays. Appl Math Comput 356:449–468
Pant RP (1999) Discontinuity and fixed points. J Math Anal Appl 240:284–289
Pant RP, Özgür NY, Taş N (2020) On discontinuity problem at fixed point. Bull Malays Math Sci Soc 43(1):499–517
Pant RP, Özgür NY, Taş N (2019) Discontinuity at fixed points with applications. Bull Belg Math Soc Simon Stevin 25(4):571–589
Pant A, Pant RP (2017) Fixed points and continuity of contractive maps. Filomat 31(11):3501–3506
Rakotch E (1962) A note on contraction mappings. Proc Am Math Soc 13:459–465
Rhoades BE (1988) Contractive definitions and continuity. Contemp Math 72:233–245
Rus IA (2016) Some variants of contraction principle, generalizations and applications. Stud Univ Babes Bolyai Math 61(3):343–358
Rus IA (2001) Generalized contractions and applications. Cluj University Press, Cluj-Napoca
Sastry KPR, Rao CS, Sekhar AC, Balaiah M (2012) Fixed point theorem for cone convex contractions of order \(m\ge 2\). Int J Math Sci Eng Appl 6(1):263–271
Sharma S, Sharma S, Athaiya A (2020) Activation functions in neural networks. Int J Eng Appl Sci Technol 4(12):310–316
Wang Z, Huang L, Zuo Y, Zhang L (2010) Global robust stability of time-delay systems with discontinuous activation functions under polytopic parameter uncertainties. Bull Korean Math Soc 47(1):89–102
Wolfram Research, Inc. (2019) Mathematica, version 12.0. Champaign, IL
Zhang H, Wang Z, Liu D (2014) A comprehensive review of stability analysis of continuous-time recurrent neural networks. IEEE Trans Neural Netw Learn Syst 25(7):1229–1262
Communicated by José Tenreiro Machado.
Bisht, R.K., Özgür, N. Discontinuous convex contractions and their applications in neural networks. Comp. Appl. Math. 40, 11 (2021). https://doi.org/10.1007/s40314-020-01390-6