
Regular dynamics in delayed Cohen-Grossberg neural networks with discontinuous activation functions


Journal of Hunan Agricultural University (Natural Sciences), Vol. 34, No. 3, Jun. 2008. Article ID: 1007-1032(2008)03-0374-05

LI Xu-meng, HUANG Li-hong, WANG Xiao-hui
(1. College of Mathematics and Econometrics, HNU, Changsha 410080, China; 2. College of Science, HNAU, Changsha 410128, China)

Abstract: A class of Cohen-Grossberg neural networks where the neuron activations are modeled by discontinuous functions is considered. A tool, the chain rule for computing the time derivative of a nondifferentiable Lyapunov function along the neural network solutions, is used, which enables us to apply a Lyapunov-like approach to differential equations with discontinuous right-hand side. By means of this Lyapunov-like approach, a general result is proved on global exponential convergence of the neural network solutions, in the sense of Filippov, toward a unique equilibrium point.

Keywords: Cohen-Grossberg neural networks; global exponential stability; nonlinear measure; M-matrix

CLC number: O175.12    Document code: A

Received: 2007-06-05. Foundation item: Youth Science Foundation of HNAU (07QN17). Biography: LI Xu-meng (1979- ), male, from Daoxian, Hunan, lecturer at HNAU.

1 Introduction

The dynamics analysis of neural network models with delays has received a great deal of attention in the literature. Due to the finite switching speed of the neuron amplifiers and the finite speed of signal propagation, delays are practically unavoidable in electronic implementations. Moreover, delays are sometimes intentionally introduced to accomplish special tasks, such as motion detection via cellular neural networks. Therefore, it is important to understand under which conditions regular dynamics is preserved in the presence of a delay. In the literature, fundamental results have been established on dynamics such as global stability and global exponential stability of the equilibrium point for delayed Cohen-Grossberg neural networks and cellular neural networks with Lipschitz continuous neuron activations [2-5].

However, recent work has demonstrated the interest of studying regular dynamics, such as global stability of the equilibrium point, for neural networks with discontinuous neuron activations. This is an ideal model for the case where the gain of the neuron amplifiers is very high, a situation frequently encountered in applications. Consider, for example, the classic Hopfield network where, under the standard assumption of high-gain amplifiers, the sigmoidal neuron activations closely approach a discontinuous hard-comparator function [7-8]. Furthermore, the analysis of the ideal discontinuous case is able to reveal crucial features of the dynamics, such as the presence of sliding modes along discontinuity surfaces, the phenomenon of convergence in finite time toward the equilibrium point, and the ability to compute the exact global minimum of the underlying energy function, which make these networks especially attractive for solving global optimization problems in real time [6, 9-13].
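The high-gain limit mentioned above is easy to visualize numerically. The following minimal Python sketch (an illustration with arbitrarily chosen sample points and gain values, not taken from the paper) shows how a sigmoidal activation tanh(gain·s) approaches the discontinuous hard-comparator sign(s) as the gain grows.

```python
# Illustration of the high-gain limit: tanh(gain * s) -> sign(s) as the gain grows.
import numpy as np

s = np.array([-0.5, -0.05, -0.005, 0.005, 0.05, 0.5])   # arbitrary sample points
for gain in (1.0, 10.0, 100.0, 1000.0):                  # arbitrary gain values
    print(f"gain = {gain:6.0f}:", np.round(np.tanh(gain * s), 3))
print("sign(s)      :", np.sign(s))
```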
In this paper, we consider a class of Cohen-Grossberg neural network models with arbitrary constant delays in the neuron interconnections, and with neuron activations modeled by a class of discontinuous, monotone nondecreasing and (possibly) unbounded functions. The model differs from those considered in the quoted papers on global stability of delayed neural networks, where Lipschitz continuous activations are employed. It is also more general than the discontinuous neural network models in [10-12], where the delay was assumed to be absent. This paper establishes conditions on the neuron interconnections under which the solutions of the delayed and discontinuous neural network are bounded and possess an equilibrium point.

2 Notation and neural network model

Notation 1  Given a column vector x = (x_1, x_2, …, x_n)^T, where T denotes transposition, by x > 0 (respectively, x ≥ 0) we mean that x_i > 0 (respectively, x_i ≥ 0) for all i = 1, 2, …, n. If x ∈ R^n, we write ||x||_2 = (∑_{i=1}^n x_i^2)^{1/2}. For β ∈ R^n, β > 0, we define the weighted norm ||x||_{1,β} = ∑_{i=1}^n β_i |x_i| for any x ∈ R^n.

We consider a class of neural networks described by the system of differential equations

    ẋ(t) = W(x(t)) (−D x(t) + A g(x(t)) + A^τ g(x(t−τ)))    (1)

where x = (x_1, x_2, …, x_n)^T ∈ R^n is the vector of neuron states; W(x) = diag(w_1(x_1), …, w_n(x_n)) with w_i(x_i) > 0; D = diag(d_1, …, d_n) is a constant diagonal matrix, where d_i > 0 denotes the neuron self-inhibition, i = 1, 2, …, n; A = (a_ij) and A^τ = (a^τ_ij) are n×n constant matrices which represent the neuron interconnection matrix and the delayed neuron interconnection matrix, respectively; and τ > 0 is the constant delay in the neuron response. Moreover, g(x) = (g_1(x_1), …, g_n(x_n))^T: R^n → R^n is a diagonal mapping where g_i, i = 1, 2, …, n, represents the neuron input-output activation. We suppose that the activations belong to the following set of discontinuous functions.

Assumption 1  We suppose g_i ∈ G for every i = 1, 2, …, n, where G denotes the class of functions from R to R which are monotone nondecreasing and have at most a finite number of jump discontinuities in every compact interval, and which satisfy q s ≥ 0 for all q ∈ K[g_i(s)] and all s ∈ R, where K[g_i(s)] = co{lim g_i(s_k): s_k → s} is the closed convex hull of the limit values of g_i near s.

We note that if g satisfies Assumption 1, then any g_i, i = 1, 2, …, n, possesses only isolated jump discontinuities, at which g_i is not necessarily defined. Hence for all x ∈ R^n we have K[g(x)] = (K[g_1(x_1)], …, K[g_n(x_n)])^T.

With respect to previous work on the stability of delayed neural networks with Lipschitz continuous neuron activations, in (1) we break new ground by considering the presence of jump discontinuities in the neuron activations. Model (1) is also more general than that of the discontinuous neural networks considered in [10-11], due to the presence of a delay, and due to the fact that the neuron activations satisfying Assumption 1 are allowed to be unbounded. We recall that unbounded nonlinearities are encountered in a number of neural network models; one particularly interesting case corresponds to the networks for linear and quadratic programming problems introduced in [13].
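To make the objects of Section 2 concrete, the following Python sketch builds a hypothetical discontinuous activation with a single jump at the origin, evaluates its set-valued regularization K[g](s) (the closed convex hull of the one-sided limits), and computes the right-hand side of model (1) for one chosen output selection γ ∈ K[g(x)]. The activation, the matrices D, A, A^τ and the amplification W are illustrative assumptions, not the data used in the paper.

```python
import numpy as np

def g(s, left=-1.0, right=1.0):
    """Hypothetical monotone nondecreasing activation with one jump at s = 0."""
    return right if s > 0 else (left if s < 0 else 0.0)   # value at 0 is immaterial

def K_g(s, left=-1.0, right=1.0):
    """K[g](s): a singleton away from the jump, an interval at the jump."""
    if s > 0:
        return (right, right)
    if s < 0:
        return (left, left)
    return (left, right)            # co of the one-sided limits of g near 0

def rhs(x, gamma, gamma_delay, W, D, A, A_tau):
    """Right-hand side of (1): W(x) (-D x + A gamma + A^tau gamma(t - tau))."""
    return W(x) * (-D @ x + A @ gamma + A_tau @ gamma_delay)

# Tiny two-neuron instance with made-up coefficients (illustrative only).
D = np.diag([1.0, 1.0])
A = np.array([[-0.5, 0.1], [0.2, -0.4]])
A_tau = np.array([[0.05, 0.0], [0.0, 0.05]])
W = lambda x: 1.0 + 0.1 * np.abs(np.sin(x))                # amplifications w_i(x_i) > 0

x, x_delay = np.array([0.3, -0.2]), np.array([0.1, 0.0])
gamma = np.array([g(v) for v in x])                        # one selection from K[g(x)]
gamma_delay = np.array([g(v) for v in x_delay])
print("K[g](0) =", K_g(0.0))
print("xdot    =", rhs(x, gamma, gamma_delay, W, D, A, A_tau))
```

Note that gamma here is just one single-valued selection from K[g(x)]; Definition 1 below makes precise in what sense such selections define solutions of (1).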
3 Local existence of solutions

Since (1) is a system of differential equations with discontinuous right-hand side, we need to specify what is meant by a solution of (1). Moreover, we need to introduce the notion of an output associated to a solution of (1).

Definition 1  A function x: [−τ, T) → R^n, T ∈ (0, +∞], is a solution of (1) on [−τ, T) if: i) x is continuous on [−τ, T) and absolutely continuous on [0, T); ii) there exists a measurable function γ: [−τ, T) → R^n such that γ(t) ∈ K[g(x(t))] for almost all t ∈ [−τ, T) and

    ẋ(t) = W(x(t)) (−D x(t) + A γ(t) + A^τ γ(t−τ))  for a.a. t ∈ [0, T).    (2)

Any function γ as in ii) is called an output associated to the solution x. Clearly, γ actually represents the vector of neural network outputs. Observe that the above definition of a solution to (1) implies that ẋ(t) ∈ W(x(t)) (−D x(t) + A K[g(x(t))] + A^τ K[g(x(t−τ))]) for a.a. t ∈ [0, T), namely, x is a solution of (1) in the sense of Filippov [14]. In engineering applications this definition is particularly useful, since it can be proved that solutions in the sense of Filippov are good approximations of the solutions obtained when the neuron activations are Lipschitz functions with very high gain.

Definition 2  An equilibrium point (EP) of (1) is a vector ξ ∈ R^n that satisfies 0 ∈ −Dξ + (A + A^τ) K[g(ξ)]. Equivalently, ξ is an EP of (1) if there exists η ∈ K[g(ξ)] such that

    −Dξ + (A + A^τ) η = 0.    (3)

Any vector η satisfying (3) is called an output equilibrium point (OEP) corresponding to the EP ξ. Since we are interested in studying the time-domain behavior of both the state x and the output γ, it is convenient to give the next definition of an initial value problem (IVP) associated to (1).

Definition 3  For any continuous function φ: [−τ, 0] → R^n and any measurable selection ψ: [−τ, 0] → R^n such that ψ(s) ∈ K[g(φ(s))] for a.a. s ∈ [−τ, 0], by an IVP associated to (1) with initial condition (φ, ψ) we mean the following problem: find a couple of functions [x, γ]: [−τ, T) → R^n × R^n such that x is a solution of (1) on [−τ, T) for some T > 0, γ is an output associated to x, and

    ẋ(t) = W(x(t)) (−D x(t) + A γ(t) + A^τ γ(t−τ))  for a.a. t ∈ [0, T),
    x(s) = φ(s)  for s ∈ [−τ, 0],
    γ(s) = ψ(s)  for a.a. s ∈ [−τ, 0].    (4)

We stress that the solutions of the IVP (4) depend not only on the initial function φ but also on the selection of the output ψ(s) ∈ K[g(φ(s))]. We also observe that, since for any IVP the values of x and γ are assigned on the interval [−τ, 0], from now on we will consider the solution [x, γ] of any IVP as being defined on [0, T), T ∈ (0, +∞].

Next we show that a local solution of an IVP exists by exploiting results on differential inclusions without delays. Let us fix a continuous initial function φ: [−τ, 0] → R^n, select a measurable function ψ: [−τ, 0] → R^n such that ψ(s) ∈ K[g(φ(s))] for a.a. s ∈ [−τ, 0], and consider the differential inclusion

    ẋ(t) ∈ W(x(t)) (−D x(t) + A K[g(x(t))] + A^τ ψ(t−τ))  for a.a. t ∈ [0, τ),
    x(0) = φ(0).    (5)

By Theorem 1 in [14], the inclusion has at least one solution x defined in a right neighborhood J of zero. By the measurable selection theorem in [15], we can find a measurable function γ: J → R^n such that γ(t) ∈ K[g(x(t))] for a.a. t ∈ J and ẋ(t) = W(x(t)) (−D x(t) + A γ(t) + A^τ γ(t−τ)) for a.a. t ∈ J, where we extend γ to the interval [−τ, 0] by letting γ|[−τ,0] = ψ. Assume, as an inductive step, that the solution is defined on [0, Nτ] for some N ∈ {1, 2, …}; then one can take x(Nτ) as a new initial value and proceed further by solving ẋ(t) = W(x(t)) (−D x(t) + A γ(t) + A^τ γ(t−τ)) for a.a. t ∈ J, thus extending x (and γ, too) on a right neighborhood of Nτ. Hence we have proved the following.

Theorem 1  Suppose that Assumption 1 is satisfied. Then any IVP has at least a local solution [x, γ] defined on a maximal interval [0, T), for some T ∈ (0, +∞].
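The step-by-step construction behind Theorem 1 also suggests a simple way to approximate solutions numerically: keep the initial data (φ, ψ) in a buffer over [−τ, 0], pick one single-valued selection of K[g], and advance by an explicit Euler scheme, reusing the stored output values at time t − τ. The sketch below is only an illustration under assumed data (a constant initial function, a sign-like selection, made-up matrices), not an algorithm taken from the paper.

```python
import numpy as np

tau, h, T = 1.0, 0.01, 5.0                 # delay, step size, final time (assumed)
n_delay = int(round(tau / h))
D = np.diag([1.0, 1.0])
A = np.array([[-0.5, 0.1], [0.2, -0.4]])
A_tau = np.array([[0.05, 0.0], [0.0, 0.05]])
W = lambda x: 1.0 + 0.1 * np.abs(np.sin(x))

def g_sel(s):
    """One single-valued selection of K[g]: sign-like activation, value 0 at the jump."""
    return np.sign(s)

phi = lambda s: np.array([0.5, -0.8])      # constant initial function on [-tau, 0]

# History buffers for x and for the chosen output gamma on the delay interval.
xs = [phi(-tau + k * h) for k in range(n_delay + 1)]
gs = [g_sel(x) for x in xs]

for k in range(int(round(T / h))):
    x_now, g_now = xs[-1], gs[-1]
    g_del = gs[-1 - n_delay]               # gamma(t - tau) read from the buffer
    xdot = W(x_now) * (-D @ x_now + A @ g_now + A_tau @ g_del)
    x_next = x_now + h * xdot              # explicit Euler step
    xs.append(x_next)
    gs.append(g_sel(x_next))

print("x(T) approximately", np.round(xs[-1], 4))
```

For Lipschitz activations this is a standard Euler scheme for delay differential equations; with discontinuous activations it only tracks one selection of the output, which is consistent with the dependence of the IVP solution on ψ stressed after Definition 3.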
The next lemma will be useful to compute the time derivative, along the solutions of (1), of the Lyapunov function introduced in the next section.

Lemma 1  Let [x, γ] be a solution of an IVP, defined on [0, T), T ∈ (0, +∞]. Then the function V(t) = ∑_{i=1}^n β_i |x_i(t)|, where β = (β_1, …, β_n)^T > 0, is absolutely continuous and

    dV(t)/dt = v(t)^T ẋ(t) = ∑_{i=1}^n v_i(t) ẋ_i(t)  for a.a. t ∈ [0, T),

where v_i(t) = β_i sgn(x_i(t)) when x_i(t) ≠ 0, while v_i(t) can be chosen arbitrarily in [−β_i, β_i] when x_i(t) = 0 [16].

4 Main results

In this section we establish some basic results on the boundedness of solutions and on the existence of an equilibrium point of the state x and the output γ of (1), under the following hypothesis involving the matrices A and A^τ. Let w̲_i = inf{w_i(x_i)} and w̄_i = sup{w_i(x_i)}, and define the n×n matrix C = (c_ij).

Assumption 2  Suppose a_ii < 0 for i = 1, 2, …, n, and that C = (c_ij) is an M-matrix.

Since C is an M-matrix, there is a positive vector β ∈ R^n such that β^T C > 0. For every ρ ≥ 0, consider the n×n matrix C_ρ = (c_ij^ρ). Since C_0 = C, by a continuity argument we have β^T C_ρ > 0 for all ρ ∈ [0, ρ_0), for some ρ_0 > 0 small enough (in particular ρ_0 < min{d_1, …, d_n}).

The following result on the global dynamics of (1) holds.

Theorem 2  Suppose that Assumptions 1 and 2 are satisfied and w_i(x_i) > ε > 0. Then for any IVP there is a solution [x, γ] on [0, +∞), i.e., x is defined for t ∈ [0, +∞) and γ is defined for a.a. t ∈ [0, +∞).

Proof  Consider for (1) a Lyapunov functional V[x, γ](·): [0, T) → R consisting of an exponentially weighted (rate ρ) β-norm of the state plus delayed integral terms involving |γ_j(s)| over [t−τ, t], defined on couples [x, γ] in which x: [0, T) → R^n is locally Lipschitz continuous and γ: [0, T) → R^n is locally integrable (we do not exclude the case T = +∞). The function t ↦ V[x, γ](t) is absolutely continuous and its derivative can be evaluated by means of Lemma 1, with v_i(t) = β_i sgn(x_i(t)) if x_i(t) ≠ 0 and v_i(t) = 0 if x_i(t) = 0. We remark that with this choice, if [x, γ] is a solution of the given IVP, we have v_i(t) ẋ_i(t) = d|x_i(t)|/dt and v_i(t) x_i(t) = |x_i(t)| for a.a. t ∈ [0, T).

Assume that x is defined up to T > 0 and evaluate the derivative of V[x, γ](·) for a.a. 0 < t < T. Combining the estimate obtained from Lemma 1 with Assumption 2 and the property β^T C_ρ > 0, one obtains dV[x, γ](t)/dt ≤ 0 for a.a. 0 < t < T. An integration between 0 and t then leads to

    ||x(t)|| ≤ V[x, γ](t) e^{−ρt} ≤ V[x, γ](0) e^{−ρt} < V[x, γ](0)  for all t < T.

Hence x remains bounded for t > 0, and therefore it is defined on [0, +∞).

Theorem 3  Suppose that Assumptions 1 and 2 are satisfied and w_i(x_i) > ε > 0. Then system (1) has a unique EP and at least one corresponding OEP.

Proof  By Assumption 1, each activation function g_i in (1) satisfies q s ≥ 0 for all q ∈ K[g_i(s)] and all s ∈ R. It can easily be seen that ξ = 0, η = 0 is an EP and a corresponding OEP of (1), and the uniqueness of the EP ξ = 0 follows from the estimate at the end of the proof of Theorem 2.

Theorem 4  Suppose that Assumptions 1 and 2 are satisfied and w_i(x_i) > ε > 0. Then the EP ξ = 0 is globally exponentially stable, with convergence rate ρ.

Proof  By Theorem 3, ξ = 0 is an EP. From the end of the proof of Theorem 2 we have dV[x, γ](t)/dt ≤ 0 for a.a. t > 0, and an integration between 0 and t leads to ||x(t)|| ≤ V[x, γ](0) e^{−ρt} for all t > 0. This estimate implies that the EP ξ = 0 is globally exponentially stable with convergence rate ρ.

Example  Consider the second-order neural network (1) defined by D = diag(0.01, 0.01), W(x) = diag(1 + 0.01|sin x_1|, 1 + 0.02|cos x_2|), constant interconnection matrices A and A^τ, delay τ = 2, and discontinuous activations g_1 = g_2 ∈ G with a jump at the origin. It is seen that ξ = 0 is an EP, with corresponding OEP η = 0. It can also be readily verified that the assumptions of Theorem 4 hold, with β = (1.3, 1.0)^T and ρ = 0.01.
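Assumption 2 and the choice of the vector β used in the proofs can be checked numerically for a given matrix C. The Python sketch below uses the standard characterization of a nonsingular M-matrix (nonpositive off-diagonal entries, and C = sI − B with B ≥ 0 and s larger than the spectral radius of B) and then verifies β^T C > 0 for a candidate β. The matrix C and the vector β shown here are illustrative assumptions only; in the example above, β = (1.3, 1.0)^T and ρ = 0.01 play this role.

```python
import numpy as np

def is_nonsingular_M_matrix(C, tol=1e-12):
    """Check the Z-matrix sign pattern and the condition s > rho(s*I - C)."""
    C = np.asarray(C, dtype=float)
    off = C - np.diag(np.diag(C))
    if (off > tol).any():                      # off-diagonal entries must be <= 0
        return False
    s = float(np.max(np.diag(C))) + 1.0        # any s >= max_i c_ii works
    B = s * np.eye(C.shape[0]) - C             # B >= 0 by the sign pattern above
    rho = max(abs(np.linalg.eigvals(B)))       # spectral radius of B
    return s > rho + tol

C = np.array([[0.9, -0.2],
              [-0.3, 0.8]])                    # illustrative matrix, not the paper's C
beta = np.array([1.3, 1.0])
print("M-matrix :", is_nonsingular_M_matrix(C))
print("beta^T C :", beta @ C, "-> positive:", bool(np.all(beta @ C > 0)))
```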
References:
[1] Venetianer P L, Roska T. Image compression by delayed CNNs [J]. IEEE Trans Circuits Syst I, 1998, 45: 205-215.
[2] Zhang Z Z, Zhou T J. Existence of almost periodic solutions for a class of shunting inhibitory cellular neural networks [J]. Journal of Hunan Agricultural University: Natural Sciences, 2006, 32(3): 330-332. (in Chinese)
[3] Wang L, Zou X. Exponential stability of Cohen-Grossberg neural networks [J]. Neural Networks, 2002, 15: 415-422.
[4] Chen T P, Rong L B. Robust global exponential stability of Cohen-Grossberg neural networks with time delays [J]. Neural Networks, 2004, 15: 203-206.
[5] Belair J. Stability in a model of a delayed neural network [J]. J Dynam Differential Equations, 1993, 5: 607-623.
[6] Forti M, Nistri P. Global convergence of neural networks with discontinuous neuron activations [J]. IEEE Trans Circuits Syst I, 2003, 50: 1421-1435.
[7] Hopfield J. Neurons with graded response have collective computational properties like those of two-state neurons [J]. Proc Nat Acad Sci, 1984, 81: 3088-3092.
[8] Li J H, Michel A N, Porod W. Analysis and synthesis of a class of neural networks: Variable structure systems with infinite gain [J]. IEEE Trans Circuits Syst, 1989, 36: 713-731.
[9] Chong E K P, Hui S, Zak S H. An analysis of a class of neural networks for solving linear programming problems [J]. IEEE Trans Automat Control, 1999, 44: 1995-2006.
[10] Forti M, Nistri P, Quincampoix M. Generalized neural networks for nonsmooth nonlinear programming problems [J]. IEEE Trans Circuits Syst I, 2004, 51: 1741-1754.
[11] Forti M, Nistri P, Papini D. Global exponential stability and global convergence in finite time of delayed neural networks with infinite gain [J]. IEEE Trans Neural Networks, 2005, 16(6): 1449-1463.
[12] Lu W L, Chen T P. Dynamical behaviors of Cohen-Grossberg neural networks with discontinuous activation functions [J]. Neural Networks, 2005, 18(3): 231-242.
[13] Kennedy M P, Chua L O. Neural networks for nonlinear programming [J]. IEEE Trans Circuits Syst, 1988, 35: 554-562.
[14] Filippov A F. Differential Equations with Discontinuous Righthand Sides [M]. Boston: Kluwer Academic, 1988.
[15] Aubin J P, Cellina A. Differential Inclusions [M]. Berlin: Springer-Verlag, 1984.
[16] Forti M, Nistri P. Global convergence of neural networks with discontinuous neuron activations [J]. IEEE Trans Circuits Syst I, 2003, 50: 1421-1435.
