Similar documents
20 similar documents found (search time: 31 ms)
1.
This paper introduces a general class of neural networks with arbitrary constant delays in the neuron interconnections, and neuron activations belonging to the set of discontinuous monotone increasing and (possibly) unbounded functions. The discontinuities in the activations are an ideal model of the situation where the gain of the neuron amplifiers is very high and tends to infinity, while the delay accounts for the finite switching speed of the neuron amplifiers, or the finite signal propagation speed. It is known that the delay in combination with high-gain nonlinearities is a particularly harmful source of potential instability. The goal of this paper is to single out a subclass of the considered discontinuous neural networks for which stability is instead insensitive to the presence of a delay. More precisely, conditions are given under which there is a unique equilibrium point of the neural network, which is globally exponentially stable for the states, with a known convergence rate. The conditions are easily testable and independent of the delay. Moreover, global convergence in finite time of the state and output is investigated. In doing so, new interesting dynamical phenomena are highlighted with respect to the case without delay, which make the study of convergence in finite time significantly more difficult. The obtained results extend previous work on global stability of delayed neural networks with Lipschitz continuous neuron activations, and neural networks with discontinuous neuron activations but without delays.
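The finite-time convergence mentioned in this abstract is a hallmark of discontinuous activations. The following toy sketch (not the paper's delayed model, whose analysis the abstract notes is significantly harder) illustrates the delay-free mechanism on a scalar system: the discontinuous term pulls the state toward zero at a rate bounded below, so it reaches zero in finite time rather than only asymptotically.

```python
import numpy as np

def settle(a=1.0, b=1.0, x0=2.0, dt=1e-4, T=5.0, tol=1e-4):
    """Euler simulation of x' = -a*x - b*sign(x).  The discontinuous
    -b*sign(x) term keeps pulling x toward 0 at rate at least b, so the
    state reaches 0 in finite time, about (1/a)*log(1 + a*|x0|/b) here,
    unlike the purely asymptotic decay of a Lipschitz system."""
    x, t = x0, 0.0
    while t < T and abs(x) > tol:
        x += dt * (-a * x - b * np.sign(x))
        t += dt
    return x, t
```

With the default parameters the predicted settling time is log(3) ≈ 1.1, well before the horizon T; adding a delay to the sign term is exactly where the new dynamical phenomena studied in the paper arise.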

2.
In this paper, we study a general class of neural networks with discrete and distributed time-varying delays, whose neuron activations are discontinuous and may be unbounded or nonmonotonic. By using the Leray-Schauder alternative theorem in multivalued analysis, matrix theory and generalized Lyapunov-like approach, we obtain some sufficient conditions ensuring the existence, uniqueness and global asymptotic stability of the periodic solution. Moreover, when all the variable coefficients and time delays are real constants, we discuss the global convergence in finite time of the neural network dynamical system. Our results extend previous works not only on discrete and distributed time-varying delayed neural networks with continuous or even Lipschitz continuous activations, but also on discrete and distributed time-varying delayed neural networks with discontinuous activations. Two numerical examples are given to illustrate the effectiveness of our main results.

3.
This paper considers a class of delayed neural networks with discontinuous neuron activations. Based on the theory of differential equations with discontinuous right‐hand sides, some novel sufficient conditions are derived that ensure the existence and global exponential stability of the equilibrium point. Moreover, by adopting the concept of convergence in measure, convergence behavior for the output is discussed. The obtained results are independent of the delay parameter and can be thought of as a generalization of previous results established for delayed neural networks with Lipschitz continuous neuron activations to the discontinuous case. Finally, we give a numerical example to illustrate the effectiveness and novelty of our results by comparing our results with those in the early literature.

4.
Liping, Lihong. Neurocomputing, 2009, 72(16-18): 3726
This paper investigates a class of delayed neural networks whose neuron activations are modeled by discontinuous functions. By utilizing the multivalued version of the Leray–Schauder fixed point theorem, the properties of M-matrices and a generalized Lyapunov approach, we present some sufficient conditions to ensure the existence and global asymptotic stability of the state equilibrium point. Furthermore, the global convergence of the output solutions is also discussed. The activation functions are allowed to be unbounded and nonmonotonic, which is less restrictive than previous works on discontinuous or continuous neural networks. Hence, we improve and extend some existing results of other researchers. Finally, one numerical example is given to illustrate the effectiveness of the criteria proposed in this paper.
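Stability criteria of the M-matrix type used in abstracts like this one typically require verifying that some comparison matrix built from the network parameters is a nonsingular M-matrix. A minimal sketch of such a check (using one of several equivalent characterizations; the function name and the leading-principal-minors test are illustrative choices, not the paper's specific criterion):

```python
import numpy as np

def is_nonsingular_m_matrix(A, tol=1e-12):
    """Check whether A is a nonsingular M-matrix: all off-diagonal
    entries are nonpositive and all leading principal minors are
    positive (one of several equivalent characterizations)."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    off_diagonal = A - np.diag(np.diag(A))
    if np.any(off_diagonal > tol):
        return False          # positive off-diagonal entry: not an M-matrix
    return all(np.linalg.det(A[:k, :k]) > tol for k in range(1, n + 1))
```

In this line of work the matrix fed to such a test is typically a comparison matrix of the form "self-inhibition strengths minus interconnection magnitudes scaled by activation growth bounds"; the exact construction varies by paper.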

5.
In this paper, we consider a general class of neural networks, which have arbitrary constant delays in the neuron interconnections, and neuron activations belonging to the set of discontinuous monotone increasing and (possibly) unbounded functions. Based on the topological degree theory and Lyapunov functional method, we provide some new sufficient conditions for the global exponential stability and global convergence in finite time of these delayed neural networks. Under these conditions the uniqueness of the solution to the initial value problem (IVP) is proved. The exponential convergence rate can be quantitatively estimated on the basis of the parameters defining the neural network. These conditions are easily testable and independent of the delay. In the end, some remarks and examples are discussed to compare the present results with the existing ones.

6.
This paper studies the output convergence of a class of recurrent neural networks with time-varying inputs. The model of the studied neural networks has a different dynamic structure from that of the well-known Hopfield model: it does not contain linear terms. Since different structures of differential equations usually result in quite different dynamic behaviors, the convergence of this model is quite different from that of the Hopfield model. This class of neural networks has found many successful applications in solving optimization problems. Some sufficient conditions to guarantee output convergence of the networks are derived.

7.
This paper is concerned with a class of neutral-type neural networks with discontinuous activations and time-varying delays. Under the concept of the Filippov solution, by applying differential inclusions and the topological degree theory in set-valued analysis, we employ a novel argument to establish new results on the existence of periodic solutions for the considered neural networks. After that, we derive some criteria on the uniqueness and global exponential stability of the considered neural networks, and on the convergence of the corresponding autonomous case, in terms of nonsmooth analysis theory with a Lyapunov-like approach. The results obtained remain valid without assuming boundedness (or a growth condition) and monotonicity of the discontinuous neuron activation functions. Our results extend previous works on neutral-type neural networks to the discontinuous case, and some related results in the literature can be enriched and extended. Finally, two typical examples and the corresponding numerical simulations are provided to show the effectiveness and flexibility of the results derived in this paper.

8.
In this paper, the exponential stabilisation problem is studied for a general class of memristive time-varying delayed neural networks under periodically intermittent output feedback control. First, the periodically intermittent output feedback control rule is designed for the exponential stabilisation of the memristive time-varying delayed neural networks. Then, we derive stabilisation criteria so that the memristive time-varying delayed neural networks are exponentially stable. By the mathematical induction method and constructing suitable Lyapunov–Krasovskii functionals, some easy-to-check criteria are obtained to ensure the exponential stabilisation of memristive time-varying delayed neural networks. Finally, two numerical simulation examples are given to illustrate the validity of the obtained results.

9.
This paper presents new theoretical results on global exponential stability of recurrent neural networks with bounded activation functions and time-varying delays. The stability conditions depend on external inputs, connection weights, and time delays of recurrent neural networks. Using these results, the global exponential stability of recurrent neural networks can be derived, and the estimated location of the equilibrium point can be obtained. As typical representatives, the Hopfield neural network (HNN) and the cellular neural network (CNN) are examined in detail.

10.
The normal fuzzy CMAC neural network performs well for nonlinear system identification because of its fast learning speed and local generalization capability for approximating nonlinear functions. However, it requires huge memory, and its dimension increases exponentially with the number of inputs. It is also difficult to model dynamic systems with static fuzzy CMACs. In this paper, we use two types of recurrent techniques for the fuzzy CMAC to overcome the above problems. The new CMAC neural networks are named recurrent fuzzy CMAC (RFCMAC); they add feedback connections in the inner layers (local feedback) or the output layer (global feedback). The corresponding learning algorithms have time-varying learning rates, and the stability of the neural identification is proven.

11.
This paper discusses the existence, uniqueness, and global exponential stability of the equilibrium point for a class of generalized recurrent neural networks with time-varying delays. The model includes delayed Hopfield neural networks, delayed cellular neural networks, and delayed Cohen-Grossberg neural networks as special cases. Based on differential inequality techniques, using the Brouwer fixed point theorem and constructing suitable Lyapunov functions, new sufficient conditions are obtained that guarantee the existence, uniqueness, and global exponential stability of the equilibrium point of the recurrent neural network. The new sufficient conditions do not require differentiability, boundedness, or monotonicity of the activation functions, and relax the restrictions required previously. Two simulation examples demonstrate the effectiveness of the obtained results.

12.
In a recent work, a new method has been introduced to analyze complete stability of the standard symmetric cellular neural networks (CNNs), which are characterized by local interconnections and neuron activations modeled by a three-segment piecewise-linear (PWL) function. By complete stability it is meant that each trajectory of the neural network converges toward an equilibrium point. The goal of this paper is to extend that method in order to address complete stability of the much wider class of symmetric neural networks with an additive interconnecting structure where the neuron activations are general PWL functions with an arbitrary number of straight segments. The main result obtained is that complete stability holds for any choice of the parameters within the class of symmetric additive neural networks with PWL neuron activations, i.e., such a class of neural networks enjoys the important property of absolute stability of global pattern formation. It is worth pointing out that complete stability is proved for generic situations where the neural network has finitely many (isolated) equilibrium points, as well as for degenerate situations where there are infinitely many (nonisolated) equilibrium points. The extension in this paper is of practical importance since it includes neural networks useful to solve significant signal processing tasks (e.g., neural networks with multilevel neuron activations). It is of theoretical interest too, due to the possibility of approximating any continuous function (e.g., a sigmoidal function) using PWL functions. The results in this paper confirm the advantages of the method of Forti and Tesi, with respect to the LaSalle approach, to address complete stability of PWL neural networks.

13.
Deep neural networks such as GoogLeNet, ResNet, and BERT have achieved impressive performance in tasks such as image and text classification. To understand how such performance is achieved, we probe a trained deep neural network by studying neuron activations, i.e., combinations of neuron firings, at various layers of the network in response to a particular input. With a large number of inputs, we aim to obtain a global view of what neurons detect by studying their activations. In particular, we develop visualizations that show the shape of the activation space, the organizational principle behind neuron activations, and the relationships of these activations within a layer. Applying tools from topological data analysis, we present TopoAct, a visual exploration system to study topological summaries of activation vectors. We present exploration scenarios using TopoAct that provide valuable insights into learned representations of neural networks. We expect TopoAct to give a topological perspective that enriches the current toolbox of neural network analysis, and to provide a basis for network architecture diagnosis and data anomaly detection.
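The raw objects that a system like the one in this abstract summarizes are per-input activation vectors of a layer. A minimal sketch of how such vectors are collected, using a toy untrained network (random weights stand in for a trained model; TopoAct itself goes on to build topological summaries such as mapper graphs over these vectors, which is not shown here):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy one-hidden-layer network; random weights stand in for a trained model.
W1 = rng.normal(size=(8, 4))    # hidden layer: 8 neurons, 4-dimensional inputs

def hidden_activations(X):
    """ReLU activation vector of the hidden layer for each input row."""
    return np.maximum(0.0, X @ W1.T)

X = rng.normal(size=(100, 4))               # a batch of inputs
A = hidden_activations(X)                   # 100 activation vectors in R^8
patterns = {tuple(r) for r in (A > 0).astype(int)}  # distinct firing patterns
```

Each row of `A` is one point in the activation space whose shape and organization the visualizations discussed above are designed to reveal; the binary firing patterns give a crude first summary of how inputs partition that space.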

14.
This brief paper presents an M-matrix-based algebraic criterion for the global exponential stability of a class of recurrent neural networks with decreasing time-varying delays. The criterion improves some previous criteria based on M-matrices and is easy to verify using the connection weights of the recurrent neural networks with decreasing time-varying delays. In addition, the rate of exponential convergence can be estimated via a simple computation based on the criterion herein.

15.
Oscillatory and synchronized neural activities are commonly found in the brain, and evidence suggests that many of them are caused by global feedback. Their mechanisms and roles in information processing have been discussed often using purely feedforward networks or recurrent networks with constant inputs. On the other hand, real recurrent neural networks are abundant and continually receive information-rich inputs from the outside environment or other parts of the brain. We examine how feedforward networks of spiking neurons with delayed global feedback process information about temporally changing inputs. We show that the network behavior is more synchronous as well as more correlated with and phase-locked to the stimulus when the stimulus frequency is resonant with the inherent frequency of the neuron or that of the network oscillation generated by the feedback architecture. The two eigenmodes have distinct dynamical characteristics, which are supported by numerical simulations and by analytical arguments based on frequency response and bifurcation theory. This distinction is similar to the class I versus class II classification of single neurons according to the bifurcation from quiescence to periodic firing, and the two modes depend differently on system parameters. These two mechanisms may be associated with different types of information processing.

16.
Process neuron networks and their application in time-varying information processing
To address problems such as time-varying information processing and dynamic system modeling, two network models are established: a process neuron network whose inputs and outputs are both time-varying functions, and a rational-form process neuron network. In the former, the time-accumulation operator of the process neuron is taken as an integral over time or another algebraic operation; its spatio-temporal aggregation mechanism and activation can simultaneously reflect both the spatial aggregation effect and the temporal accumulation effect of external time-varying input signals on the output, realizing complex nonlinear mappings between the inputs and outputs of a nonlinear system. In the rational-form process neuron network, the basic information processing unit consists of two process neurons appearing as a dual pair, logically divided into a numerator part and a denominator part; the output is produced after rational-form integration, which effectively improves the network's flexible approximation of process functions with singular values and its sensitivity near singular points. The properties of the two process neuron network models are analyzed and concrete learning algorithms are given. Taking oilfield development process simulation and rotating machinery fault diagnosis as examples, the effectiveness of the two network models in time-varying information processing is verified.
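The core idea of a process neuron is that weights and inputs are functions of time, aggregated by a time-accumulation operator before the activation is applied. A minimal sketch, assuming sampled signals and choosing the integral form of the operator (the function name and discretization are illustrative; the models above allow other algebraic aggregation operations):

```python
import numpy as np

def process_neuron(w, x, t, f=np.tanh, theta=0.0):
    """Toy process neuron: aggregate a time-varying input x(t) against a
    time-varying weight w(t) by numerically integrating w(t)*x(t) over
    the sampled interval, then apply the activation f to the result."""
    y = w * x
    integral = np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(t))  # trapezoid rule
    return f(integral - theta)
```

For example, with a constant weight w(t) = 1 and input x(t) = t on [0, 1], the accumulated value is the integral 1/2, to which the activation and threshold are then applied.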

17.
State estimation for delayed neural networks
In this letter, the state estimation problem is studied for neural networks with time-varying delays. The interconnection matrix and the activation functions are assumed to be norm-bounded. The problem addressed is to estimate the neuron states, through available output measurements, such that for all admissible time-delays, the dynamics of the estimation error is globally exponentially stable. An effective linear matrix inequality approach is developed to solve the neuron state estimation problem. In particular, we derive the conditions for the existence of the desired estimators for the delayed neural networks. We also parameterize the explicit expression of the set of desired estimators in terms of linear matrix inequalities (LMIs). Finally, it is shown that the main results can be easily extended to cope with the traditional stability analysis problem for delayed neural networks. Numerical examples are included to illustrate the applicability of the proposed design method.
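The estimator structure behind such state estimation results is the Luenberger observer: a copy of the dynamics corrected by the output mismatch. A heavily simplified sketch on a linear, delay-free toy system with a hand-picked gain (the paper instead obtains the gain from LMI feasibility conditions for delayed nonlinear networks, which is not reproduced here):

```python
import numpy as np

# Toy system and hand-picked observer gain; all values are illustrative.
A = np.array([[-2.0, 0.5], [0.3, -1.5]])   # network dynamics x' = A x
C = np.array([[1.0, 0.0]])                 # output y = C x (partial measurement)
L = np.array([[1.0], [0.5]])               # observer gain

def run_observer(T=10.0, dt=1e-3):
    """Euler simulation of the plant and the Luenberger-style estimator
    xh' = A xh + L (y - C xh); returns the final estimation error norm.
    The error obeys e' = (A - L C) e, which is Hurwitz for this gain,
    so the estimate converges to the true state."""
    x = np.array([1.0, -1.0])   # true state (unknown to the estimator)
    xh = np.zeros(2)            # estimate, initialized with no knowledge
    for _ in range(int(T / dt)):
        y = C @ x                                  # available measurement
        x = x + dt * (A @ x)
        xh = xh + dt * (A @ xh + L @ (y - C @ xh))
    return float(np.linalg.norm(x - xh))
```

The design question the letter answers is precisely when such a gain exists for delayed nonlinear networks and how to compute it, which is what the LMI conditions encode.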

18.
In this paper, we present the analytical results on the global exponential periodicity of a class of recurrent neural networks with oscillating parameters and time-varying delays. Sufficient conditions are derived for ascertaining the existence, uniqueness and global exponential periodicity of the oscillatory solution of such recurrent neural networks by using the comparison principle and mixed monotone operator method. The periodicity results extend or improve existing stability results for the class of recurrent neural networks with and without time delays.

19.
In this paper, we introduce some ideas of switched systems into the field of neural networks and a large class of switched recurrent neural networks (SRNNs) with time-varying structured uncertainties and time-varying delay is investigated. Some delay-dependent robust periodicity criteria guaranteeing the existence, uniqueness, and global asymptotic stability of periodic solution for all admissible parametric uncertainties are devised by taking the relationship between the terms in the Leibniz-Newton formula into account. Because free weighting matrices are used to express this relationship and the appropriate ones are selected by means of linear matrix inequalities (LMIs), the criteria are less conservative than existing ones reported in the literature for delayed neural networks with parameter uncertainties. Some examples are given to show that the proposed criteria are effective and are an improvement over previous ones.

20.

The aim of this article is to study almost periodic dynamical behaviors of complex-valued recurrent neural networks with discontinuous activation functions and time-varying delays. We construct an equivalent differential equation with discontinuous right-hand side by decomposing the complex-valued neural network into its real and imaginary parts. Based on differential inclusion theory, the diagonal dominance principle, and the generalized Lyapunov function method from nonsmooth analysis, we establish the existence, uniqueness and global stability of the almost periodic solution of the equivalent delayed differential network. In particular, we derive a series of results on the equivalent neural networks with discontinuous activation functions and constant coefficients as well as periodic coefficients, respectively. Finally, we give a numerical example to demonstrate the effectiveness and feasibility of the derived theoretical results.

