Similar Documents
20 similar documents found (search time: 15 ms).
1.
The Bartels–Stewart algorithm is an effective and widely used method, with O(n^3) time complexity, for solving a static Sylvester equation. When applied to the time-varying Sylvester equation, however, its computational burden grows rapidly as the sampling period decreases, and it cannot meet continuous real-time computation requirements. Gradient-based recurrent neural networks can solve the time-varying Sylvester equation in real time, but there always exists an estimation error. In contrast, the recently proposed Zhang neural network has been proven to converge to the exact solution of the Sylvester equation as time goes to infinity. However, with the previously suggested activation functions this network never reaches the desired value in finite time, which may limit its applications in real-time processing. To tackle this problem, a sign-bi-power activation function is proposed in this paper to accelerate the Zhang neural network to finite-time convergence. The global convergence and finite-time convergence properties are proven in theory, and the upper bound of the convergence time is derived analytically. Simulations are performed to evaluate the performance of the neural network with the proposed activation function. In addition, the proposed strategy is applied to the online calculation of the matrix pseudo-inverse and to the nonlinear control of an inverted pendulum system. Both theoretical analysis and numerical simulations validate the effectiveness of the proposed activation function.
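To make the design concrete, here is a minimal NumPy sketch of a ZNN with the sign-bi-power activation for a time-varying Sylvester equation A(t)X(t) + X(t)B(t) = C(t). The coefficient matrices, gain gamma, and step size are illustrative assumptions, not values from the paper; the error E = AX + XB - C is forced to obey dE/dt = -gamma*Phi(E), and the resulting Sylvester equation in dX/dt is solved at each step with SciPy.

```python
import numpy as np
from scipy.linalg import solve_sylvester

def sign_bi_power(E, r=0.5):
    # Phi(e) = 0.5*|e|^r*sign(e) + 0.5*|e|^(1/r)*sign(e), applied elementwise
    return 0.5 * (np.abs(E) ** r + np.abs(E) ** (1.0 / r)) * np.sign(E)

# Illustrative time-varying coefficients and their analytic derivatives (not from the paper).
A  = lambda t: np.array([[2 + np.sin(t), 0.0], [0.0, 2 + np.cos(t)]])
dA = lambda t: np.array([[np.cos(t), 0.0], [0.0, -np.sin(t)]])
B  = lambda t: np.array([[1.0, 0.2 * np.sin(t)], [0.0, 1.0]])
dB = lambda t: np.array([[0.0, 0.2 * np.cos(t)], [0.0, 0.0]])
C  = lambda t: np.array([[np.sin(t), np.cos(t)], [-np.cos(t), np.sin(t)]])
dC = lambda t: np.array([[np.cos(t), -np.sin(t)], [np.sin(t), np.cos(t)]])

gamma, dt = 10.0, 1e-4
X, t = np.zeros((2, 2)), 0.0
while t < 2.0:
    E = A(t) @ X + X @ B(t) - C(t)                       # matrix-valued error function
    rhs = -dA(t) @ X - X @ dB(t) + dC(t) - gamma * sign_bi_power(E)
    X += dt * solve_sylvester(A(t), B(t), rhs)           # A(t) Xdot + Xdot B(t) = rhs
    t += dt

print("residual norm:", np.linalg.norm(A(t) @ X + X @ B(t) - C(t)))
```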

2.
A new kind of recurrent neural network is presented for solving the Lyapunov equation with time-varying coefficient matrices. Different from other neural-computation approaches, this network is developed by following Zhang et al.'s design method and is thus capable of solving the time-varying Lyapunov equation. The resultant Zhang neural network (ZNN) with implicit dynamics globally exponentially converges to the exact time-varying solution of such a Lyapunov equation. Computer-simulation results substantiate that the proposed recurrent neural network achieves much better performance in solving the Lyapunov equation with time-varying coefficient matrices than conventional gradient-based neural networks (GNNs).
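The following NumPy sketch (illustrative, not the paper's code) contrasts the two design principles on a time-varying Lyapunov equation A(t)^T X + X A(t) + C(t) = 0: the GNN descends the scalar energy ||E||_F^2/2 of the error E = A^T X + X A + C, while the ZNN uses the coefficient derivatives and forces the matrix-valued error itself to obey dE/dt = -gamma*E through its implicit dynamics. All matrices and gains are assumed example values.

```python
import numpy as np
from scipy.linalg import solve_sylvester

A  = lambda t: np.array([[-3.0 + 0.5 * np.sin(t), 0.2], [0.0, -3.0 + 0.5 * np.cos(t)]])
dA = lambda t: np.array([[0.5 * np.cos(t), 0.0], [0.0, -0.5 * np.sin(t)]])
C  = lambda t: np.array([[2.0, np.sin(t)], [np.sin(t), 2.0]])
dC = lambda t: np.array([[0.0, np.cos(t)], [np.cos(t), 0.0]])
gamma, dt = 50.0, 1e-4

Xg, Xz, t = np.zeros((2, 2)), np.zeros((2, 2)), 0.0
while t < 3.0:
    Eg = A(t).T @ Xg + Xg @ A(t) + C(t)
    Xg += dt * (-gamma * (A(t) @ Eg + Eg @ A(t).T))           # GNN: gradient descent on energy
    Ez = A(t).T @ Xz + Xz @ A(t) + C(t)
    rhs = -dA(t).T @ Xz - Xz @ dA(t) - dC(t) - gamma * Ez      # ZNN: uses dA/dt and dC/dt
    Xz += dt * solve_sylvester(A(t).T, A(t), rhs)              # implicit dynamics solved per step
    t += dt

res = lambda X: np.linalg.norm(A(t).T @ X + X @ A(t) + C(t))
print(f"GNN residual: {res(Xg):.2e}   ZNN residual: {res(Xz):.2e}")
```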

3.
Differing from gradient-based neural networks (GNNs), a special kind of recurrent neural network has recently been proposed by Zhang et al. for real-time inversion of time-varying matrices. The design of such a recurrent neural network is based on a matrix-valued error function instead of a scalar-valued norm-based energy function, and it is described by implicit rather than explicit dynamics. This paper investigates the simulation and verification of such a Zhang neural network (ZNN). Four important simulation techniques are employed: (1) the Kronecker product of matrices is introduced to transform the matrix differential equation (MDE) into a vector differential equation (VDE), i.e., finally into a standard ordinary-differential-equation (ODE) formulation; (2) the MATLAB routine “ode45” with a mass-matrix option is used to simulate the transformed initial-value implicit ODE system; (3) matrix derivatives are obtained using the routine “diff” and the Symbolic Math Toolbox; (4) various implementation errors and different types of activation functions are investigated, further demonstrating the advantages of the ZNN model. Three illustrative computer-simulation examples substantiate the theoretical results and the efficacy of the ZNN model for online time-varying matrix inversion.
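A Python sketch of simulation techniques (1)-(2) above, under illustrative assumptions: the ZNN matrix-inversion dynamics A(t) Xdot = -dA(t) X - gamma*(A(t) X - I) are vectorized with the Kronecker identity vec(A Xdot) = (I kron A) vec(Xdot) and integrated with SciPy's solve_ivp as a stand-in for MATLAB's ode45 with the mass-matrix option; the example A(t) and gain are not from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

n = 2
A  = lambda t: np.array([[2 + np.sin(t), np.cos(t)], [-np.cos(t), 2 + np.sin(t)]])
dA = lambda t: np.array([[np.cos(t), -np.sin(t)], [np.sin(t), np.cos(t)]])
gamma = 100.0

def vde(t, x):
    X = x.reshape((n, n), order='F')                        # de-vectorize (column-major vec)
    rhs = -dA(t) @ X - gamma * (A(t) @ X - np.eye(n))       # right-hand side of the MDE
    M = np.kron(np.eye(n), A(t))                            # mass matrix: vec(A Xdot) = (I kron A) vec(Xdot)
    return np.linalg.solve(M, rhs.reshape(-1, order='F'))   # vec(Xdot)

x0 = np.eye(n).reshape(-1, order='F')                       # arbitrary initial state
sol = solve_ivp(vde, (0.0, 5.0), x0, rtol=1e-8, atol=1e-10)
X_end = sol.y[:, -1].reshape((n, n), order='F')
print("|| A(5) X(5) - I ||_F =", np.linalg.norm(A(5.0) @ X_end - np.eye(n)))
```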

4.
This paper presents new stability results for recurrent neural networks with Markovian switching. First, algebraic criteria for the almost sure exponential stability of recurrent neural networks with Markovian switching and without time delays are derived. The results show that the almost sure exponential stability of such a neural network does not require the stability of the neural network at every individual parametric configuration. Next, both delay-dependent and delay-independent criteria for the almost sure exponential stability of recurrent neural networks with time-varying delays and Markovian-switching parameters are derived by means of a generalized stochastic Halanay inequality. The results herein include existing ones for recurrent neural networks without Markovian switching as special cases. Finally, simulation results in three numerical examples are discussed to illustrate the theoretical results.

5.
In this paper, fixed-final-time optimal control laws using neural networks and HJB equations are proposed for general nonlinear systems that are affine in the input. The method uses Kronecker matrix methods along with neural-network approximation over a compact set to solve a time-varying HJB equation. The result is a neural-network feedback controller with time-varying coefficients found by a priori offline tuning. Convergence results are shown, and the approach is demonstrated on an example.

6.
Zeng Z, Wang J. Neural Computation, 2007, 19(8): 2149-2182.
In this letter, sufficient conditions are obtained to guarantee that recurrent neural networks with linear saturation activation functions and time-varying delays have multiple equilibria located in the saturation region and on the boundaries of the saturation region. These results on pattern characterization are used to analyze and design autoassociative memories based directly on the parameters of the neural networks. Moreover, a formula for the number of spurious equilibria is derived. Four design procedures for recurrent neural networks with linear saturation activation functions and time-varying delays are developed based on the stability results; two of them allow the neural network to be capable of learning and forgetting. Finally, simulation results demonstrate the validity and characteristics of the proposed approach.

7.
In this work, a novel method based upon Hopfield neural networks is proposed for parameter estimation in the context of system identification. The equation of the neural estimator stems from the applicability of Hopfield networks to optimization problems, but the weights and biases of the resulting network are time-varying, since the target function also varies with time; hence the stability of the method cannot be taken for granted. To compare the novel technique with the classical gradient method, simulations have been carried out for a linearly parameterized system, and the results show that the Hopfield network is more efficient than the gradient estimator, attaining lower error and less oscillation. Thus the neural method is validated as an online estimator of the time-varying parameters appearing in the model of a nonlinear physical system.

8.
Conventional recurrent-neural-network computation methods adopt asymptotically convergent network models, whose error functions converge to zero only asymptotically; in theory, an infinitely long computation time is required to obtain the exact solution of the problem being solved. This paper proposes a terminal recurrent neural network model. The network takes a novel form, possesses finite-time convergence, and, when applied to time-varying matrix computation problems, makes the computation converge rapidly and with high accuracy. Another feature of the network is that the right-hand side of its dynamic equation is bounded, which makes it easy to implement. First, the deficiencies of asymptotically convergent network models in solving time-varying computation problems are analyzed to motivate the terminal network model; then the dynamic equation of the terminal network is given and an explicit expression for its convergence time is derived. For time-varying matrix inversion and pseudo-inversion, an error function is defined and a terminal recurrent neural network is constructed from it, so that the exact solution is obtained once the computation converges within finite time. After the trajectory-planning task of a redundant manipulator starting from an arbitrary initial position is converted into a quadratic program, the proposed neural network is used for the computation; the resulting joint-angle trajectories drive the end-effector to track a closed path while the joint angles return exactly to their initial positions, thereby achieving repeatable motion. The time-varying matrix computation problems and the robot trajectory-planning task are simulated in MATLAB/Simulink; comparing the computation processes and results obtained with the asymptotic and terminal network models shows that the terminal network model converges faster and markedly improves computational accuracy. The solution of different time-varying computation problems illustrates the application background of the proposed neural network.
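A scalar sketch of the contrast drawn above, using one common terminal-attractor form (the paper's specific dynamic equation and parameters may differ): the asymptotic model edot = -gamma*e only approaches zero as t goes to infinity, while the terminal model edot = -gamma*|e|^rho*sign(e) with 0 < rho < 1 has a bounded right-hand side and reaches zero within the finite time |e(0)|^(1-rho)/(gamma*(1-rho)).

```python
import numpy as np

gamma, rho, dt, e0 = 2.0, 0.5, 1e-4, 1.0
t_bound = abs(e0) ** (1 - rho) / (gamma * (1 - rho))   # finite convergence time of the terminal model
e_asym, e_term, t = e0, e0, 0.0

while t < 2.0 * t_bound:
    e_asym += dt * (-gamma * e_asym)                                   # asymptotic model
    e_term += dt * (-gamma * np.abs(e_term) ** rho * np.sign(e_term))  # terminal model (bounded RHS)
    t += dt

print(f"predicted terminal convergence time: {t_bound:.3f} s")
print(f"asymptotic error at t = {t:.2f} s: {e_asym:.2e}")
print(f"terminal  error at t = {t:.2f} s: {abs(e_term):.2e}")
```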

9.
In view of their great potential for parallel processing and ready hardware implementation, neural networks are now often employed to solve online nonlinear matrix-equation problems. Recently, a novel class of neural networks, termed Zhang neural network (ZNN), has been formally proposed by Zhang et al. for solving online time-varying problems. Such a neural-dynamic system is elegantly designed by defining an indefinite matrix-valued error-monitoring function, called a Zhang function (ZF); the dynamical system is then cast in the form of a first-order differential equation using matrix notation. In this paper, different indefinite ZFs, which lead to different ZNN models, are proposed and developed as error-monitoring functions for time-varying matrix square-root finding. Towards the final purpose of field-programmable gate array (FPGA) and application-specific integrated circuit (ASIC) realization, MATLAB Simulink modeling and verification of such ZNN models are further investigated for the online solution of time-varying matrix square roots. Both theoretical analysis and modeling results substantiate the efficacy of the proposed ZNN models for time-varying matrix square-root finding.
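As an illustration of the ZF-based design (a Python/SciPy sketch rather than the paper's Simulink models, with example matrices and a tanh activation chosen here): one possible Zhang function is E(t) = X(t)^2 - A(t), and imposing dE/dt = -gamma*Phi(E) yields Xdot*X + X*Xdot = Adot - gamma*Phi(E), a Sylvester equation in Xdot that is solved at every integration step.

```python
import numpy as np
from scipy.linalg import solve_sylvester

A   = lambda t: np.array([[4.0 + np.sin(t), 1.0], [1.0, 4.0 + np.cos(t)]])
dA  = lambda t: np.array([[np.cos(t), 0.0], [0.0, -np.sin(t)]])
phi = np.tanh                   # any monotonically increasing odd activation; tanh used here
gamma, dt = 50.0, 1e-4

X, t = 2.0 * np.eye(2), 0.0     # start near sqrt(A(0)) so X stays positive definite
while t < 3.0:
    E = X @ X - A(t)                          # Zhang function (error-monitoring function)
    rhs = dA(t) - gamma * phi(E)
    X += dt * solve_sylvester(X, X, rhs)      # X Xdot + Xdot X = rhs
    t += dt

print("|| X^2 - A(3) ||_F =", np.linalg.norm(X @ X - A(3.0)))
```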

10.
This paper presents new theoretical results on global exponential stability of recurrent neural networks with bounded activation functions and time-varying delays. The stability conditions depend on external inputs, connection weights, and time delays of recurrent neural networks. Using these results, the global exponential stability of recurrent neural networks can be derived, and the estimated location of the equilibrium point can be obtained. As typical representatives, the Hopfield neural network (HNN) and the cellular neural network (CNN) are examined in detail.

11.
Following the idea of using first-order time derivatives, this paper presents a general recurrent neural network (RNN) model for online inversion of time-varying matrices. Different kinds of activation functions are investigated to guarantee the global exponential convergence of the neural model to the exact inverse of a given time-varying matrix. The robustness of the proposed neural model is also studied with respect to different activation functions and various implementation errors. Simulation results, including the application to kinematic control of redundant manipulators, substantiate the theoretical analysis and demonstrate the efficacy of the neural model on time-varying matrix inversion, especially when using a power-sigmoid activation function.
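For reference, a small sketch of the power-sigmoid activation commonly used in this line of work: phi(e) = e^p for |e| >= 1 (p an odd integer >= 3), and a scaled sigmoid that matches the power branch at |e| = 1 otherwise. The parameter values p = 3 and xi = 4 are typical choices in the ZNN literature, not necessarily those of this paper.

```python
import numpy as np

def power_sigmoid(e, p=3, xi=4.0):
    # Power branch for |e| >= 1, scaled-sigmoid branch for |e| < 1; both meet at |e| = 1.
    e = np.asarray(e, dtype=float)
    sig = (1 + np.exp(-xi)) / (1 - np.exp(-xi)) \
          * (1 - np.exp(-xi * e)) / (1 + np.exp(-xi * e))
    return np.where(np.abs(e) >= 1.0, e ** p, sig)

# Outside the unit interval the power branch amplifies large errors (fast initial transient);
# inside it the sigmoid branch keeps a high gain near zero.
print(power_sigmoid([-2.0, -0.5, 0.0, 0.5, 2.0]))
```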

12.
Gong Jianqiang, Jin Jie. Neural Processing Letters, 2021, 53(5): 3591-3606.
In this paper, a new zeroing neural network (NZNN) with a new activation function (AF) is presented and investigated for solving the dynamic Sylvester equation (DSE). The...

13.
The polynomial Diophantine matrix equation and the generalized Sylvester matrix equation are important for controller design in frequency-domain linear system theory and time-domain linear system theory, respectively. By using the so-called generalized Sylvester mapping, right coprime factorization, and a Bezout identity associated with certain polynomial matrices, we present in this note a unified parametrization of the solutions to both classes of matrix equations. Moreover, it is shown that solutions to the generalized Sylvester matrix equation can be obtained if solutions to the Diophantine matrix equation are available. The results disclose a relationship between the polynomial Diophantine matrix equation and the generalized Sylvester matrix equation, which are studied and used in frequency-domain and time-domain linear system theory, respectively.

14.
In addition to their parallel-distributed nature, recurrent neural networks can be implemented physically in dedicated hardware and have therefore found broad application in many fields. In this paper, a special class of recurrent neural network named Zhang neural network (ZNN), together with its electronic realization, is investigated and exploited for the online solution of time-varying linear matrix equations. By following the idea of the Zhang function (i.e., error function), two ZNN models are proposed and studied, which allow plentiful choices of activation function (e.g., any monotonically increasing odd activation function). It is theoretically proved that the two ZNN models globally and exponentially converge to the theoretical solution of time-varying linear matrix equations when linear activation functions are used. Besides, a new activation function, named the Li activation function, is exploited; it is theoretically proved that, with the Li activation function, the two ZNN models can be further accelerated to finite-time convergence to the time-varying theoretical solution, and the upper bound of the convergence time is derived analytically via Lyapunov theory. Extensive simulations with the two ZNN models substantiate the theoretical analysis and the efficacy of the proposed models for solving time-varying linear matrix equations.
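A compact sketch (illustrative values only, not the authors' models) of one ZF-based ZNN for the time-varying linear matrix equation A(t)X(t) = B(t): with E = AX - B and dE/dt = -gamma*Phi(E), the explicit dynamics are Xdot = A^{-1}(Bdot - Adot*X - gamma*Phi(E)). Running it with the linear activation and then with the sign-bi-power ("Li") activation contrasts exponential with finite-time convergence of the residual.

```python
import numpy as np

A  = lambda t: np.array([[3 + np.sin(t), 1.0], [0.5, 3 + np.cos(t)]])
dA = lambda t: np.array([[np.cos(t), 0.0], [0.0, -np.sin(t)]])
B  = lambda t: np.array([[np.cos(t), 1.0], [np.sin(t), 2.0]])
dB = lambda t: np.array([[-np.sin(t), 0.0], [np.cos(t), 0.0]])

linear = lambda E: E
li     = lambda E, r=0.5: 0.5 * (np.abs(E) ** r + np.abs(E) ** (1 / r)) * np.sign(E)

def run(phi, gamma=5.0, dt=1e-4, t_end=1.0):
    X, t = np.zeros((2, 2)), 0.0
    while t < t_end:
        E = A(t) @ X - B(t)
        X += dt * np.linalg.solve(A(t), dB(t) - dA(t) @ X - gamma * phi(E))
        t += dt
    return np.linalg.norm(A(t_end) @ X - B(t_end))

print("residual with linear activation:", run(linear))
print("residual with Li activation    :", run(li))
```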

15.

The target of this article is to study the almost periodic dynamical behaviors of complex-valued recurrent neural networks with discontinuous activation functions and time-varying delays. We construct an equivalent equation with a discontinuous right-hand side by decomposing the real and imaginary parts of the complex-valued neural networks. Based on differential inclusion theory, the diagonal dominance principle, nonsmooth analysis, and the generalized Lyapunov function method, we establish the existence, uniqueness, and global stability of the almost periodic solution of the equivalent delayed differential network. In particular, we derive a series of results on the equivalent neural networks with discontinuous activation functions and with constant as well as periodic coefficients. Finally, we give a numerical example to demonstrate the effectiveness and feasibility of the derived theoretical results.

16.
Kong Ying, Hu Tanglong, Lei Jingsheng, Han Renji. Neural Processing Letters, 2022, 54(1): 125-144.
Zhang neural network (ZNN), a special recurrent neural network, has recently been established as an effective alternative for time-varying linear equations with...

17.
When a neural network is applied to optimization, the ideal situation is that it has a single globally asymptotically stable equilibrium point and approaches that equilibrium at an exponential rate, which reduces the computation time the network requires. This paper studies the global asymptotic stability of recurrent neural networks with time-varying delays. The model under study is first transformed into a descriptor-system model; then, by means of the Lyapunov-Krasovskii stability theorem, the linear matrix inequality (LMI) technique, the S-procedure, and algebraic inequality methods, new sufficient conditions ensuring the asymptotic stability of recurrent neural networks with time-varying delays are obtained. Applying them to neural networks with constant delays and to delayed cellular neural network models yields the corresponding global asymptotic stability conditions. Theoretical analysis and numerical simulations show that the results provide new stability criteria for delayed recurrent neural networks.

18.
This paper studies the global robust stability of a class of interval recurrent neural networks with mixed time-varying delays (discrete and distributed delays). Unlike previous approaches, a new augmented Lyapunov-Krasovskii functional is employed, which yields a class of novel delay-dependent global robust stability criteria for interval recurrent neural networks. Because integral terms involving the activation functions are used in the new augmented functional for the first time, the relationship between the system states and the activation functions is represented more accurately, so the proposed criteria are less conservative. Moreover, the proposed criteria relax the restriction that the rate of change of the time-varying delays must be less than 1. Simulation results further demonstrate the effectiveness of the obtained results.

19.
This paper studies stationary oscillation for a time-varying recurrent cellular neural network with time delays and impulses. In a recent paper, the authors claim to obtain a criterion for the existence, uniqueness, and global exponential stability of a periodic solution (i.e., stationary oscillation) for a recurrent cellular neural network with time delays and impulses. We point out that the main result of that paper is incorrect, and we present a sufficient condition for stationary oscillation of time-varying recurrent cellular neural networks with time delays and impulses. A numerical example is given to illustrate the effectiveness of the obtained result.

20.