Similar Documents
20 similar documents found (search time: 31 ms)
1.
An ε-twin support vector machine for regression
A special class of recurrent neural network termed Zhang neural network (ZNN), described by implicit dynamics, has recently been introduced for the online solution of time-varying convex quadratic programming (QP) problems. Global exponential convergence of such a ZNN model is guaranteed theoretically in the error-free situation. This paper analyzes the performance of the perturbed ZNN model when a special type of activation function (namely, the power-sum activation function) is used to solve time-varying QP problems. Robustness analysis and simulation results demonstrate the superiority of power-sum activation functions in the presence of large ZNN-implementation errors, compared with linear activation functions. Furthermore, an application to inverse kinematic control of a redundant robot arm verifies the feasibility and effectiveness of the ZNN model for solving time-varying QP problems.
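As a concrete illustration of the power-sum activation functions mentioned above, here is a minimal sketch; the function name and the default number of terms N are assumptions for illustration:

```python
def power_sum(e, N=3):
    """Power-sum activation: phi(e) = e + e**3 + ... + e**(2N-1).

    Odd powers keep phi odd and monotonically increasing, which
    amplifies large errors; this is what improves robustness of the
    ZNN dynamics against large implementation errors.
    """
    return sum(e ** (2 * k - 1) for k in range(1, N + 1))
```

In the ZNN error dynamics de/dt = -gamma * phi(e), replacing the linear activation phi(e) = e with this power-sum variant accelerates convergence for |e| > 1 while behaving near-linearly for small errors.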

2.
This paper presents and investigates the application of a Zhang neural network (ZNN) activated by the Li function to kinematic control of redundant robot manipulators via time-varying Jacobian-matrix pseudoinversion. That is, by using the Li activation function and computing the time-varying pseudoinverse of the manipulator's Jacobian matrix, the resultant ZNN model is applied to redundant-manipulator kinematic control. Note that the ZNN differs from the conventional gradient neural network in nine aspects of the research methodology. More importantly, the Li-function activated ZNN (LFAZNN) model has the property of finite-time convergence, showing its suitability for redundant-manipulator kinematic control. Simulation results based on a four-link planar robot manipulator and a PA10 robot manipulator further demonstrate the effectiveness of the presented LFAZNN model, as well as its application prospects.

3.
In view of their great potential for parallel processing and ready hardware implementation, neural networks are now often employed to solve online nonlinear matrix-equation problems. Recently, a novel class of neural networks, termed Zhang neural network (ZNN), has been formally proposed by Zhang et al. for solving online time-varying problems. Such a neural-dynamic system is elegantly designed by defining an indefinite matrix-valued error-monitoring function, called the Zhang function (ZF); the dynamical system is then cast as a first-order differential equation in matrix notation. In this paper, different indefinite ZFs, leading to different ZNN models, are proposed and developed as error-monitoring functions for finding time-varying matrix square roots. Toward the final goal of field-programmable gate array (FPGA) and application-specific integrated circuit (ASIC) realization, MATLAB Simulink modeling and verification of these ZNN models are further investigated for the online solution of time-varying matrix square roots. Both theoretical analysis and modeling results substantiate the efficacy of the proposed ZNN models for finding time-varying matrix square roots.

4.
Zhang neural networks (ZNN), a special kind of recurrent neural network (RNN) with implicit dynamics, have recently been introduced and generalized for the online solution of time-varying problems. In comparison with conventional gradient-based neural networks, such RNN models are elegantly designed by defining matrix-valued indefinite error functions. In this paper, we generalize, investigate and analyze ZNN models for online time-varying full-rank matrix Moore–Penrose inversion. Computer-simulation results and an application to inverse kinematic control of redundant robot arms demonstrate the feasibility and effectiveness of the ZNN models for online time-varying full-rank matrix Moore–Penrose inversion.

5.
Kong, Ying; Hu, Tanglong; Lei, Jingsheng; Han, Renji. Neural Processing Letters (2022), 54(1): 125-144
Neural Processing Letters - Zhang neural network (ZNN), a special recurrent neural network, has recently been established as an effective alternative for time-varying linear equations with...

6.
Differing from gradient-based neural networks (GNN), a special kind of recurrent neural network has recently been proposed by Zhang et al. for real-time inversion of time-varying matrices. The design of such a recurrent neural network is based on a matrix-valued error function instead of a scalar-valued norm-based energy function, and it is described by implicit rather than explicit dynamics. This paper investigates the simulation and verification of such a Zhang neural network (ZNN). Four important simulation techniques are employed: (1) the Kronecker product of matrices is introduced to transform the matrix differential equation (MDE) into a vector differential equation (VDE), yielding a standard ordinary-differential-equation (ODE) formulation; (2) the MATLAB routine "ode45" with the mass-matrix option is used to simulate the transformed initial-value implicit ODE system; (3) matrix derivatives are obtained using the routine "diff" and the Symbolic Math Toolbox; (4) various implementation errors and different types of activation functions are investigated, further demonstrating the advantages of the ZNN model. Three illustrative computer-simulation examples substantiate the theoretical results and the efficacy of the ZNN model for online time-varying matrix inversion.
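The workflow above uses MATLAB's ode45 on the Kronecker-vectorized system; the same ZNN dynamics for time-varying matrix inversion can be sketched with a plain forward-Euler loop in NumPy. This is a minimal sketch with linear activation; the example matrix A(t) and the gain gamma are assumptions for illustration:

```python
import numpy as np

gamma = 10.0   # convergence gain
dt = 1e-3      # Euler step size

def A(t):
    # Example time-varying matrix to invert (assumed for illustration)
    return np.array([[2.0 + np.sin(t), 0.0],
                     [0.0, 2.0 + np.cos(t)]])

def A_dot(t):
    # Time derivative of A(t)
    return np.array([[np.cos(t), 0.0],
                     [0.0, -np.sin(t)]])

X = 0.4 * np.eye(2)   # initial state (need not be the true inverse)
t = 0.0
for _ in range(5000):              # integrate to t = 5 s
    E = A(t) @ X - np.eye(2)       # Zhang error function E(t) = A(t)X(t) - I
    # Implicit ZNN dynamics with linear activation:
    #   A(t) dX/dt = -dA/dt X - gamma * E
    X = X + dt * np.linalg.solve(A(t), -A_dot(t) @ X - gamma * E)
    t += dt
```

With the linear activation, the error E(t) decays as exp(-gamma * t), so by t = 5 s the state X tracks the time-varying inverse of A(t) to well within the Euler discretization error.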

7.
A special class of recurrent neural networks (RNN) has recently been proposed by Zhang et al. for solving online time-varying matrix problems. Unlike conventional gradient-based neural networks (GNN), such RNN (termed Zhang neural networks, ZNN) are designed based on matrix-valued error functions instead of scalar-valued norm-based energy functions. In this paper, we generalize and further investigate the ZNN model for finding time-varying matrix square roots. For possible hardware (e.g., digital-circuit) realization, a discrete-time ZNN model is constructed and developed, which incorporates Newton iteration as a special case. Besides, to obtain an appropriate step-size value in each iteration, a line-search algorithm is employed for the proposed discrete-time ZNN model. Computer-simulation results substantiate the effectiveness of the proposed ZNN model aided with the line-search algorithm, and clarify its connection to Newton iteration for matrix square root finding.

8.
A new kind of recurrent neural network is presented for solving the Lyapunov equation with time-varying coefficient matrices. Different from other neural-computation approaches, the neural network is developed by following Zhang et al.'s design method, which is capable of solving the time-varying Lyapunov equation. The resultant Zhang neural network (ZNN) with implicit dynamics could globally exponentially converge to the exact time-varying solution of such a Lyapunov equation. Computer-simulation results substantiate that the proposed recurrent neural network could achieve much superior performance on solving the Lyapunov equation with time-varying coefficient matrices, as compared to conventional gradient-based neural networks (GNN).

9.

In this paper, a finite-time convergent Zhang neural network (ZNN) is proposed and studied for matrix square root finding. Compared with the original ZNN (OZNN) model, the finite-time convergent ZNN (FTCZNN) model fully utilizes a nonlinearly activated sign-bi-power function and thus converges faster. In addition, the upper bound of the convergence time for the FTCZNN model is theoretically derived by solving differential inequalities. Simulation comparisons between the OZNN and FTCZNN models under the same conditions validate the effectiveness and superiority of the FTCZNN model for matrix square root finding.


10.
The Bartels–Stewart algorithm is an effective and widely used method, with O(n^3) time complexity, for solving a static Sylvester equation. When applied to the time-varying Sylvester equation, its computational burden grows sharply as the sampling period decreases, and it cannot meet continuous real-time requirements. Gradient-based recurrent neural networks can solve the time-varying Sylvester equation in real time, but there always exists an estimation error. In contrast, the recently proposed Zhang neural network has been proven to converge to the solution of the Sylvester equation ideally as time goes to infinity. However, with the previously suggested activation functions this network never converges to the desired value in finite time, which may limit its use in real-time processing. To tackle this problem, a sign-bi-power activation function is proposed in this paper to accelerate the Zhang neural network to finite-time convergence. The global convergence and finite-time convergence properties are proven in theory, and the upper bound of the convergence time is derived analytically. Simulations are performed to evaluate the performance of the neural network with the proposed activation function. In addition, the proposed strategy is applied to online calculation of the matrix pseudo-inverse and to nonlinear control of an inverted-pendulum system. Both theoretical analysis and numerical simulations validate the effectiveness of the proposed activation function.
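The sign-bi-power activation described above can be written down directly; r is the tunable exponent with 0 < r < 1, and the function name is an assumption for illustration:

```python
import numpy as np

def sign_bi_power(e, r=0.5):
    """Sign-bi-power activation (0 < r < 1):

        phi(e) = |e|**r * sign(e) + |e|**(1/r) * sign(e)

    The |e|**r term dominates near zero and drives finite-time
    convergence; the |e|**(1/r) term dominates for large errors
    and speeds up the initial transient.
    """
    return np.sign(e) * (np.abs(e) ** r + np.abs(e) ** (1.0 / r))
```

Used in the error dynamics de/dt = -gamma * sign_bi_power(e), the fractional-power term makes the error reach exactly zero in finite time, whereas a linear activation only yields exponential (asymptotic) decay.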

11.
To solve linear matrix equations, a recurrent neural network model based on the negative-gradient method is applied, and the global exponential convergence of this network for real-time solution of linear matrix equations is investigated. Building on the asymptotic-convergence analysis, it is further proved that this class of neural networks is globally exponentially convergent when the coefficient matrices satisfy the solvability condition, and globally stable when that condition is not met. Computer-simulation results confirm the theoretical analysis and the effectiveness of the network for solving linear matrix equations in real time.

12.
Neural Processing Letters - Several improvements of the Zhang neural network (ZNN) dynamics for solving the time-varying matrix inversion problem are presented. Introduced ZNN dynamical design is...

13.
Different from gradient-based neural dynamics, a special kind of recurrent neural dynamics has recently been proposed by Zhang et al. for solving online time-varying problems. Such recurrent neural dynamics are designed based on an indefinite error-monitoring function instead of the usual norm- or square-based energy function. In addition, Zhang neural dynamics (ZND) are generally depicted in implicit dynamics, whereas gradient-based neural dynamics (GND) are associated with explicit dynamics. In this paper, we generalize the ZND design method to the online solution of nonlinear time-varying equations of the form f(x, t) = 0. For comparative purposes, the GND model is also employed to solve such time-varying equations. Computer-simulation results via power-sigmoid activation functions substantiate the theoretical analysis and the efficacy of the ZND model for solving online nonlinear time-varying equations.
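For a scalar equation, the ZND design prescribes dx/dt = -(1/f_x) * (gamma * phi(f) + f_t), where f_x and f_t are the partial derivatives of f. A minimal sketch with linear activation and forward-Euler integration follows; the example equation f(x, t) = x^2 - (2 + sin t) and the gain gamma are assumptions for illustration:

```python
import math

gamma = 20.0   # convergence gain
dt = 1e-4      # Euler step size

def f(x, t):
    # Target: track the root of f(x, t) = x^2 - (2 + sin t) = 0
    return x * x - (2.0 + math.sin(t))

x, t = 1.0, 0.0
for _ in range(50000):                 # integrate to t = 5 s
    f_x = 2.0 * x                      # df/dx
    f_t = -math.cos(t)                 # df/dt (explicit time dependence)
    # ZND with linear activation: dx/dt = -(gamma * f + f_t) / f_x
    x += dt * (-(gamma * f(x, t) + f_t) / f_x)
    t += dt
```

Because the f_t term feeds the known time variation forward, the state x(t) tracks the moving root sqrt(2 + sin t) with an exponentially vanishing error, unlike a pure gradient scheme, which lags behind.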

14.
In this paper, the performance of a gradient neural network (GNN), designed intrinsically for solving static problems, is investigated, analyzed and simulated in the situation of time-varying coefficients. It is theoretically proved that the GNN, when solving online time-varying quadratic minimization (QM) and quadratic programming (QP) problems, can only approximately approach the time-varying theoretical solution, instead of converging to it exactly; that is, the steady-state error between the GNN solution and the theoretical solution cannot decrease to zero. To characterize this behavior, the upper bound of the error is estimated first, and the global exponential convergence rate towards this error bound is then investigated. Computer-simulation results, including those based on a six-link robot manipulator, further substantiate the performance analysis of the GNN exploited to solve online time-varying QM and QP problems.

15.
In this paper, two novel neural networks (NNNs), namely the NNN-L and NNN-R models, are proposed for online left and right Moore-Penrose inversion, respectively. Compared with the GNN (gradient neural network) and the recently proposed ZNN (Zhang neural network) for left or right Moore-Penrose inverse solving, our models are theoretically proven to possess superior global convergence performance. More importantly, the proposed NNN-R model is successfully applied to path-tracking control of a three-link planar robot manipulator. Illustrative examples validate the theoretical analyses and demonstrate the feasibility of the proposed models for real-time Moore-Penrose inverse solving; their effectiveness is further verified in kinematic control of a redundant manipulator.

16.
Different from conventional gradient-based neural dynamics, a special class of neural dynamics has been proposed by Zhang et al. since 12 March 2001 for the online solution of time-varying and static (or, time-invariant) problems, e.g., nonlinear equations. The design of Zhang dynamics (ZD) is based on the elimination of an indefinite error function, instead of the elimination of a square-based positive (or at least lower-bounded) energy function usually associated with gradient dynamics (GD) and/or Hopfield-type neural networks. In this paper, we generalize, develop, investigate and compare continuous-time ZD (CTZD) and GD models for the online solution of time-varying and static square roots. In addition, simplified continuous-time ZD (S-CTZD) and discrete-time ZD (DTZD) models are derived for finding static scalar-valued square roots. For this scalar square-root problem, the Newton iteration (also termed Newton-Raphson iteration) is found to be a special case of the DTZD models, obtained by focusing on the static problem, utilizing the linear activation function and fixing the step size to 1. Computer-simulation results via a power-sigmoid activation function further demonstrate the efficacy of the ZD solvers for online scalar (time-varying and static) square root finding, as well as the DTZD link to, and new interpretation of, Newton-Raphson iteration.
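The link to Newton iteration noted above is easy to verify for the static scalar square root: the DTZD update with linear activation and step size h reduces, at h = 1, exactly to Newton-Raphson. A minimal sketch (function name assumed):

```python
def dtzd_sqrt(a, x0=1.0, h=1.0, iters=20):
    """Discrete-time ZD for x**2 - a = 0 with linear activation.

    Update: x <- x - h * (x**2 - a) / (2x).
    At h = 1 this is exactly Newton-Raphson: x <- (x + a/x) / 2.
    """
    x = x0
    for _ in range(iters):
        x = x - h * (x * x - a) / (2.0 * x)
    return x
```

Algebraically, x - (x^2 - a)/(2x) = (x + a/x)/2, the classical Newton-Raphson update for sqrt(a), which is exactly the "special case" relationship stated in the abstract.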

17.
Neural Processing Letters - In the previous work, a typical recurrent neural network termed Zhang neural network (ZNN) has been developed for various time-varying problems solving. Based on the...

18.
This technical note presents theoretical analysis and simulation results on the performance of a classic gradient neural network (GNN), which was designed originally for constant matrix inversion but is here exploited for time-varying matrix inversion. Compared with the constant case, the GNN inverting a time-varying matrix can only approximately approach the time-varying theoretical inverse, instead of converging exactly; in other words, the steady-state error between the GNN solution and the exact inverse does not vanish. In this note, the upper bound of this error is estimated first, and the global exponential convergence rate towards the error bound is then analyzed for such a Hopfield-type neural network. Computer-simulation results substantiate the performance analysis of this GNN exploited to invert time-varying matrices online.
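The nonvanishing steady-state error is easy to reproduce numerically: a GNN derived from the energy function ||A(t)X - I||_F^2 / 2 tracks a time-varying inverse only up to a lag error that scales roughly like 1/gamma. A minimal forward-Euler sketch; the example matrix and the gain are assumptions for illustration:

```python
import numpy as np

gamma = 100.0   # learning gain
dt = 1e-3       # Euler step size

def A(t):
    # Example time-varying matrix (assumed for illustration)
    return np.array([[2.0 + np.sin(t), 0.0],
                     [0.0, 2.0 + np.cos(t)]])

X = 0.4 * np.eye(2)
t = 0.0
for _ in range(10000):   # integrate to t = 10 s
    # GNN dynamics: dX/dt = -gamma * A^T (A X - I)
    # (negative gradient of the norm-based energy function)
    X = X + dt * (-gamma * A(t).T @ (A(t) @ X - np.eye(2)))
    t += dt

# Residual after the transient has died out:
err = np.linalg.norm(A(t) @ X - np.eye(2))
```

Unlike the ZNN case, no time-derivative information about A(t) enters the dynamics, so the residual settles at a small but strictly nonzero plateau rather than decaying to zero.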

19.
Following the idea of using first-order time derivatives, this paper presents a general recurrent neural network (RNN) model for online inversion of time-varying matrices. Different kinds of activation functions are investigated to guarantee the global exponential convergence of the neural model to the exact inverse of a given time-varying matrix. The robustness of the proposed neural model is also studied with respect to different activation functions and various implementation errors. Simulation results, including the application to kinematic control of redundant manipulators, substantiate the theoretical analysis and demonstrate the efficacy of the neural model on time-varying matrix inversion, especially when using a power-sigmoid activation function.

20.
Different from conventional gradient-based neural dynamics, a special type of neural dynamics has been proposed by Zhang et al. for the online solution of time-varying and/or static (or, time-invariant) problems. The design of Zhang dynamics (ZD) is based on the elimination of an indefinite error function, instead of the elimination of a square-based positive (or at least lower-bounded) energy function usually associated with gradient dynamics (GD). In this paper, we generalize, propose and investigate the continuous-time ZD model and its discrete-time models in two situations (i.e., the time derivative of the coefficient being known or unknown) for time-varying cube root finding, including a complex-valued continuous-time ZD model for finding cube roots in the complex domain. In addition, to find the static scalar-valued cube root, a simplified continuous-time ZD model and its discrete-time model are generated. Focusing on this static problem, Newton-Raphson iteration is found to be a special case of the discrete-time ZD model, obtained by utilizing the linear activation function and fixing the step-size value to 1. Computer-simulation and testing results demonstrate the efficacy of the proposed ZD models (both real- and complex-valued) for time-varying and static cube root finding, as well as the link and new explanation to Newton-Raphson iteration.
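For the static scalar case described above, the discrete-time ZD update with linear activation and unit step size is again Newton-Raphson, and the same code runs unchanged in the complex domain since Python supports complex arithmetic natively. A minimal sketch (function name assumed):

```python
def zd_cbrt(a, x0, h=1.0, iters=40):
    """Discrete-time ZD for x**3 - a = 0 with linear activation.

    Update: x <- x - h * (x**3 - a) / (3 x**2).
    At h = 1 this is Newton-Raphson for the cube root.
    Works for complex a and x0 as well (complex-valued ZD model).
    """
    x = x0
    for _ in range(iters):
        x = x - h * (x ** 3 - a) / (3.0 * x ** 2)
    return x
```

From a complex initial point, the iteration converges to whichever of the three complex cube roots lies in the corresponding Newton basin; which root is reached depends on x0.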
