Improved neural solution for the Lyapunov matrix equation based on gradient search
Authors:Yuhuan Chen  Chenfu Yi  Dengyu Qiao
Affiliation:1. Center for Educational Technology, Gannan Normal University, Ganzhou 341000, China;2. Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences, Shenzhen 518055, China;3. School of Information Engineering, Jiangxi University of Science and Technology, Ganzhou 341000, China
Abstract:Using the hierarchical identification principle together with conventional gradient search, two neural subsystems are developed and investigated for the online solution of the well-known Lyapunov matrix equation. Theoretical analysis shows that, with any monotonically increasing odd activation function, the gradient-based neural network (GNN) models solve the Lyapunov equation exactly and efficiently. Computer simulation results confirm that the solutions of the presented GNN models globally converge to the solution of the Lyapunov matrix equation. Moreover, when power-sigmoid activation functions are used, the GNN models exhibit superior convergence compared with the linear models.
Keywords:Analysis of algorithms  Recurrent neural networks  Gradient search  Hierarchical identification principle  Energy function  Activation function
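The following is a minimal sketch (not the authors' code) of the gradient-search recipe described in the abstract: an energy function built from the residual of the Lyapunov equation A^T X + X A + Q = 0 is minimized by letting the network state evolve along the negative gradient, passed elementwise through a monotonically increasing odd activation function. The power-sigmoid parameters (p, xi), the learning rate gamma, the Euler step size, and the test matrices are illustrative assumptions.

```python
# Sketch of a gradient-based neural network (GNN) for the Lyapunov equation
# A^T X + X A + Q = 0, simulated by simple Euler integration.
import numpy as np

def power_sigmoid(u, p=3, xi=4.0):
    """Elementwise power-sigmoid activation (odd, monotonically increasing).
    p should be an odd integer >= 3; xi >= 1 sets the sigmoid steepness."""
    c = (1.0 + np.exp(-xi)) / (1.0 - np.exp(-xi))
    uc = np.clip(u, -1.0, 1.0)                       # sigmoid branch used only for |u| < 1
    sig = c * (1.0 - np.exp(-xi * uc)) / (1.0 + np.exp(-xi * uc))
    return np.where(np.abs(u) >= 1.0, u**p, sig)

def gnn_lyapunov(A, Q, gamma=10.0, dt=1e-3, steps=20000, activation=power_sigmoid):
    """Integrate dX/dt = -gamma * F(A E + E A^T), where E = A^T X + X A + Q
    is the residual and A E + E A^T is the gradient of ||E||_F^2 / 2 w.r.t. X."""
    n = A.shape[0]
    X = np.zeros((n, n))
    for _ in range(steps):
        E = A.T @ X + X @ A + Q          # residual of the Lyapunov equation
        grad = A @ E + E @ A.T           # gradient of the energy function
        X -= gamma * dt * activation(grad)
    return X

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = -np.eye(3) + 0.2 * rng.standard_normal((3, 3))   # a stable test matrix (assumed)
    Q = np.eye(3)
    X = gnn_lyapunov(A, Q)
    print("residual norm:", np.linalg.norm(A.T @ X + X @ A + Q))
```

Replacing `power_sigmoid` with the identity function gives the linear GNN model that the abstract uses as the baseline for the convergence comparison.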
This document is indexed in ScienceDirect and other databases.