Recurrent neural networks for solving second-order cone programs
Authors: Chun-Hsu Ko, Ching-Yu Yang
Affiliations: a. Department of Electrical Engineering, I-Shou University, Kaohsiung County 840, Taiwan
b. Department of Mathematics, National Taiwan Normal University, Taipei 11677, Taiwan
Abstract: This paper proposes using neural networks to efficiently solve second-order cone programs (SOCPs). To establish the neural networks, the SOCP is first reformulated as a second-order cone complementarity problem (SOCCP) via the Karush-Kuhn-Tucker conditions of the SOCP. SOCCP functions, which transform the SOCCP into a set of nonlinear equations, are then used to design the neural networks. We propose two kinds of neural networks based on different SOCCP functions. The first neural network uses the Fischer-Burmeister function to recast the problem as an unconstrained minimization of a merit function; we show that this merit function is a Lyapunov function and that the network is asymptotically stable. The second neural network uses the natural residual function together with the cone projection function to achieve low computational complexity; it is shown to be Lyapunov stable and to converge globally to an optimal solution under some conditions. Simulation results on SOCPs demonstrate the effectiveness of the proposed neural networks.
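For readers who want a concrete feel for the second (projection-based) network, the sketch below is a minimal Python illustration, not the paper's implementation. It assumes a standard projection-type primal-dual flow whose equilibria are the KKT points of the standard-form SOCP min c^T x subject to Ax = b, x in K; the function names proj_soc and socp_projection_network, the gains rho and dt, and the toy problem data are hypothetical choices made for this sketch, and the paper's exact network equations and convergence conditions may differ.

import numpy as np

def proj_soc(z):
    # Euclidean projection onto the second-order cone K = {(t, u) : ||u||_2 <= t}.
    t, u = z[0], z[1:]
    nu = np.linalg.norm(u)
    if nu <= t:                       # already inside the cone
        return z.copy()
    if nu <= -t:                      # inside the polar cone: projection is the origin
        return np.zeros_like(z)
    alpha = 0.5 * (t + nu)            # otherwise project onto the cone boundary
    p = np.empty_like(z)
    p[0] = alpha
    p[1:] = alpha * u / nu
    return p

def socp_projection_network(A, b, c, x0, y0, rho=1.0, dt=1e-2, steps=200_000):
    # Forward-Euler integration of an assumed projection-type primal-dual flow
    #   dx/dt = rho * ( P_K(x - (c - A^T y)) - x )   # natural-residual direction
    #   dy/dt = -rho * ( A x - b )                   # primal feasibility residual
    # whose equilibria are exactly the KKT points of  min c^T x  s.t.  Ax = b, x in K.
    x = x0.astype(float)
    y = y0.astype(float)
    for _ in range(steps):
        x_dot = proj_soc(x - (c - A.T @ y)) - x
        y_dot = -(A @ x - b)
        x = x + dt * rho * x_dot
        y = y + dt * rho * y_dot
    return x, y

if __name__ == "__main__":
    # Toy instance (illustrative data): minimize x1 + 0.5*x2 + 0.3*x3
    # subject to x1 + x2 = 2 and (x1, x2, x3) in the second-order cone K^3.
    A = np.array([[1.0, 1.0, 0.0]])
    b = np.array([2.0])
    c = np.array([1.0, 0.5, 0.3])
    x, y = socp_projection_network(A, b, c,
                                   x0=np.array([1.0, 0.0, 0.0]),
                                   y0=np.zeros(1))
    # The unique optimum of this toy problem is approximately (1.36, 0.64, -1.20).
    print("x        =", x)
    print("Ax - b   =", A @ x - b)
    print("c^T x    =", c @ x)

The cone projection used here has a simple closed form (three cases), which is what keeps the per-step cost of this kind of network low, in line with the abstract's remark on computational complexity.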
Keywords: SOCP; Neural network; Merit function; Fischer-Burmeister function; Cone projection function; Lyapunov stable
This article has been indexed by ScienceDirect and other databases.