GENERALISED TRANSFER FUNCTIONS OF NEURAL NETWORKS
Authors: C.F. Fung, S.A. Billings, H. Zhang
Affiliation: a) Centre for Signal Processing, S2-B4-O8, School of Electrical and Electronic Engineering, Nanyang Technological University, Nanyang Avenue, Singapore 639798; b) Department of Automatic Control and Systems Engineering, University of Sheffield, Mappin Street, Sheffield, S1 3JD, U.K.; c) Predictive Control Limited, Richmond House, Gadbrook Business Centre, Northwich, Cheshire CW9 7TN, U.K.
Abstract: When artificial neural networks are used to model non-linear dynamical systems, the system structure, which can be extremely useful for analysis and design, is buried within the network architecture. In this paper, explicit expressions for the frequency response functions, or generalised transfer functions, of both feedforward and recurrent neural networks are derived in terms of the network weights. The derivation of the algorithm is based on a Taylor series expansion of the activation functions used in a particular neural network. This yields a representation equivalent to a non-linear recursive polynomial model and allows the transfer functions to be derived using the harmonic expansion method. By mapping the neural network into the frequency domain, information about the structure of the underlying non-linear system can be recovered. Numerical examples are included to demonstrate the application of the new algorithm. These examples show that the frequency response functions appear to be highly sensitive to the network topology and training, and that the time domain properties fail to reveal deficiencies in the trained network structure.
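The procedure described in the abstract can be illustrated with a small worked example. The following is a minimal sketch, not the authors' implementation: a hypothetical single-neuron recurrent model y(k) = w_out * tanh(w_y*y(k-1) + w_u*u(k-1) + b) is Taylor-expanded about its bias point, and harmonic probing of the first-order terms gives the linear generalised transfer function H1 directly in terms of the (hypothetical) network weights.

```python
import numpy as np

# Minimal sketch (not the authors' code): first-order generalised transfer
# function of a hypothetical single-neuron recurrent model
#     y(k) = w_out * tanh(w_y * y(k-1) + w_u * u(k-1) + b).
# Taylor-expanding tanh about the bias b turns the network into a polynomial
# recursive model; probing with u(k) = exp(j*w*k) and keeping first-order
# terms yields the linear transfer function H1 in terms of the weights.

w_out, w_y, w_u, b = 0.8, 0.3, 0.5, 0.1   # hypothetical trained weights

# First Taylor coefficient of tanh about b: d/dx tanh(x) at x = b is sech^2(b)
c1 = 1.0 / np.cosh(b) ** 2

def H1(omega):
    """First-order generalised transfer function H1(e^{j*omega})."""
    z_inv = np.exp(-1j * omega)            # unit-delay operator e^{-j*omega}
    return (w_out * c1 * w_u * z_inv) / (1.0 - w_out * c1 * w_y * z_inv)

# Evaluate the magnitude response over normalised frequency [0, pi]
omega = np.linspace(0.0, np.pi, 256)
gain_db = 20.0 * np.log10(np.abs(H1(omega)))
print(gain_db[:5])
```

Higher-order transfer functions follow in the same way from the higher Taylor coefficients of the activation function, and it is these higher-order responses that expose the sensitivity to network topology and training discussed in the abstract.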
This document has been indexed by ScienceDirect and other databases.