Local recurrent sigmoidal-wavelet neurons in feed-forward neural network for forecasting of dynamic systems: Theory
Authors: Ahmad Banakar (a), Mohammad Fazle Azeem (b)
Affiliation: (a) Department of Agricultural Machinery, Faculty of Agriculture, University of Tarbiat Modares, Tehran, Iran
(b) Department of Electrical Engineering, Aligarh Muslim University, Aligarh, India
Abstract: In this paper, different structures of neurons in the hidden layer of a feed-forward network for forecasting dynamic systems are proposed. Each neuron in the network is a combination of a sigmoidal activation function (SAF) and a wavelet activation function (WAF). The output of each hidden neuron is the product of the outputs of these two activation functions. A delay element is used to feed the outputs of the sigmoidal and wavelet activation functions back to each other. This arrangement leads to five possible configurations of recurrent neurons. Besides proposing these neuron models, the paper also compares the performance of the wavelet function with that of the sigmoid function. To guarantee the stability and convergence of the learning process, an upper bound on the learning rates is derived using the Lyapunov stability theorem; a two-phase adaptive learning rate enforces this upper bound. The universal approximation property of the feed-forward network with the proposed neurons is also investigated. Finally, the applicability of the proposed recurrent networks is evaluated, and the networks are compared, on two benchmark problems covering different types of dynamical systems.
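To make the neuron structure concrete, the following is a minimal sketch (not the authors' code) of one recurrent sigmoidal-wavelet hidden neuron, assuming a Mexican-hat wavelet and mutual one-step-delay feedback between the two activation paths; the weight names, feedback gains, and wavelet choice are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of a recurrent sigmoidal-wavelet hidden neuron.
# Assumptions (not from the paper): Mexican-hat wavelet, fixed feedback gains,
# one-step delay elements coupling the two activation paths.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def mexican_hat(x):
    # One common wavelet activation; the paper may use a different mother wavelet.
    return (1.0 - x**2) * np.exp(-0.5 * x**2)

class RecurrentSigmoidWaveletNeuron:
    def __init__(self, n_inputs, rng=None):
        rng = np.random.default_rng(rng)
        self.w_s = rng.normal(scale=0.1, size=n_inputs)  # input weights, sigmoid path
        self.w_w = rng.normal(scale=0.1, size=n_inputs)  # input weights, wavelet path
        self.r_s = 0.1     # feedback gain: delayed wavelet output -> sigmoid input
        self.r_w = 0.1     # feedback gain: delayed sigmoid output -> wavelet input
        self.s_prev = 0.0  # delayed sigmoid output
        self.g_prev = 0.0  # delayed wavelet output

    def step(self, x):
        # Cross-coupled recurrence through the one-step delay elements.
        s = sigmoid(self.w_s @ x + self.r_s * self.g_prev)
        g = mexican_hat(self.w_w @ x + self.r_w * self.s_prev)
        self.s_prev, self.g_prev = s, g
        # The hidden-neuron output is the product of the two activations.
        return s * g

# Usage: run a short input sequence through one neuron.
neuron = RecurrentSigmoidWaveletNeuron(n_inputs=3, rng=0)
for t, x in enumerate(np.random.default_rng(1).normal(size=(5, 3))):
    print(t, neuron.step(x))
```

The other configurations described in the paper would vary which delayed signal is fed back to which activation path; the product combination of the two activations stays the same.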
Keywords: Neural network, Wavelet network, Recurrent, Lyapunov stability, Universal approximation
This article is indexed in ScienceDirect and other databases.