

The essential order of approximation for neural networks
Authors: Xu Zongben (Email: zbxu@mail.xjtu.edu.cn), Cao Feilong
Affiliation: Institute for Information and System Sciences, Xi'an Jiaotong University, Xi'an 710049, China
Abstract: There have been various studies on the approximation ability of feedforward neural networks (FNNs). Most existing studies, however, are concerned only with density results or upper-bound estimates of how well a multivariate function can be approximated by an FNN; consequently, they cannot reveal the essential approximation ability of an FNN. In this paper, by establishing both upper- and lower-bound estimates of the approximation order, the essential approximation ability (namely, the essential approximation order) of a class of FNNs is characterized in terms of the modulus of smoothness of the functions to be approximated. The FNNs involved can not only approximate any continuous or integrable function defined on a compact set arbitrarily well, but also provide an explicit lower bound on the number of hidden units required. By making use of multivariate approximation tools, it is shown that when the functions to be approximated are Lipschitzian of order up to 2, the approximation speed of the FNNs is uniquely determined by the modulus of smoothness of the functions.
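For orientation, the following is a minimal sketch of the central quantity, assuming the standard definition of the second-order modulus of smoothness; the paper's specific rate function and constants are not reproduced here, so the rate \rho(n) below is a placeholder.

% Second-order modulus of smoothness of f on a compact set K
% (standard approximation-theory definition, in the L^p norm):
\Delta_h^2 f(x) \;=\; f(x+h) \;-\; 2f(x) \;+\; f(x-h),
\qquad
\omega_2(f,t)_p \;=\; \sup_{0 < \|h\| \le t} \bigl\| \Delta_h^2 f \bigr\|_{L^p(K)}.

% An "essential order" result pairs upper and lower bounds of matching form:
% for the class \Phi_n of FNNs with n hidden units, there exist constants
% C_1, C_2 > 0 and a rate \rho(n), decreasing in n, such that
C_1 \, \omega_2\!\bigl(f, \rho(n)\bigr)_p
\;\le\; \operatorname{dist}\bigl(f, \Phi_n\bigr)_p
\;\le\; C_2 \, \omega_2\!\bigl(f, \rho(n)\bigr)_p.

Because the two bounds match up to constants, the approximation order of the network class is pinned down exactly by \omega_2, rather than merely bounded from above; this is what distinguishes an essential-order result from the density and upper-bound results cited in the abstract.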
Keywords: feedforward neural networks, approximation order, modulus of smoothness of a multivariate function
This article is indexed by CNKI, Wanfang Data, SpringerLink, and other databases.