On the Learnability of Recursive Data
Authors:Barbara Hammer
Affiliation:(1) Department of Mathematics/Computer Science, University of Osnabrück, Albrechtstrasse 28, D-49069 Osnabrück, Germany. hammer@brahms.informatik.uni-osnabrueck.de
Abstract:We establish some general results concerning PAC learning: we characterize the property that any consistent algorithm is PAC, and we show that the shrinking-width property is equivalent to PUAC learnability. By counterexample, PAC and PUAC learning are shown to be different concepts. We find conditions ensuring that any nearly consistent algorithm is PAC or PUAC, respectively. The VC dimension of recurrent neural networks and folding networks is infinite; for restricted inputs, however, bounds exist, and these bounds are transferred to folding networks. We find conditions on the probability of the input space that ensure polynomial learnability: the probability of sequences or trees has to converge to zero sufficiently fast with increasing length or height. Finally, we give an example of a concept class that requires exponentially growing sample sizes for accurate generalization. Date received: September 5, 1997. Date revised: May 29, 1998.
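As background for the sample-size results mentioned in the abstract (this sketch is not from the paper itself), the classical PAC sample-complexity bound for a *finite* hypothesis class shows how the number of examples a consistent learner needs scales with the accuracy parameter ε and the confidence parameter δ:

```python
import math

def pac_sample_bound(h_size: int, eps: float, delta: float) -> int:
    """Classical PAC bound for a finite hypothesis class of size h_size:
    any consistent learner trained on this many i.i.d. examples outputs,
    with probability at least 1 - delta, a hypothesis with error at most
    eps.  m >= (ln|H| + ln(1/delta)) / eps."""
    return math.ceil((math.log(h_size) + math.log(1.0 / delta)) / eps)

# Sample size grows only logarithmically in |H| and 1/delta,
# but linearly in 1/eps.
m = pac_sample_bound(2**10, 0.05, 0.01)
print(m)  # 231
```

Infinite classes are handled via the VC dimension instead of ln|H|; the abstract's point is that for recurrent and folding networks the VC dimension is infinite, so bounds of this form only exist once the inputs are suitably restricted.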
Keywords:Recurrent neural networks   Folding networks   Computational learning theory   PAC learning   VC dimension
This document is indexed in SpringerLink and other databases.