Lower Bound Methods and Separation Results for On-Line Learning Models
Authors: Wolfgang Maass, György Turán
Affiliations: (1) Institute for Theoretical Computer Science, Technische Universität Graz, Klosterwiesgasse 32, A-8010 Graz, Austria; (2) University of Illinois at Chicago, USA; (3) Department of Mathematics, Statistics and Computer Science, University of Illinois at Chicago, IL; (4) Automata Theory Research Group of the Hungarian Academy of Sciences, Szeged, Hungary
Abstract: We consider the complexity of concept learning in various common models for on-line learning, focusing on methods for proving lower bounds on the learning complexity of a concept class. Among others, we consider the model for learning with equivalence and membership queries. For this model we give lower bounds on the number of queries that are needed to learn a concept class $\mathcal{C}$ in terms of the Vapnik-Chervonenkis dimension of $\mathcal{C}$, and in terms of the complexity of learning $\mathcal{C}$ with arbitrary equivalence queries. Furthermore, we survey other known lower bound methods and exhibit all known relationships between the learning complexities in the models considered and some relevant combinatorial parameters. As it turns out, the picture is almost complete. This paper has been written so that it can be read without previous knowledge of Computational Learning Theory.
Keywords: Formal models for learning; learning algorithms; lower bound arguments; VC-dimension; machine learning
This paper is indexed in SpringerLink and other databases.