The Power of Self-Directed Learning

Authors: Sally A. Goldman (1), Robert H. Sloan (2)

Affiliations: (1) Department of Computer Science, Washington University, St. Louis, MO 63130; (2) Department of Electrical Engineering and Computer Science, University of Illinois at Chicago, Chicago, IL 60680

Abstract: This article studies self-directed learning, a variant of the on-line (or incremental) learning model in which the learner selects the presentation order for the instances. Alternatively, one can view this model as a variation of learning with membership queries in which the learner is charged only for membership queries whose outcome it could not predict. We give tight bounds on the complexity of self-directed learning for the concept classes of monomials, monotone DNF formulas, and axis-parallel rectangles in {0, 1, …, n − 1}^d. These results demonstrate that the number of mistakes under self-directed learning can be surprisingly small. We then show that the learning complexity in the self-directed model is less than that of all other commonly studied on-line and query learning models. Next we explore the relationship between the complexity of self-directed learning and the Vapnik-Chervonenkis (VC) dimension. We show that, in general, the VC dimension and the self-directed learning complexity are incomparable. However, for some special cases, we show that the VC dimension gives a lower bound on the self-directed learning complexity. Finally, we explore a relationship between Mitchell's version-space algorithm and the existence of self-directed learning algorithms that make few mistakes.
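The abstract's claim that self-directed mistake counts can be surprisingly small can be illustrated concretely for the special case of monotone monomials: if the learner presents instances in nondecreasing Hamming weight and predicts "negative" until its first mistake, the first positive instance it meets has its 1-bits exactly equal to the target's variable set, so one mistake suffices. The sketch below is an illustrative simulation under that assumption, not the paper's own algorithm; the function name and harness are hypothetical.

```python
from itertools import product

def self_directed_monotone_monomial(n, target):
    """Simulate self-directed learning of a monotone monomial over {0,1}^n.

    target: set of variable indices; an instance x is positive iff x[i] == 1
    for every i in target. The learner chooses the presentation order itself
    (nondecreasing Hamming weight) and is charged only for wrong predictions.
    Returns the total number of mistakes.
    """
    instances = sorted(product([0, 1], repeat=n), key=sum)
    mistakes = 0
    learned = None  # becomes the target's index set after the single mistake
    for x in instances:
        ones = {i for i, bit in enumerate(x) if bit == 1}
        truth = target <= ones
        if learned is None:
            prediction = False            # predict negative until first mistake
        else:
            prediction = learned <= ones  # exact predictions afterwards
        if prediction != truth:
            mistakes += 1
            # The first positive instance in weight order has minimal weight,
            # so its 1-bits are exactly the target's variables.
            learned = ones
    return mistakes
```

Any minimal-weight positive instance must have exactly the target's variables set to 1, so the single mistake identifies the target and every later prediction is exact; the simulation returns 1 for every choice of target, including the empty monomial.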
Keywords: computational learning theory; on-line learning; incremental learning; self-directed learning