Strong Minimax Lower Bounds for Learning
Authors: András Antos, Gábor Lugosi
Affiliation: (1) Department of Mathematics and Computer Science, Faculty of Electrical Engineering, Technical University of Budapest, Stoczek u. 2, 1521 Budapest, Hungary; (2) Department of Economics, Pompeu Fabra University, Ramon Trias Fargas 25-27, 08005 Barcelona, Spain.
Abstract: Minimax lower bounds for concept learning state, for example, that for each sample size n and learning rule g_n, there exists a distribution of the observation X and a concept C to be learnt such that the expected error of g_n is at least a constant times V/n, where V is the VC dimension of the concept class. However, these bounds say nothing about the rate of decrease of the error for a fixed distribution-concept pair. In this paper we investigate minimax lower bounds in such a (stronger) sense. We show that for several natural k-parameter concept classes, including the class of linear halfspaces, the class of balls, the class of polyhedra with a certain number of faces, and a class of neural networks, for any sequence of learning rules {g_n} there exists a fixed distribution of X and a fixed concept C such that the expected error is larger than a constant times k/n for infinitely many n. We also obtain such strong minimax lower bounds for the tail distribution of the probability of error, which extend the corresponding minimax lower bounds.
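The two kinds of bounds contrasted in the abstract can be rendered schematically in LaTeX as follows. This is only a sketch of the statements as the abstract describes them, using assumed notation: L(g_n; X, C) stands for the expected error of the rule g_n under the distribution-concept pair (X, C), and c, c' are unspecified positive constants; the precise quantifiers and constants are those of the paper.

% Requires amsmath and amssymb.
% Classical minimax lower bound: the hard pair (X, C) may depend on the
% sample size n.
\[
  \inf_{g_n} \, \sup_{(X,C)} \mathbb{E}\, L(g_n; X, C) \;\ge\; c \, \frac{V}{n},
  \qquad V = \text{VC dimension of the concept class}.
\]
% Strong minimax lower bound (this paper): one fixed pair (X, C) works
% against the entire sequence of rules, for infinitely many n.
\[
  \forall \{g_n\} \;\; \exists\, (X, C) \text{ fixed}: \quad
  \mathbb{E}\, L(g_n; X, C) \;\ge\; c' \, \frac{k}{n}
  \quad \text{for infinitely many } n.
\]

The difference is the order of quantifiers: in the classical bound the adversarial pair may change with n, whereas the strong bound fixes a single pair that defeats the whole sequence of rules infinitely often.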
Keywords: Vapnik-Chervonenkis dimension; PAC learning; lower bounds