Algorithmic complexity bounds on future prediction errors
Affiliation:1. IDSIA, Galleria 2, CH-6928 Manno-Lugano, Switzerland;2. TU Munich, Boltzmannstr. 3, 85748 Garching, München, Germany;3. RSISE/ANU/NICTA, Canberra, ACT 0200, Australia;4. LIF, CMI, 39 rue Joliot Curie, 13453 Marseille cedex 13, France
Abstract: We bound the future loss when predicting any (computably) stochastic sequence online. Solomonoff finitely bounded the total deviation of his universal predictor M from the true distribution μ by the algorithmic complexity of μ. Here we assume that we are at a time t > 1 and have already observed x = x_1 ⋯ x_t. We bound the future prediction performance on x_{t+1} x_{t+2} ⋯ by a new variant of algorithmic complexity of μ given x, plus the complexity of the randomness deficiency of x. The new complexity is monotone in its condition in the sense that this complexity can only decrease if the condition is prolonged. We also briefly discuss potential generalizations to Bayesian model classes and to classification problems.
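For orientation, Solomonoff's classical total-deviation result referenced in the abstract can be written, for binary alphabets, in the following standard form (a sketch for context only, not a quotation of this paper's theorem; the paper's exact constants and choice of distance measure may differ):

% Solomonoff's total-deviation bound (standard binary form):
% the cumulative expected squared deviation of the universal predictor M
% from the true computable measure \mu is finite and bounded via K(\mu).
\sum_{t=1}^{\infty} \mathbf{E}_{x_{<t}\sim\mu}\Bigl( M(1 \mid x_{<t}) - \mu(1 \mid x_{<t}) \Bigr)^{2} \;\le\; \frac{\ln 2}{2}\, K(\mu)

The bound studied in this paper instead controls only the deviation on x_{t+1} x_{t+2} ⋯ after x has been observed, with K(μ) replaced by a conditional complexity of μ given x that is monotone in its condition, plus the complexity of the randomness deficiency of x.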
Keywords:
Indexed by: ScienceDirect and other databases.