Predictive complexity and information
Authors: Michael V. Vyugin
Affiliations: (a) Department of Computer Science, Royal Holloway, University of London, Egham, Surrey TW20 0EX, UK; (b) Institute for Information Transmission Problems, Russian Academy of Sciences, Bol'shoi Karetnyi per. 19, Moscow GSP-4, 101447, Russia
Abstract: The notions of predictive complexity and of the corresponding amount of information are considered. Predictive complexity is a generalization of Kolmogorov complexity which bounds the ability of any algorithm to predict the elements of a sequence of outcomes. We consider predictive complexity for a wide class of bounded loss functions that generalize the square-loss function. Relations between the unconditional predictive complexity KG(x) and the conditional predictive complexity KG(x|y) are studied. We define an algorithm with an "expanding property": with positive probability it transforms sequences of a given predictive complexity into sequences of substantially larger predictive complexity. The notion of the amount of predictive information IG(y:x) is studied. We show that this information is noncommutative in a very strong sense, and we present asymptotic relations between the values IG(y:x), IG(x:y), KG(x), and KG(y).
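For orientation, the following sketch recalls the standard definitions behind the quantities named in the abstract, written in the usual notation of the predictive-complexity literature; the notation (λ, Loss_S, I_G) and the exact formulations used in the paper itself may differ.

A prediction strategy S reads a prefix x_1 ... x_{i-1} of a binary sequence and outputs a prediction γ_i; under a loss function λ (for the square-loss game, λ(ω, γ) = (ω − γ)^2) its cumulative loss on x = x_1 ... x_n is

    \mathrm{Loss}_S(x) = \sum_{i=1}^{n} \lambda(x_i, \gamma_i), \qquad \lambda(\omega, \gamma) = (\omega - \gamma)^2 .

Predictive complexity K_G(x) is, up to an additive constant, a minimal superloss process: for every computable strategy S there is a constant c_S with K_G(x) \le \mathrm{Loss}_S(x) + c_S, so it lower-bounds the loss achievable by any algorithmic predictor, much as Kolmogorov complexity lower-bounds the code length achievable by any computable description method. The amount of predictive information is then defined by analogy with algorithmic information,

    I_G(y : x) = K_G(x) - K_G(x \mid y),

and the noncommutativity result concerns how far I_G(y : x) and I_G(x : y) can diverge asymptotically.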
Keywords: Machine learning; Algorithmic prediction; Predictive complexity; Predictive information; Loss functions; Kolmogorov complexity; Expanding property