Search results: 7 articles (Automation Technology 6, Metallurgical Industry 1; published 1975–2003)
1.
The problem of the existence of predictive complexity for the absolute-loss game is studied. Predictive complexity is a generalization of Kolmogorov complexity which bounds the ability of any algorithm to predict the elements of a sequence of outcomes. For perfectly mixable loss functions (the logarithmic and squared-difference losses are among them), predictive complexity is defined, like Kolmogorov complexity, to within an additive constant. The absolute loss function is not perfectly mixable, and the question of the existence of the corresponding predictive complexity, defined to within an additive constant, is open. We prove that in the case of the absolute-loss game predictive complexity can be defined to within an additive term O(·) depending on n, where n is the length of the sequence of outcomes. We also prove that in some restricted settings this bound cannot be improved.
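For reference, a minimal sketch of the standard on-line prediction setting assumed by this abstract (the notation below is ours and is for exposition only; the paper's exact definitions may differ):

    % Outcomes x_i in {0,1}, predictions gamma_i in [0,1],
    % absolute loss lambda(x, gamma) = |x - gamma|.
    \[
      \mathrm{Loss}_A(x_1 \dots x_n) \;=\; \sum_{i=1}^{n} \lambda\bigl(x_i,\, A(x_1 \dots x_{i-1})\bigr),
    \]
    % and, in particular, a predictive complexity K must satisfy
    \[
      K(x) \;\le\; \mathrm{Loss}_A(x) + c_A \quad \text{for every computable strategy } A.
    \]

For perfectly mixable losses the term c_A can be taken independent of the length of x; the result above asks how fast it must be allowed to grow with n for the absolute loss.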
2.
3.
Conditions are presented under which the maximum of the Kolmogorov complexity (algorithmic entropy) K(x_1 ... x_N) is attained, given the cost Σ_i f(x_i) of a message x_1 ... x_N. Various extremal relations between the message cost and the Kolmogorov complexity are also considered; in particular, the minimization problem for the function Σ_i f(x_i) − T K(x_1 ... x_N) is studied. Here, T is a parameter, called the temperature by analogy with thermodynamics. We also study domains of small variation of this function.
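As an illustration of the thermodynamic analogy (the symbols below are ours, chosen to match the notation above, and are not quoted from the paper):

    \[
      F_T(x_1 \dots x_N) \;=\; \sum_{i=1}^{N} f(x_i) \;-\; T\, K(x_1 \dots x_N),
    \]

where the message cost plays the role of energy, the Kolmogorov complexity that of entropy, and T that of temperature, so minimizing F_T parallels minimizing a free energy E − TS.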
4.
The central problem in machine learning (and statistics) is the problem of predicting the future event x_{n+1} on the basis of past observations x_1, x_2, ..., x_n, where n = 1, 2, .... The main goal is to find a method of prediction that minimizes the total loss suffered on a sequence x_1, x_2, ..., x_{n+1} for n = 1, 2, .... We say that a data sequence is stochastic if there exists a simply described prediction algorithm whose performance is close to the best possible one. This optimal performance is defined in terms of Vovk's predictive complexity, a generalization of the notion of Kolmogorov complexity. Predictive complexity gives a limit on the predictive performance of simply described prediction algorithms. In this paper we argue that data sequences normally occurring in the real world are stochastic; more formally, we prove that Levin's a priori semimeasure of the set of nonstochastic sequences is small.
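One way to make "performance close to the best possible" precise, assuming the standard definitions of cumulative loss and predictive complexity (notation ours):

    \[
      \delta_A(x_1 \dots x_n) \;=\; \mathrm{Loss}_A(x_1 \dots x_n) \;-\; K(x_1 \dots x_n),
    \]

a sequence is then called stochastic when some simply described strategy A keeps this deficiency small on all initial fragments; the theorem stated above bounds Levin's a priori semimeasure of the set of sequences for which no such strategy exists.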
5.
Algorithmic Complexity and Stochastic Properties of Finite Binary Sequences   (total citations: 1; self-citations: 0; citations by others: 1)
V'yugin, V. V. Computer Journal, 1999, 42(4): 294–317
6.
The main laws of probability theory, when applied to individual sequences, have a robustness property under small violations of randomness. For example, the law of large numbers for the symmetric Bernoulli scheme holds for any sequence whose initial fragments of length n have randomness deficiency growing as o(n). The law of the iterated logarithm holds if the randomness deficiency grows as o(log log n). We prove that Birkhoff's individual ergodic theorem is nonrobust in this sense: if the randomness deficiency grows arbitrarily slowly on the initial fragments of an infinite sequence, the theorem can be violated. An analogous nonrobustness property holds for the Shannon–McMillan–Breiman theorem.
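For concreteness, one common form of the randomness deficiency with respect to the symmetric Bernoulli (uniform) measure is sketched below; this is an assumption on our part, and standard variants of the definition differ by logarithmic terms:

    \[
      d(x_1 \dots x_n) \;=\; n \;-\; K(x_1 \dots x_n \mid n),
    \]

so the statement above says that the law of large numbers tolerates deficiency of order o(n), the law of the iterated logarithm only o(log log n), while for Birkhoff's ergodic theorem even an arbitrarily slowly growing unbounded deficiency suffices for a violation.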
7.
The Lempel–Ziv universal coding scheme is asymptotically optimal for the class of all stationary ergodic sources. The robustness of this property under small violations of ergodicity is studied. The notion of algorithmic randomness deficiency is used as a measure of disagreement between a data sequence and a probability measure. We prove that universal compression schemes from a large class are nonrobust in the following sense: if the randomness deficiency grows arbitrarily slowly on the initial fragments of an infinite sequence, then the asymptotic optimality of any universal compression algorithm can be violated. Lempel–Ziv compression algorithms are robust on infinite sequences generated by ergodic Markov chains whose initial fragments of length n have randomness deficiency growing as o(n).
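For illustration only, a minimal LZ78-style incremental parser in Python (a toy sketch of one Lempel–Ziv scheme; the results above concern a broad class of universal codes, not this particular implementation):

    # Toy LZ78 parser: each phrase extends a previously seen phrase by one new symbol.
    import math

    def lz78_parse(seq):
        dictionary = {"": 0}          # phrase -> index
        phrases = []                  # list of (prefix_index, new_symbol)
        current = ""
        for symbol in seq:
            if current + symbol in dictionary:
                current += symbol     # keep extending a known phrase
            else:
                phrases.append((dictionary[current], symbol))
                dictionary[current + symbol] = len(dictionary)
                current = ""
        if current:                   # flush a trailing, already known phrase
            phrases.append((dictionary[current[:-1]], current[-1]))
        return phrases

    if __name__ == "__main__":
        x = "010101100101011001010110"
        phrases = lz78_parse(x)
        c = len(phrases)
        # Rough code length: log2(c) bits per back-reference plus one bit per new binary symbol.
        print(c, "phrases, ~%.1f bits for %d input bits"
              % (c * (math.log2(max(c, 2)) + 1), len(x)))

The number of phrases c(n) controls the code length, roughly c(n)(log_2 c(n) + 1) bits, and the asymptotic optimality discussed above is the statement that, for stationary ergodic sources, this length per input symbol converges to the entropy rate.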