3.
In this paper we discuss various bounds on the Bayesian probability of error that are used for feature selection and are based on distance measures and information measures. We show that they are basically of two types: one type can be related to the f-divergence, the other to information measures. This also clarifies some properties of these measures for the two-class problem and for the multiclass problem. We give some general bounds on the Bayesian probability of error and discuss various aspects of the different approaches.
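The connection between the Bayes error and distance-measure bounds can be sketched for a discrete two-class problem. The function names and the example distributions below are illustrative assumptions, and the Bhattacharyya bound stands in as one well-known instance of the family of bounds the abstract refers to:

```python
import numpy as np

def bayes_error(p1, p2, prior1=0.5):
    """Exact Bayes probability of error for a discrete two-class problem:
    P_e = sum_i min(prior1 * p1_i, (1 - prior1) * p2_i)."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    return float(np.sum(np.minimum(prior1 * p1, (1 - prior1) * p2)))

def bhattacharyya_bound(p1, p2, prior1=0.5):
    """Upper bound on the Bayes error via the Bhattacharyya coefficient:
    P_e <= sqrt(prior1 * (1 - prior1)) * sum_i sqrt(p1_i * p2_i)."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    bc = np.sum(np.sqrt(p1 * p2))  # Bhattacharyya coefficient in [0, 1]
    return float(np.sqrt(prior1 * (1 - prior1)) * bc)

# Illustrative class-conditional distributions (assumed, not from the paper)
p1 = np.array([0.7, 0.2, 0.1])
p2 = np.array([0.1, 0.3, 0.6])

pe = bayes_error(p1, p2)          # exact error
bound = bhattacharyya_bound(p1, p2)  # always >= pe
```

The bound is loose but computable from the distributions alone, which is why such distance measures are attractive for feature selection: features can be ranked without evaluating the classifier itself.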
4.
V. MAJERNÍK, International Journal of General Systems, 2013, 42(3): 201-219
We propose the f-divergences of two probability distributions as measures of the organization of a probabilistic system with respect to its probabilistic uncertainty. A probabilistic system consists of stochastic objects on which statistically dependent random variables are defined. Using Shannon's f-divergence for the organization of a probabilistic system, we express it in terms of the probability distributions of the element random variables and their statistical linkages. We then find the maximum entropy of a probabilistic system when the statistical linkages between its elements are given as input data. We show that an important class of physical statistical systems can be described by probabilistic systems of Gibbsian type.
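For discrete distributions, the f-divergence family the abstract builds on can be sketched with a single generic routine; the particular choices of the convex generator f below (Kullback-Leibler, total variation) are standard illustrative instances, not the specific measures used in the paper:

```python
import numpy as np

def f_divergence(p, q, f):
    """Generic f-divergence D_f(P || Q) = sum_i q_i * f(p_i / q_i),
    for a convex f with f(1) = 0. Assumes q_i > 0 wherever p_i > 0."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = q > 0
    return float(np.sum(q[mask] * f(p[mask] / q[mask])))

# Two strictly positive example distributions (illustrative assumptions)
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.3, 0.5])

# f(t) = t*log(t) recovers the Kullback-Leibler divergence
kl = f_divergence(p, q, lambda t: t * np.log(t))

# f(t) = |t - 1| / 2 recovers the total variation distance
tv = f_divergence(p, q, lambda t: 0.5 * np.abs(t - 1))
```

Because every f-divergence vanishes when P = Q and grows with the discrepancy between the distributions, it can serve as the kind of organization measure described above: the divergence between the joint distribution and the product of its marginals quantifies the statistical linkage between the elements.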