Similar Documents
1.
This paper applies the transferable belief model (TBM) interpretation of the Dempster-Shafer theory of evidence to approximate the distribution of a circuit performance function for parametric yield estimation. Treating the input parameters of the performance function as credal variables defined on a continuous frame of real numbers, the suggested approach constructs random set-type evidence for these parameters. The corresponding random set of the function output is obtained by the extension principle for random sets. Within the TBM framework, the random set of the function output in the credal state can be transformed to a pignistic state, where it is represented by the pignistic cumulative distribution. As an approximation to the actual cumulative distribution, it can be used to estimate yield against circuit response specifications. The advantage of the proposed method over Monte Carlo (MC) methods lies in its ability to obtain a usable approximation of the yield, with a deterministic estimation error, from a single simulation run. For the same error, the new method requires fewer computations than MC methods. Examples of a high-speed railway track circuit and an eight-dimensional quadratic test function demonstrate the efficiency of this technique.
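As a hedged illustration of the idea (not the paper's implementation), the sketch below propagates interval-valued focal elements through a performance function via the extension principle and reads the yield off the pignistic cumulative distribution; the toy function, interval bounds, and masses are all assumptions.

```python
# Sketch: random-set propagation and pignistic yield estimation (illustrative only).
import itertools
import numpy as np

def propagate(focal_sets, masses, f, grid=5):
    """Extension principle for random sets: push each joint focal box through f.
    focal_sets: per-input lists of (lo, hi) intervals; masses: matching lists."""
    out = []
    for combo in itertools.product(*[range(len(fs)) for fs in focal_sets]):
        boxes = [focal_sets[d][i] for d, i in enumerate(combo)]
        m = float(np.prod([masses[d][i] for d, i in enumerate(combo)]))
        # Bound f over the box by coarse grid sampling (crude but simple).
        pts = itertools.product(*[np.linspace(lo, hi, grid) for lo, hi in boxes])
        vals = [f(np.array(p)) for p in pts]
        out.append(((min(vals), max(vals)), m))
    return out

def pignistic_yield(out_focal, spec):
    """BetP(output <= spec): an interval focal element [a, b] contributes its
    mass in proportion to the fraction of the interval lying below spec."""
    y = 0.0
    for (a, b), m in out_focal:
        if b <= spec:
            y += m
        elif a < spec:
            y += m * (spec - a) / (b - a)
    return y

f = lambda x: x[0] ** 2 + x[1] ** 2            # toy performance function
focal = [[(-1.0, 0.0), (0.0, 1.0)]] * 2        # two inputs, two intervals each
mass = [[0.5, 0.5]] * 2
print(pignistic_yield(propagate(focal, mass, f), spec=1.0))
```

A single such propagation replaces the many repeated circuit simulations of an MC loop, which is the efficiency argument made in the abstract.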

2.
We present a decision support system based on the transferable belief model (TBM), a model for representing degrees of belief using belief functions. The system performs reasoning and decision making by integrating a system for belief propagation with a system for Bayesian decision analysis. The two subsystems are developed within the framework of valuation-based systems and are connected through the pignistic transformation as described in the context of the TBM. The system takes the user's beliefs and utilities as inputs and suggests either the optimal decision or the optimal sequence of decisions. An example concerning a nuclear waste disposal problem demonstrates the applicability of the system in a real-world domain.
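A minimal sketch of the decision step such a system must perform, under assumed notation: convert a mass function over a frame into pignistic probabilities, then select the action with the highest pignistic expected utility. The frame, masses, and utilities are invented for illustration.

```python
def pignistic(m, frame):
    """BetP(x) = sum over focal sets A containing x of m(A) / |A|."""
    betp = {x: 0.0 for x in frame}
    for A, mass in m.items():
        for x in A:
            betp[x] += mass / len(A)
    return betp

def best_decision(m, frame, utility):
    betp = pignistic(m, frame)
    eu = {act: sum(betp[x] * u for x, u in outcomes.items())
          for act, outcomes in utility.items()}
    return max(eu, key=eu.get), eu

frame = ("safe", "leak")
m = {frozenset({"safe"}): 0.5, frozenset(frame): 0.5}   # partial ignorance
utility = {"seal": {"safe": -10, "leak": 0}, "ignore": {"safe": 0, "leak": -100}}
print(best_decision(m, frame, utility))   # ('seal', ...): caution wins here
```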

3.
The main contribution of this paper is a new definition of the expected value of belief functions in the Dempster–Shafer (D–S) theory of evidence. Our definition shares many of the properties of the expectation operator in probability theory, and for Bayesian belief functions it yields the same expected value as the probabilistic expectation operator. A traditional method of computing expected values of real-valued functions is to first transform a D–S belief function to a corresponding probability mass function and then use the expectation operator for probability mass functions; transforming a belief function to a probability function involves loss of information. Our expectation operator works directly with D–S belief functions. Another definition uses Choquet integration, which assumes belief functions are credal sets, i.e. convex sets of probability mass functions; credal-set semantics are incompatible with Dempster's combination rule, the centerpiece of the D–S theory. In general, our definition yields expected values different from those obtained via probabilistic expectation using, e.g., the pignistic or plausibility transform of a belief function. Using our definition of expectation, we provide new definitions of variance, covariance, correlation, and other higher moments and describe their properties.
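The paper's operator itself is defined in the text; for orientation, the two traditional routes it contrasts with can be written as follows (standard forms, notation assumed). The pignistic-transform expectation of a real-valued variable \(X\) with mass function \(m\) is

\[
\mathrm{BetP}(x) \;=\; \sum_{A \ni x} \frac{m(A)}{|A|}, \qquad
E_{\mathrm{BetP}}[X] \;=\; \sum_{x} x\,\mathrm{BetP}(x),
\]

while the Choquet (lower) expectation with respect to the belief function \(\mathrm{Bel}\) is

\[
E_{\mathrm{Bel}}[X] \;=\; \int_{0}^{\infty} \mathrm{Bel}(X > t)\,dt \;+\; \int_{-\infty}^{0} \bigl(\mathrm{Bel}(X > t) - 1\bigr)\,dt .
\]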

4.
The problem tackled in this article consists in associating perceived objects detected at a given time with known objects detected previously, given uncertain and imprecise information regarding the association of each perceived object with each known object. This problem arises, for instance, in the association step of an obstacle tracking process, especially in the context of vehicle driving aids. A contribution to the modeling of this association problem in the belief function framework is introduced. By interpreting belief functions as weighted opinions according to the Transferable Belief Model semantics, pieces of information regarding the association of known and perceived objects can be expressed in a common global association space and combined by the conjunctive rule of combination, and a decision can then be made using the pignistic transformation. The approach is validated on real data.
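A hedged sketch of the combine-then-decide pattern described here: two mass functions on a common association frame are merged with the conjunctive rule, and the decision is made on the pignistic probabilities. The frame and numbers are invented.

```python
def conjunctive(m1, m2):
    out = {}
    for A, a in m1.items():
        for B, b in m2.items():
            C = A & B                     # empty intersection collects conflict
            out[C] = out.get(C, 0.0) + a * b
    return out

def pignistic(m, frame):
    k = m.get(frozenset(), 0.0)           # conflict mass, renormalized away
    betp = {x: 0.0 for x in frame}
    for A, mass in m.items():
        if A:
            for x in A:
                betp[x] += mass / (len(A) * (1.0 - k))
    return betp

frame = ("assoc_1", "assoc_2", "none")    # hypothetical association hypotheses
m1 = {frozenset({"assoc_1"}): 0.6, frozenset(frame): 0.4}
m2 = {frozenset({"assoc_1", "assoc_2"}): 0.7, frozenset(frame): 0.3}
betp = pignistic(conjunctive(m1, m2), frame)
print(max(betp.items(), key=lambda kv: kv[1]))   # most plausible association
```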

5.
One of the issues in diagnostic reasoning is inferring the location of a fault when process data carry inconsistent or even conflicting evidence. This problem is treated in a systematic way by making use of the transferable belief model (TBM), an approximate reasoning scheme derived from the Dempster–Shafer theory of evidence. The key novelty of the TBM is the open-world paradigm, which leads to a new way of assigning beliefs to anticipated fault candidates: instead of being ignored, inconsistency in the data is displayed as a portion of belief that cannot be allocated to any of the suspected faults but only to an unknown origin. This item of belief is referred to as the strength of conflict (SC). It is shown in this paper that SC can be interpreted as a degree of confidence in the diagnostic results, which brings a new feature to diagnostic practice. The basics of the TBM are reviewed and the implementation of the underlying ideas in the diagnostic reasoning context is presented. An important contribution is the extension of basic TBM reasoning from single observations to a batch of observations by employing the idea of discounting of evidence. The application of the TBM to fault isolation in a gas–liquid separation process shows that the extended TBM significantly improves the performance of the diagnostic system compared to the ordinary TBM as well as the classical Boolean framework, especially as regards diagnostic stability and reliability.
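The two TBM ingredients named in this abstract can be sketched as follows (assumed notation, toy numbers): under the open-world assumption the mass left on the empty set after conjunctive combination is the strength of conflict (SC), and discounting softens each observation before a batch is combined.

```python
def conjunctive(m1, m2):
    out = {}
    for A, a in m1.items():
        for B, b in m2.items():
            C = A & B
            out[C] = out.get(C, 0.0) + a * b
    return out   # out.get(frozenset(), 0.0) is the strength of conflict

def discount(m, frame, alpha):
    """Keep (1 - alpha) of each mass and move the rest to the whole frame."""
    out = {A: (1 - alpha) * v for A, v in m.items()}
    out[frozenset(frame)] = out.get(frozenset(frame), 0.0) + alpha
    return out

frame = ("fault_A", "fault_B")            # hypothetical fault candidates
m1 = {frozenset({"fault_A"}): 0.9, frozenset(frame): 0.1}
m2 = {frozenset({"fault_B"}): 0.8, frozenset(frame): 0.2}
sc = conjunctive(m1, m2).get(frozenset(), 0.0)
sc_disc = conjunctive(discount(m1, frame, 0.3),
                      discount(m2, frame, 0.3)).get(frozenset(), 0.0)
print(sc, sc_disc)   # 0.72 vs 0.3528: discounting tempers the conflict
```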

6.
The compact representation of the incomplete probabilistic knowledge that can be encountered in risk evaluation problems, for instance in environmental studies, is considered. Various kinds of knowledge are considered, such as expert opinions about characteristics of distributions or poor statistical information. The approach is based on probability families encoded by possibility distributions and belief functions. In each case, a technique for faithfully representing the available imprecise probabilistic information is proposed, using different uncertainty frameworks such as possibility theory, probability theory, and belief functions. Moreover, the use of probability-possibility transformations enables confidence intervals to be encompassed by cuts of possibility distributions, thus making the representation stronger. The respective appropriateness of pairs of cumulative distributions, continuous possibility distributions, and discrete random sets for representing information about the mean value, the mode, the median, and other fractiles of ill-known probability distributions is discussed in detail.
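As background on the encoding used here (a standard result, stated in assumed notation): a possibility distribution \(\pi\) represents the family of probability measures dominated by its possibility measure,

\[
\mathcal{P}(\pi) \;=\; \{\,P : P(A) \le \Pi(A) \text{ for all measurable } A\,\}, \qquad
\Pi(A) \;=\; \sup_{x \in A} \pi(x),
\]

and for every \(\alpha \in (0,1]\) the cut \(\pi_\alpha = \{x : \pi(x) \ge \alpha\}\) satisfies \(P(\pi_\alpha) \ge 1 - \alpha\) for all \(P \in \mathcal{P}(\pi)\), so the cuts behave as nested confidence intervals, which is what makes the probability-possibility transformations mentioned above work.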

7.
For recognition problems in which the class labels of the training patterns are imprecise, an adaptive fuzzy k-NN (k-Nearest Neighbor) classifier based on the transferable belief model is proposed. The transferable belief model is combined with fuzzy set theory and possibility theory, and the pignistic transformation is applied to decide the class to which a pattern to be recognized truly belongs. Gradient descent is used to minimize an error function, enabling adaptive learning of the parameters. Experimental results show that the classifier has a low misclassification rate and strong robustness.

8.
This paper presents a method for assessing the reliability of a sensor in a classification problem, based on the transferable belief model. First, we develop a method for evaluating the reliability of a sensor considered alone, based on finding the discounting factor that minimizes the distance between the pignistic probabilities computed from the discounted beliefs and the actual values of the data. Next, we develop a method for assessing the reliability of several sensors that work jointly and whose readings are aggregated; here the discounting factors are computed by minimizing the distance between the pignistic probabilities computed from the combined discounted belief functions and the actual values of the data.
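A minimal sketch of the single-sensor case as described, with invented data: grid-search the discounting factor minimizing the squared distance between the pignistic probabilities of the discounted beliefs and the ground-truth labels.

```python
def pignistic(m, frame):
    betp = {x: 0.0 for x in frame}
    for A, mass in m.items():
        for x in A:
            betp[x] += mass / len(A)
    return betp

def discount(m, frame, alpha):
    out = {A: (1 - alpha) * v for A, v in m.items()}
    out[frozenset(frame)] = out.get(frozenset(frame), 0.0) + alpha
    return out

def best_alpha(samples, frame):
    """samples: list of (mass_function, true_class) pairs from labelled data."""
    def cost(alpha):
        err = 0.0
        for m, truth in samples:
            betp = pignistic(discount(m, frame, alpha), frame)
            err += sum((betp[x] - (1.0 if x == truth else 0.0)) ** 2
                       for x in frame)
        return err
    return min((i / 100 for i in range(101)), key=cost)

frame = ("c1", "c2")
samples = [({frozenset({"c1"}): 0.9, frozenset(frame): 0.1}, "c2")]  # sensor wrong
print(best_alpha(samples, frame))  # 1.0: a consistently wrong sensor is discounted away
```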

9.
This paper explains how multisensor data fusion and target identification can be performed within the transferable belief model (TBM), a model for the representation of quantified uncertainty based on belief functions. We present the underlying theory, in particular the general Bayesian theorem needed to transform likelihoods into beliefs and the pignistic transformation needed to build the probability measure required for decision making. We show how this method applies in practice and compare its solution with the classical one, illustrating it with an embarrassing example on which the TBM and probability solutions completely disagree. The computational efficiency of the belief-function solution was supposedly proved in a study that we reproduce, and we show that in fact the opposite conclusion holds. The results presented here extend directly to many problems of data fusion and diagnosis.
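For reference, the general Bayesian theorem mentioned here (in Smets' formulation, assuming a vacuous prior on the frame \(\Theta\)) builds plausibilities over sets of classes directly from the conditional plausibilities of the observation:

\[
pl(A \mid x) \;=\; 1 \;-\; \prod_{\theta \in A} \bigl(1 - pl(x \mid \theta)\bigr), \qquad A \subseteq \Theta .
\]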

10.
Constructing and analyzing large biological pathway models is a significant challenge. We propose a general approach that exploits the structure of a pathway to identify pathway components, constructs the component models, and finally assembles the component models into a global pathway model. Specifically, we apply this approach to pathway parameter estimation, a main step in pathway model construction. A large biological pathway often involves many unknown parameters, and the resulting high-dimensional search space poses a major computational difficulty. By exploiting the structure of a pathway and the distribution of the available experimental data over the pathway, we decompose the pathway into components and perform parameter estimation for each component. However, some parameters may belong to multiple components, and independent parameter estimates from different components may conflict for such parameters. To reconcile these conflicts, we represent each component as a factor graph, a standard probabilistic graphical model. We then combine the resulting factor graphs and use a probabilistic inference technique called belief propagation to obtain the most likely parameter values that are globally consistent. We validate our approach on a synthetic pathway model based on the Akt-MAPK signaling pathways. The results indicate that the approach can potentially scale up to large pathway models.

11.
In this paper we present a new credal classification rule (CCR) based on belief functions to deal with uncertain data. CCR allows objects to belong (with different masses of belief) not only to specific classes, but also to sets of classes called meta-classes, which correspond to the disjunction of several specific classes. Each specific class is characterized by a class center (i.e. prototype) and consists of all the objects that are sufficiently close to that center. The belief that a given object to classify belongs to a specific class is determined from the Mahalanobis distance between the object and the center of the corresponding class. The meta-classes are used to capture imprecision in the classification of objects that are difficult to classify correctly because of the poor quality of the available attributes. The selection of meta-classes depends on the application and the context, and a measure of the degree of indistinguishability between classes is introduced. In this new CCR approach, the objects assigned to a meta-class should be close to the center of this meta-class, having similar distances to the centers of all the specific classes involved, and objects too far from the others are considered outliers (noise). CCR provides robust credal classification results with a relatively low computational burden. Several experiments using both artificial and real data sets are presented at the end of this paper to evaluate and compare the performance of this CCR method with respect to other classification methods.
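A hedged sketch of the distance-to-belief step described above; the exact mass model of CCR is not reproduced here, and the exponential decay, class parameters, and test point are assumptions for illustration.

```python
import numpy as np

def mahalanobis(x, mean, cov):
    d = x - mean
    return float(np.sqrt(d @ np.linalg.inv(cov) @ d))

def class_masses(x, classes, gamma=1.0):
    """classes: dict name -> (mean, cov). Returns normalized specific-class
    masses; a near-tie between classes signals a meta-class (or outlier) case."""
    scores = {c: np.exp(-gamma * mahalanobis(x, mu, cov) ** 2)
              for c, (mu, cov) in classes.items()}
    z = sum(scores.values())
    return {c: s / z for c, s in scores.items()}

classes = {"w1": (np.array([0.0, 0.0]), np.eye(2)),
           "w2": (np.array([3.0, 0.0]), np.eye(2))}
print(class_masses(np.array([1.5, 0.0]), classes))  # ~0.5 / 0.5: meta-class candidate
```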

12.
Sources of evidence may have different reliability and importance in real decision-making applications. The estimation of the discounting (weighting) factors when prior knowledge is unavailable has been studied regularly until recently. In the past, the determination of the weighting factors focused only on the reliability discounting rule and depended mainly on a dissimilarity measure between basic belief assignments (bba's) represented by an evidential distance. However, it is very difficult to characterize the dissimilarity efficiently through an evidential distance alone. Thus, both a distance and a conflict coefficient based on the probabilistic transformation BetP are proposed to characterize the dissimilarity. The distance represents the difference between bba's, whereas the conflict coefficient reveals the degree of divergence between the hypotheses that two belief functions strongly support. These two aspects of dissimilarity are complementary in a certain sense, and their fusion is used as the dissimilarity measure. A new estimation method for the weighting factors is then presented using the proposed dissimilarity measure. In evaluating the weight of a source, both its dissimilarity with the other sources and their weighting factors are considered. The weighting factors can be applied in both the importance and reliability discounting rules, but the choice of discounting rule should depend on the actual application. Simple numerical examples are given to illustrate the interest of the proposed approach.
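The exact formulas are the paper's; purely as an illustration of the two complementary ingredients, the sketch below combines a total-variation distance between pignistic probabilities with a simple conflict coefficient that fires when the two bba's most strongly support different hypotheses. All definitions and numbers here are assumptions.

```python
def pignistic(m, frame):
    betp = {x: 0.0 for x in frame}
    for A, mass in m.items():
        for x in A:
            betp[x] += mass / len(A)
    return betp

def dissimilarity(m1, m2, frame):
    p1, p2 = pignistic(m1, frame), pignistic(m2, frame)
    dist = 0.5 * sum(abs(p1[x] - p2[x]) for x in frame)   # total variation on BetP
    top1, top2 = max(p1, key=p1.get), max(p2, key=p2.get)
    conflict = 0.0 if top1 == top2 else min(p1[top1], p2[top2])
    return 0.5 * (dist + conflict)                        # fused measure

frame = ("a", "b", "c")
m1 = {frozenset({"a"}): 0.8, frozenset(frame): 0.2}
m2 = {frozenset({"b"}): 0.8, frozenset(frame): 0.2}
print(dissimilarity(m1, m2, frame))   # high: the sources back different hypotheses
```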

13.
We present an interpretation of belief functions within a pure probabilistic framework, namely as normalized self-conditional expected probabilities, and study their mathematical properties. Interpretations of belief functions appeal to partial knowledge. The self-conditional interpretation does this within the traditional probabilistic framework by considering the surplus belief in an event emerging from a future observation, conditional on the event occurring. Dempster's original interpretation, in contrast, involves partial knowledge of a belief state. The modal interpretation, currently gaining popularity, models the probability of a proposition being believed (or proved, or known). The versatility of the belief function formalism is demonstrated by the fact that it accommodates very different intuitions.

14.
The well-known Fuzzy C-Means (FCM) algorithm for data clustering was extended to the Evidential C-Means (ECM) algorithm in order to work in the belief-functions framework with credal partitions of the data. Depending on the clustering problem, some cluster barycenters given by ECM can become very close to each other, which can seriously degrade ECM's clustering performance. To circumvent this problem, we introduce the notion of an imprecise cluster. The principle of our approach is that objects lying midway between the barycenters of specific classes (clusters) must be committed with equal belief to each specific cluster, instead of being assigned to an imprecise meta-cluster as is done classically in ECM. Outlier objects far away from the centers of two (or more) specific clusters that are hard to distinguish are committed to the imprecise cluster (a disjunctive meta-cluster) composed of these specific clusters. The new Belief C-Means (BCM) algorithm proposed in this paper follows this simple principle. In BCM, the mass of belief of a specific cluster for each object is computed from the distance between the object and the center of the cluster it may belong to; both the distances between the object and the centers of the specific clusters and the distances among these centers are taken into account in determining the mass of belief of the meta-cluster. Unlike ECM, the BCM algorithm does not use the barycenter of the meta-cluster. We also present several examples to illustrate the interest of BCM and to show its main differences with respect to clustering techniques based on FCM and ECM.

15.
For recognition problems in which the class labels of the training patterns are imprecise, an adaptive k-NN classifier based on the transferable belief model (TBM) is proposed. By applying the pignistic transformation, it can conveniently decide the class to which a pattern to be recognized truly belongs, and it uses gradient descent to minimize the error function between the output class labels and the target class labels of the training patterns, enabling adaptive learning of the parameters. Experiments show that the classifier is effective for pattern recognition problems with imprecise training labels and that, compared with the TBM-based k-NN classifier before parameter optimization, it achieves a lower misclassification rate and stronger robustness.

16.
Information Fusion, 2007, 8(1):16-27
The paper develops an approach to joint tracking and classification based on belief functions as understood in the transferable belief model (TBM). The TBM model is identical to the classical model except that all probability functions are replaced by belief functions, which are more flexible for representing uncertainty. It is felt that the tracking phase is well handled by the classical Kalman filter but that the classification phase deserves amelioration. For the tracking phase, we derive the minimal set of assumptions needed in the TBM approach to recover the classical relations. For the classification phase, we distinguish between the observed target behaviors and the underlying target classes, which are usually not in one-to-one correspondence. We feel the results obtained with the TBM approach are more reasonable than those obtained with the corresponding Bayesian classifiers.

17.
A belief classification rule for imprecise data
The classification of imprecise data is a difficult task in general because the different classes can partially overlap; moreover, the available attributes used for the classification are often insufficient to discriminate precisely between objects in the overlapping zones. A credal partition (classification) based on belief functions has already been proposed in the literature for data clustering. It allows objects to belong (with different masses of belief) not only to specific classes, but also to sets of classes called meta-classes, which correspond to the disjunction of several specific classes. In this paper, we propose a new belief classification rule (BCR) for the credal classification of uncertain and imprecise data. Thanks to the introduction of meta-classes, this new BCR approach reduces the misclassification errors for objects that are difficult to classify with conventional methods; objects too far from the others are considered outliers. The basic belief assignment (bba) of an object is computed from the Mahalanobis distance between the object and the center of each specific class, and the credal classification of the object is finally obtained by combining these bba's associated with the different classes. This approach has a relatively low computational burden. Several experiments using both artificial and real data sets are presented at the end of this paper to evaluate and compare the performance of this BCR method with respect to other classification methods.

18.
Automatic scene understanding from multimodal data is a key task in the design of fully autonomous vehicles. The theory of belief functions has proved effective for fusing information from several sensors at the superpixel level. Here, we propose a novel framework, called evidential grammars, which extends stochastic grammars by replacing probabilities with belief functions. This framework allows us to fuse local information with prior and contextual information, also modeled as belief functions. The use of belief functions in a compositional model is shown to allow for better representation of the uncertainty on the priors and for greater flexibility of the model. The relevance of our approach is demonstrated on multimodal traffic scene data from the KITTI benchmark suite.

19.
On the revision of probabilistic beliefs using uncertain evidence
We revisit the problem of revising probabilistic beliefs using uncertain evidence, and report results on several major issues relating to this problem: How should one specify uncertain evidence? How should one revise a probability distribution? How should one interpret informal evidential statements? Should, and do, iterated belief revisions commute? And what guarantees can be offered on the amount of belief change induced by a particular revision? Our discussion focuses on two main methods for probabilistic revision, Jeffrey's rule of probability kinematics and Pearl's method of virtual evidence, which we analyze and unify from the perspective of the questions posed above.
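For orientation, the two revision methods analyzed can be stated in their standard forms, where the partition \(\{E_i\}\) carries the uncertain evidence. Jeffrey's rule, given new probabilities \(q_i\) for the events \(E_i\), sets

\[
P'(A) \;=\; \sum_i P(A \mid E_i)\, q_i ,
\]

while Pearl's virtual evidence, given likelihood ratios \(\lambda_i \propto P(\eta \mid E_i)\) for a virtual observation \(\eta\), sets

\[
P'(A) \;=\; \frac{\sum_i \lambda_i\, P(A \cap E_i)}{\sum_j \lambda_j\, P(E_j)} .
\]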

20.
Since the early development of machines for reasoning and decision making in higher-level information fusion, there has been a need for systematic and reliable evaluation of their performance. Performance evaluation is important for comparing and assessing alternative solutions to real-world problems. In this paper we focus on one aspect of performance assessment for reasoning under uncertainty: the accuracy of the resulting belief (prediction or estimate). We propose an assessment framework based on the assumption that the system under investigation is uncertain only due to stochastic variability (randomness), which is partially known. In this context we formulate a distance measure between the “ground truth” and the output of an automated reasoning system built on one of the non-additive uncertainty formalisms (such as imprecise probability theory, belief function theory, or possibility theory). The proposed assessment framework is demonstrated with a simple numerical example.
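The paper's actual distance measure is not reproduced here; as one plausible instantiation in the belief-function case, the sketch below scores the gap between the ground-truth probability of an event and the belief-plausibility interval produced by the system (zero when the truth falls inside the interval). Names and numbers are assumptions.

```python
def bel_pl(m, event):
    A = frozenset(event)
    bel = sum(v for B, v in m.items() if B and B <= A)   # committed support
    pl = sum(v for B, v in m.items() if B & A)           # compatible support
    return bel, pl

def accuracy_gap(truth_prob, m, event):
    """Zero when the true probability lies inside [Bel, Pl]; else the shortfall."""
    bel, pl = bel_pl(m, event)
    return max(bel - truth_prob, truth_prob - pl, 0.0)

frame = ("h1", "h2")
m = {frozenset({"h1"}): 0.5, frozenset(frame): 0.5}   # system's belief output
print(accuracy_gap(0.7, m, {"h1"}))   # 0.0: truth inside [Bel, Pl] = [0.5, 1.0]
print(accuracy_gap(0.3, m, {"h1"}))   # 0.2: truth falls below Bel = 0.5
```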
