Similar Documents
20 similar documents found (search time: 15 ms).
1.
Combining multiple clusterings using evidence accumulation
We explore the idea of evidence accumulation (EAC) for combining the results of multiple clusterings. First, a clustering ensemble (a set of object partitions) is produced. Given a data set (n objects or patterns in d dimensions), different ways of producing data partitions are: 1) applying different clustering algorithms and 2) applying the same clustering algorithm with different parameter values or initializations. Further, combinations of different data representations (feature spaces) and clustering algorithms can also provide a multitude of significantly different data partitionings. We propose a simple framework for extracting a consistent clustering, given the various partitions in a clustering ensemble. Under the EAC concept, each partition is viewed as independent evidence of data organization, and the individual data partitions are combined, through a voting mechanism, to generate a new n × n similarity matrix between the n patterns. The final data partition of the n patterns is obtained by applying a hierarchical agglomerative clustering algorithm to this matrix. We have developed a theoretical framework for the analysis and evaluation of the proposed clustering combination strategy, based on the concept of mutual information between data partitions. Stability of the results is evaluated using bootstrapping techniques. A detailed discussion of an evidence accumulation-based clustering algorithm, using a split-and-merge strategy built on the k-means clustering algorithm, is presented. Experimental results of the proposed method on several synthetic and real data sets are compared with other combination strategies, and with individual clustering results produced by well-known clustering algorithms.
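The EAC pipeline described above can be sketched in a few lines. This is an illustrative toy, not the authors' implementation: it uses our own plain k-means, and a threshold cut into connected components of the co-association graph stands in for the paper's hierarchical agglomerative step. The run count, k range, and 0.5 threshold are assumed parameter choices.

```python
import numpy as np

def kmeans(X, k, rng, iters=20):
    """Plain k-means with random initialization; returns a label vector."""
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(0)
    return labels

def eac_cluster(X, n_runs=15, k_range=(2, 6), threshold=0.5, seed=0):
    """Evidence accumulation: each partition votes for co-clustered pairs;
    the final partition is a cut of the resulting co-association graph."""
    rng = np.random.RandomState(seed)
    n = len(X)
    coassoc = np.zeros((n, n))
    for _ in range(n_runs):
        k = rng.randint(k_range[0], k_range[1] + 1)  # varied k per run
        labels = kmeans(X, k, rng)
        coassoc += labels[:, None] == labels[None, :]
    coassoc /= n_runs
    # Single-linkage-style cut: connected components of the thresholded graph.
    adj = coassoc >= threshold
    final = -np.ones(n, dtype=int)
    cluster = 0
    for i in range(n):
        if final[i] < 0:
            stack = [i]
            while stack:
                j = stack.pop()
                if final[j] < 0:
                    final[j] = cluster
                    stack.extend(np.nonzero(adj[j] & (final < 0))[0].tolist())
            cluster += 1
    return final
```

On well-separated data, the averaged votes make within-cluster similarities high and between-cluster similarities near zero, so the cut recovers the natural structure even though individual runs used the "wrong" k.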

2.
Dempster's rule plays a central role in the theory of belief functions. However, it assumes the combined bodies of evidence to be distinct, an assumption which is not always verified in practice. In this paper, a new operator, the cautious rule of combination, is introduced. This operator is commutative, associative and idempotent. This latter property makes it suitable for combining belief functions induced by reliable, but possibly overlapping, bodies of evidence. A dual operator, the bold disjunctive rule, is also introduced. This operator is also commutative, associative and idempotent, and can be used to combine belief functions issued from possibly overlapping and unreliable sources. Finally, the cautious and bold rules are shown to be particular members of infinite families of conjunctive and disjunctive combination rules based on triangular norms and conorms.
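To see why idempotence matters, here is a small illustration of Dempster's rule on a two-element frame: combining a mass function with itself (counting the same evidence twice) reinforces it, which is exactly the distortion an idempotent rule avoids. The frozenset encoding and the numbers are ours, purely for illustration.

```python
from itertools import product

def dempster(m1, m2):
    """Dempster's rule of combination for mass functions
    (dict: frozenset of hypotheses -> mass), with normalization."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass falling on the empty set
    norm = 1.0 - conflict
    return {s: w / norm for s, w in combined.items()}

A, B = frozenset({"a"}), frozenset({"a", "b"})
m = {A: 0.6, B: 0.4}   # one body of evidence
mm = dempster(m, m)    # "double counting" the same evidence
# mm[A] = 0.84 > m[A] = 0.6: the rule is not idempotent.
```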

3.
This paper describes conditioned Dempster-Shafer (CDS) theory, a probabilistic calculus for dealing with possibly non-Bayesian evidence when the underlying a priori knowledge base is possibly non-Bayesian. Specifically, we show that the Dempster-Shafer combination operator can be “conditioned” to reflect the influence of any a priori knowledge base which can be modeled by a Dempster-Shafer belief measure. We show that CDS is firmly grounded in probability theory, specifically in the theory of random sets. We also show that it is a generalization of Bayesian theory to the case when both the evidence and the a priori knowledge are ambiguous. We derive the algebraic properties of the theory when the a priori knowledge is assumed fixed. Under this assumption, we also derive the form of CDS in the special case when the fixed a priori knowledge is Bayesian.

4.
Electricity markets depend on upstream energy markets to supply the fuels needed for generation. Since these markets rely on networks, congestion in one can quickly produce changes in another. In this paper we develop a combined partial-equilibrium market model which includes the interactions of natural gas and electricity networks. We apply the model to a stylized representation of Europe's electricity and natural gas markets to illustrate upstream and downstream feedback effects which are not obvious at first sight. We find that both congestion and loop-flow effects in electricity markets affect prices and quantities in markets located far from the initial cause of the market changes.

5.
This paper shows that shared-coin algorithms can be combined to optimize several complexity measures, even in the presence of a strong adversary. Combining the shared coins of Bracha and Rachman (1991) [10] and of Aspnes and Waarts (1996) [7] yields a shared-coin algorithm, and hence a randomized consensus algorithm, with O(n log² n) individual work and O(n² log n) total work, using single-writer registers. This improves upon each of the above shared coins (the former has a high cost for individual work, while the latter reduces it but pays in total work), and is currently the best for this model.

6.
Using practical cases, this article describes the switch configuration requirements and operating procedures involved in merging two (or more) existing local area networks into a single LAN.

7.
Create rich images by blending sketches, paintings, and fashion photographs. Alexis West introduces some highly valuable techniques.

8.
We introduce a quantifier-free set-theoretic language for combining sets with elements in the presence of the cardinality operator. We prove that the language is decidable by providing a combination method specifically tailored to the combination domain of sets, cardinal numbers, and elements. Our method uses as black boxes a decision procedure for the elements and a decision procedure for cardinal numbers. To be correct, our method requires that the theory of elements be stably infinite. However, we show that if we restrict set variables to range over finite sets only, then one can modify our method so that it works even when the theory of the elements is not stably infinite.

9.
The Nelson-Oppen combination method combines decision procedures for first-order theories over disjoint signatures into a single decision procedure for the union theory. To be correct, the method requires that the component theories be stably infinite. This restriction makes the method inapplicable to many interesting theories such as, for instance, theories having only finite models. In this paper we provide a new combination method that can combine any theory that is not stably infinite with another theory, provided that the latter is what we call a shiny theory. Examples of shiny theories include the theory of equality, the theory of partial orders, and the theory of total orders. An interesting consequence of our results is that any decision procedure for the satisfiability of quantifier-free Σ-formulae in a Σ-theory T can always be extended to accept inputs over an arbitrary signature Ω ⊇ Σ.

10.
11.
Combining multiple knowledge bases
Combining the knowledge present in multiple knowledge-base systems into a single knowledge base is discussed. A knowledge-based system can be considered an extension of a deductive database in that it permits function symbols as part of the theory. Alternative knowledge bases that deal with the same subject matter are considered. The authors define the concept of combining the knowledge present in a set of knowledge bases and present algorithms to maximally combine them so that the combination is consistent with respect to the integrity constraints associated with the knowledge bases. For this, the authors define the concept of maximality and prove that the algorithms presented combine the knowledge bases to generate a maximal theory. The authors also discuss the relationships between combining multiple knowledge bases and the view update problem.
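The notion of a maximal consistent combination can be made concrete with a toy sketch. Real knowledge bases here are deductive theories with function symbols; in this illustration facts are plain atoms and each integrity constraint forbids a set of atoms from holding together. The brute-force search and the example facts are entirely ours, not the authors' algorithm.

```python
from itertools import combinations

def maximal_combinations(kbs, constraints):
    """Return all maximal subsets of the union of the knowledge bases
    that violate no integrity constraint (constraint = forbidden atom set)."""
    union = sorted(set().union(*kbs))
    def consistent(s):
        return not any(c <= s for c in constraints)
    maximal = []
    # Search larger subsets first; keep those not contained in a kept one.
    for size in range(len(union), -1, -1):
        for combo in combinations(union, size):
            s = set(combo)
            if consistent(s) and not any(s < m for m in maximal):
                maximal.append(s)
    return maximal

kb1 = {"bird(tweety)", "flies(tweety)"}
kb2 = {"penguin(tweety)"}
ic = [{"penguin(tweety)", "flies(tweety)"}]   # penguins don't fly
result = maximal_combinations([kb1, kb2], ic)
# Two maximal theories: drop flies(tweety) or drop penguin(tweety).
```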

12.
The Nelson–Oppen combination method combines decision procedures for first-order theories over disjoint signatures into a single decision procedure for the union theory. In order to be correct, the method requires that the component theories be stably infinite. This restriction makes the method inapplicable to many interesting theories such as, for instance, theories having only finite models. In this paper, we describe two extensions of the Nelson–Oppen method that address the problem of combining theories that are not stably infinite. In our extensions, the component decision procedures exchange not only equalities between shared variables but also certain cardinality constraints. Applications of our results include the combination of theories having only finite models, as well as the combination of nonstably infinite theories with the theory of equality, the theories of total and partial orders, and the theory of lattices with maximum and minimum. Calogero G. Zarba: work done by this author at Stanford University and later at LORIA and INRIA-Lorraine.

13.
A large number of digitized surface water bodies are automatically classified on the basis of size and shape by performing an opening transformation. In addition, an iterated bisecting process is applied to construct a self-similar size distribution of the water bodies.

14.
Rising complexity of industrial development in the automotive industry is leading to a higher degree of interdisciplinarity, which is especially true in the virtual design area. New methods and solution procedures have to be evaluated and integrated into the overall process. For example, in the car body design process a new topic emerged recently: the multidisciplinary optimization of car bodies with respect to crash and NVH (noise, vibration, and harshness). Because a rigorous evaluation of appropriate numerical algorithms is still missing, an intensive study was carried out at the research center of BMW. The results are summarized in this article. Four benchmarks have been studied: (a) a full vehicle model for NVH analysis, (b) a simplified multidisciplinary problem with a single crash case and linear statics and dynamics, (c) a lateral impact problem for multi-criteria optimization, and finally, (d) a small shape optimization problem was included to demonstrate the potential of transferring the results to the more complex problem of optimization based on real changes in the shape of the structures. Because response surface methods have already been discussed in the literature, and because of their failure in certain industrial cases, the focus was set on the evaluation of stochastic algorithms: simulated annealing and genetic and evolutionary algorithms were tested. Finally, a complete industrial multidisciplinary example from the current development process was studied to validate the results.
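As a reference point for the stochastic algorithms the study evaluates, here is a generic simulated-annealing loop on a toy one-dimensional multimodal objective. The cooling schedule, move size, and objective are illustrative choices of ours, not the benchmark setup of the article.

```python
import math
import random

def simulated_annealing(f, x0, t0=1.0, cooling=0.995, steps=5000, seed=0):
    """Minimize f starting from x0 with Gaussian moves and geometric cooling."""
    rng = random.Random(seed)
    x, fx, t = x0, f(x0), t0
    best_x, best_f = x, fx
    for _ in range(steps):
        cand = x + rng.gauss(0.0, 0.5)            # random neighborhood move
        fc = f(cand)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling                               # geometric cooling
    return best_x, best_f

# Multimodal toy objective: a quadratic basin perturbed by a sine ripple.
x, fx = simulated_annealing(lambda x: (x - 2) ** 2 + math.sin(5 * x), -5.0)
```

The Boltzmann acceptance of uphill moves is what lets the search hop between the local basins that the sine term creates, which a pure descent method cannot do.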

15.
Shavlik, Jude W. Machine Learning, 1994, 14(3): 321–331.
Conclusion: Connectionist machine learning has proven to be a fruitful approach, and it makes sense to investigate systems that combine the strengths of the symbolic and connectionist approaches to AI. Over the past few years, researchers have successfully developed a number of such systems. This article summarizes one view of this endeavor, a framework that encompasses the approaches of several different research groups. This framework (see Figure 1) views the combination of symbolic and neural learning as a three-stage process: (1) the insertion of symbolic information into a neural network, thereby (partially) determining the topology and initial weight settings of the network; (2) the refinement of this network using a numeric optimization method such as backpropagation, possibly under the guidance of symbolic knowledge; and (3) the extraction of symbolic rules that accurately represent the knowledge contained in the trained network. These three components form an appealing, complete picture (approximately correct symbolic information in, more accurate symbolic information out), yet the three stages can be studied independently. In conclusion, the research summarized in this paper demonstrates that combining symbolic and connectionist methods is a promising approach to machine learning.
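Stage (1), inserting symbolic information into a network, can be illustrated with a toy conjunction unit: the propositional rule "a AND b -> c" becomes a sigmoid unit whose weights and bias make it fire only when both antecedents are active. The weight magnitude 4 and the bias formula are illustrative choices of ours, not the article's.

```python
import numpy as np

def rule_to_unit(antecedents, n_inputs, w=4.0):
    """Initial weights/bias for a sigmoid unit encoding a conjunction rule:
    the unit fires only when all antecedent inputs are on."""
    weights = np.zeros(n_inputs)
    weights[antecedents] = w
    bias = -(len(antecedents) - 0.5) * w   # threshold between k-1 and k active
    return weights, bias

def unit_output(x, weights, bias):
    return 1.0 / (1.0 + np.exp(-(weights @ x + bias)))

# Rule over inputs 0 and 1 of a 3-input network; input 2 is irrelevant.
w, b = rule_to_unit([0, 1], n_inputs=3)
```

After this symbolic initialization, the weights would be refined by backpropagation (stage 2) rather than frozen, which is the point of the hybrid framework.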

16.
Confidence Transformation for Combining Classifiers
This paper investigates a number of confidence transformation methods for measurement-level combination of classifiers. Each confidence transformation method is the combination of a scaling function and an activation function. The activation functions correspond to different types of confidences: likelihood (exponential), log-likelihood, sigmoid, and the evidence combination of sigmoid measures. The sigmoid and evidence measures serve as approximations to class probabilities. The scaling functions are derived by Gaussian density modeling, logistic regression with variable inputs, etc. We test the confidence transformation methods in handwritten digit recognition by combining variable sets of classifiers: neural classifiers only, distance classifiers only, strong classifiers, and mixed strong/weak classifiers. The results show that confidence transformation is effective in improving the combination performance in all the settings. Normalizing the class probabilities to sum to one is shown to be detrimental to combination performance. Comparing the scaling functions, the Gaussian method and logistic regression perform well in most cases. Regarding the confidence types, the sigmoid and evidence measures perform well in most cases, and the evidence measure generally outperforms the sigmoid measure. We also show that the confidence transformation methods are highly robust to the validation sample size used in parameter estimation.
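A minimal sketch of the sigmoid confidence type with sum-rule combination: each classifier's raw per-class scores are squashed into (0, 1) confidences (deliberately not normalized to sum to one, in line with the finding above) and then averaged across classifiers. In practice the scaling parameters alpha and beta would be fitted on validation data; the fixed values and example scores here are illustrative assumptions.

```python
import numpy as np

def sigmoid_confidence(scores, alpha=1.0, beta=0.0):
    """Map raw per-class scores to sigmoid confidences in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-(alpha * scores - beta)))

def combine(classifier_scores, alpha=1.0, beta=0.0):
    """Sum-rule combination of sigmoid-transformed scores; returns class index."""
    conf = [sigmoid_confidence(s, alpha, beta) for s in classifier_scores]
    return int(np.argmax(np.mean(conf, axis=0)))

# Two classifiers scoring 3 classes; both lean toward class 1.
scores = [np.array([0.2, 1.5, -0.3]), np.array([0.1, 2.5, 0.0])]
pred = combine(scores)   # class 1 wins under the sum rule
```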

17.
To solve a problem one may need to combine the knowledge of several different experts. It can happen that some of the claims of one or more experts conflict with the claims of other experts. There may be several such points of conflict, and any claim may be involved in several different points of conflict. In that case, the user of the experts' knowledge may prefer a certain claim to another at one conflict point without necessarily preferring that claim at another conflict point. Our work constructs a framework within which the consequences of a set of such preferences (expressed as priorities among sets of statements) can be computed. We give four types of semantics for priorities, three of which are shown to be equivalent to one another. The fourth type of semantics for priorities is shown to be more cautious than the other three. In terms of these semantics for priorities, we give a function for combining knowledge from different sources such that the combined knowledge is conflict-free and satisfies all the priorities. Jack Minker and Shekhar Pradhan were supported in part by National Science Foundation grant IRI-89-16059 and Air Force Office of Scientific Research grant 91-0350. V.S. Subrahmanian was supported in part by Army Research Office grant DAAL-03-92-G-0225, Air Force Office of Scientific Research grant F49620-93-1-0065, and NSF grant IRI-9109755.

18.
An iterative procedure is described as a generalization of Bayes' method for updating an a priori assignment over the power set of the frame of discernment using uncertain evidence. In the context of probability kinematics, the law of commutativity holds and the convergence is well behaved; the probability assignment of each piece of updating evidence is retained. A general assignment method is also discussed for combining evidence without reference to any prior. The methods described here can be used in the field of Artificial Intelligence for common-sense reasoning and, more specifically, for treating uncertainty in Expert Systems. They are also relevant to nonmonotonic reasoning, abduction, and learning theory.
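A single step of updating on uncertain evidence in the spirit of probability kinematics can be sketched as follows: the evidence fixes new probabilities on a partition of the outcomes, and the prior is rescaled within each partition cell. This one-step Jeffrey-style rule is only a simple special case; the paper's iterative procedure over the power set is more general, and the outcome names and numbers below are our illustration.

```python
def jeffrey_update(prior, partition, new_probs):
    """prior: dict outcome -> probability; partition: list of disjoint sets
    of outcomes covering the prior; new_probs: new probability per cell.
    Each cell is assumed to have nonzero prior mass."""
    post = {}
    for cell, q in zip(partition, new_probs):
        mass = sum(prior[w] for w in cell)
        for w in cell:
            post[w] = prior[w] * q / mass   # rescale within the cell
    return post

prior = {"rain": 0.3, "sun": 0.5, "snow": 0.2}
# Uncertain evidence: probability of precipitation rises to 0.8.
part = [{"rain", "snow"}, {"sun"}]
post = jeffrey_update(prior, part, [0.8, 0.2])
# Relative odds within each cell are preserved: rain/snow stays 3:2.
```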

19.
Program and data specialization have always been studied separately, although both are aimed at processing early computations. Program specialization encodes the result of early computations into a new program, while data specialization encodes it into data structures. In this paper, we present an extension of the Tempo specializer which performs both program and data specialization. We show how these two strategies can be integrated in a single specializer. This new kind of specializer provides the programmer with complementary strategies which widen the scope of specialization. We illustrate the benefits and limitations of these strategies and their combination on a variety of programs.
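The contrast between the two strategies can be shown on a toy function: program specialization produces a residual program for a known input, while data specialization precomputes the early results into a data structure and keeps the control flow generic. This hand-written example only illustrates the idea; it is not Tempo's automated output.

```python
def power(x, n):
    """General program: x raised to the n-th power by repeated multiplication."""
    r = 1
    for _ in range(n):
        r *= x
    return r

# Program specialization: n = 3 is known early, so the loop is unrolled
# away into a residual program.
def power_3(x):
    return x * x * x

# Data specialization: the early computation is encoded into a lookup
# table over a small known domain of x, keeping the code generic.
POWER3_TABLE = {x: power(x, 3) for x in range(10)}

def power_3_data(x):
    return POWER3_TABLE[x]
```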

20.
The quest for proved and reliable software goes through theorem provers, which implement proof search. A main trend of the last ten years has been the combination of assisted and automated deduction. Different approaches to this integration are currently being studied, focusing on efficiency and/or reliability. A promising research direction is to provide formal systems, programming languages, and proof environments that support and integrate the two paradigms of computation and deduction. This revised version was published online in June 2006 with corrections to the cover date.


Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号