Similar Literature

20 similar documents found.
1.
A new model for supervised classification based on probabilistic decision graphs is introduced. A probabilistic decision graph (PDG) is a graphical model that efficiently captures certain context-specific independencies that are not easily represented by other graphical models traditionally used for classification, such as Naïve Bayes (NB) or classification trees (CT). This means that the PDG model can capture some distributions with fewer parameters than these classical models. Two approaches for constructing a PDG classifier are proposed: the first constructs the model directly from a dataset of labelled data, while the second transforms a previously obtained Bayesian classifier into a PDG model that can then be refined. Both approaches are compared with a wide range of classical approaches to supervised classification on a number of real-world databases and artificially generated datasets.
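To make the parameter-sharing idea concrete, here is a toy sketch (not the paper's construction algorithm): two outcomes of one variable route to the same child node, so a single conditional table serves both contexts, which is exactly the kind of context-specific independence a tree-shaped model would have to duplicate. All variable names and probabilities are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class PDGNode:
    var: str                                       # variable this node reads
    cpt: dict                                      # value -> probability
    children: dict = field(default_factory=dict)   # value -> child node (may be shared)

# Two outcomes of "weather" route to the SAME child node, so the distribution
# of "traffic" is context-specifically independent of which outcome occurred:
# one table instead of two.
traffic = PDGNode("traffic", {"light": 0.7, "heavy": 0.3})
root = PDGNode("weather", {"sun": 0.5, "rain": 0.3, "snow": 0.2},
               children={"sun": traffic, "rain": traffic,   # shared child
                         "snow": PDGNode("traffic", {"light": 0.2, "heavy": 0.8})})

def joint(node, assignment):
    """P(assignment) accumulated along the decision-graph path."""
    p = node.cpt[assignment[node.var]]
    child = node.children.get(assignment[node.var])
    return p * joint(child, assignment) if child else p

print(joint(root, {"weather": "rain", "traffic": "light"}))   # 0.3 * 0.7
```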

2.
New hyperspectral sensors can collect a large number of spectral bands, which provide the capability to distinguish various objects and materials on the earth. However, the accurate classification of these images is still a big challenge. Previous studies demonstrate the effectiveness of combining spectral data with spatial information for better classification of hyperspectral images. In this article, this approach is followed to propose a novel three-step spectral–spatial method for classification of hyperspectral images. In the first step, Gabor filters are applied for texture feature extraction. In the second step, spectral and texture features are separately classified by a probabilistic Support Vector Machine (SVM) pixel-wise classifier to estimate per-pixel probability, so two probabilities are obtained for each pixel of the image. In the third step, the total probability is calculated by a linear combination of the previous probabilities, in which a control parameter weights the contribution of each. Each pixel is then assigned to the class with the highest total probability. This method is performed both in a multivariate analysis framework (MAF), in which each pixel is represented by a d-dimensional vector (d being the number of spectral or texture features), and in functional data analysis (FDA), in which each pixel is treated as a continuous function. The proposed method is evaluated with different training samples on two hyperspectral data sets. The combination parameter is experimentally obtained for each data set as well as for each training-sample size; it adjusts the weight of the spectral versus texture information in various areas such as forest, agricultural, or urban areas to obtain the best classification accuracy. Experimental results show high performance of the proposed method for hyperspectral image classification. In addition, these results confirm that the proposed method achieves better results in FDA than in MAF. Comparison with some state-of-the-art spectral–spatial classification methods demonstrates that the proposed method can significantly improve classification accuracies.
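A minimal sketch of the three-step pipeline, assuming scikit-learn and scikit-image are available; the arrays (img, X_spec, y) are random placeholders, a single Gabor filter stands in for the full filter bank, and alpha plays the role of the control parameter described above.

```python
import numpy as np
from sklearn.svm import SVC
from skimage.filters import gabor

rng = np.random.default_rng(0)
img = rng.random((32, 32))                 # one band, used for texture extraction
X_spec = rng.random((32 * 32, 10))         # per-pixel spectra (placeholder)
y = rng.integers(0, 3, 32 * 32)            # per-pixel labels (placeholder)

# Step 1: Gabor texture features (one frequency here; the paper uses a bank).
real, imag = gabor(img, frequency=0.3)
X_tex = np.stack([real.ravel(), imag.ravel()], axis=1)

# Step 2: probabilistic SVMs give per-pixel class probabilities for each view.
p_spec = SVC(probability=True).fit(X_spec, y).predict_proba(X_spec)
p_tex = SVC(probability=True).fit(X_tex, y).predict_proba(X_tex)

# Step 3: linear combination; alpha is the control parameter tuned per data set.
alpha = 0.6
labels = np.argmax(alpha * p_spec + (1 - alpha) * p_tex, axis=1)
print(labels[:10])
```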

3.
A probabilistic relaxation model is used to improve maximum likelihood classifications of LANDSAT data of arid and urban areas in and around Al Jahra, Kuwait. The problems of urban pixels, the role of the compatibility coefficients, and the iterations of the model are presented and discussed.
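A minimal sketch of one probabilistic-relaxation iteration under common textbook conventions (the paper's exact update rule and coefficients are not reproduced); the pixel probabilities, compatibility matrix, and neighbourhood below are toy values.

```python
import numpy as np

def relax_step(p, C, neighbors):
    """One relaxation update: each pixel's label probabilities are reweighted
    by the compatibility-weighted support of its neighbours, then renormalised."""
    support = np.zeros_like(p)
    for i in range(len(p)):
        for j in neighbors(i):
            support[i] += C @ p[j]          # support for each label of pixel i
    new = p * support
    return new / new.sum(axis=1, keepdims=True)

# toy example: 3 pixels in a line, 2 classes, compatibilities favouring agreement
p = np.array([[0.6, 0.4], [0.5, 0.5], [0.2, 0.8]])
C = np.array([[1.0, 0.2], [0.2, 1.0]])      # compatibility coefficients
line_neighbors = lambda i: [j for j in (i - 1, i + 1) if 0 <= j < 3]
print(relax_step(p, C, line_neighbors))
```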

4.
5.
Li, Dan; Hu, Disheng; Sun, Yuke; Hu, Yingsong. Multimedia Tools and Applications, 2018, 77(21): 28417–28440.
In this paper, a texture probabilistic grammar is defined for the first time. We have developed an algorithm to obtain the 3D information in a 2D scene by training...

6.
Geometric constraint satisfaction using optimization methods
The numerical approach to solving geometric constraint problems is indispensable for building a practical CAD system. The most commonly used numerical method is the Newton–Raphson method. It is fast, but suffers from instability: it requires good initial values. To overcome this problem, the homotopy method has recently been proposed and tested; reports indicate that it generally behaves much better in terms of stability. In this paper we apply numerical optimization to the geometric constraint solving problem. Experimental results from our implementation show that this method is also much less sensitive to the initial values. A further distinctive advantage is that under- and over-constrained problems can be handled naturally and efficiently. We also give many instructive examples to illustrate these advantages.
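A minimal sketch of the optimization approach, assuming SciPy: each geometric constraint becomes a residual, and a least-squares solver minimises the sum of squares, which tolerates rough initial values far better than a bare Newton–Raphson iteration. The specific constraints (a pinned point, a distance, a vertical offset) are invented for illustration.

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(v):
    ax, ay, bx, by = v
    return [
        ax, ay,                               # point A pinned at the origin
        np.hypot(bx - ax, by - ay) - 5.0,     # distance(A, B) = 5
        by - ay - 3.0,                        # vertical offset of B above A = 3
    ]

sol = least_squares(residuals, x0=[1.0, 1.0, 1.0, 1.0])
print(sol.x)   # converges to B = (4, 3) from a rough starting guess
```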

7.
We propose an information filtering system based on a probabilistic model. We assume that a document consists of words that occur according to a probability distribution, and regard a document as a sample drawn from that distribution. In this article, we adopt a multinomial distribution and represent a document as a probability distribution whose random values are the words appearing in it. When an information filtering system selects information, it uses the similarity between the user's interests (a user profile) and a document. Since our proposed system is built on a probabilistic model, this similarity is defined using the Kullback–Leibler divergence. To create the user profile, we must optimize the Kullback–Leibler divergence; since it is a nonlinear function, we use a genetic algorithm for the optimization. We carry out experiments and confirm the effectiveness of the proposed method. This work was presented in part at the 10th International Symposium on Artificial Life and Robotics, Oita, Japan, February 4–6, 2005.
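A minimal sketch of the similarity computation only (the genetic-algorithm optimization of the profile is omitted), with invented toy documents: each document becomes a smoothed multinomial word distribution, and documents are compared by Kullback–Leibler divergence.

```python
import numpy as np
from collections import Counter

def word_dist(doc, vocab, eps=1e-9):
    """Smoothed multinomial distribution of a document over a fixed vocabulary."""
    counts = Counter(doc.split())
    p = np.array([counts[w] for w in vocab], dtype=float) + eps  # smooth zeros
    return p / p.sum()

def kl(p, q):
    """Kullback-Leibler divergence D(p || q)."""
    return float(np.sum(p * np.log(p / q)))

profile = "robot learning robot control"
doc = "robot control systems and learning"
vocab = sorted(set((profile + " " + doc).split()))
print(kl(word_dist(profile, vocab), word_dist(doc, vocab)))  # lower = more similar
```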

8.
We combine the concept of evolutionary search with the systematic search concepts of arc revision and hill climbing to form a hybrid system that quickly finds solutions to static and dynamic constraint satisfaction problems (CSPs). We present the results of two experiments. In the first, we show that our evolutionary hybrid outperforms a well-known hill climber, the iterative descent method (IDM), on a test suite of 750 randomly generated static CSPs. These results show the existence of a "mushy region" containing a phase transition between CSPs based on constraint networks that have one or more solutions and those based on networks with no solution. In the second experiment, we use a test suite of 250 additional randomly generated CSPs to compare two approaches for solving CSPs. In the first method, all the constraints of a CSP are known by the hybrid at run-time; we refer to this as the static method. In the second, only half of the constraints of a CSP are known at run-time. Each time our hybrid system discovers a solution that satisfies all of the constraints of the current network, one additional constraint is added; this process of incrementally adding constraints continues until all the constraints of the CSP are known to the algorithm or until the maximum number of individuals has been created. We refer to this second method as the dynamic method. Our results show that the hybrid evolutionary search performs exceptionally well in the presence of dynamic (incremental) constraints, and they also illuminate a potential hazard in solving dynamic CSPs.
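For flavour, here is a minimal sketch of the hill-climbing ingredient of such a hybrid, written as a generic min-conflicts search; the evolutionary and arc-revision components, and the IDM baseline itself, are not reproduced. The toy problem is a 3-colouring CSP.

```python
import random

def min_conflicts(variables, domains, conflicts, max_steps=10_000):
    """conflicts(var, val, assignment) -> number of constraints violated."""
    assignment = {v: random.choice(domains[v]) for v in variables}
    for _ in range(max_steps):
        bad = [v for v in variables if conflicts(v, assignment[v], assignment) > 0]
        if not bad:
            return assignment                 # all constraints satisfied
        v = random.choice(bad)                # repair a random conflicted variable
        assignment[v] = min(domains[v], key=lambda x: conflicts(v, x, assignment))
    return None

# toy binary CSP: three variables, all pairs must differ (graph colouring)
vs = ["a", "b", "c"]
ds = {v: [0, 1, 2] for v in vs}
conf = lambda v, x, a: sum(1 for u in vs if u != v and a[u] == x)
print(min_conflicts(vs, ds, conf))
```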

9.
In port container terminals, efficient scheduling of the operators of handling equipment such as container cranes, yard cranes, and yard trucks is important. Because of many complicated constraints, finding a feasible solution, as opposed to the optimal solution, within a reasonable amount of computing time can be considered satisfactory from a practical point of view. The major constraints include: restrictions on the minimum workforce assigned to each time slot, the maximum total operating time per operator per shift, the minimum and maximum consecutive operating times for an operator, the types of equipment that can be assigned to each operator, and the available time slots for each operator or piece of equipment. The operator-scheduling problem is defined as a constraint-satisfaction problem, and its solution is obtained using commercial software. An actual problem, collected from a container terminal in Pusan, Korea, is solved through the solution procedure proposed in this study.
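A minimal sketch of how the total-time and consecutive-time constraints on a single operator's shift might be checked; the numeric limits are assumptions, not the terminal's actual rules.

```python
def feasible(slots, max_total=8, min_consec=1, max_consec=4):
    """slots: list of 0/1 per time slot (1 = operating). Check shift limits."""
    if sum(slots) > max_total:                # maximum total operating time
        return False
    run = 0
    for s in slots + [0]:                     # sentinel flushes the final run
        if s:
            run += 1
            if run > max_consec:              # maximum consecutive operating time
                return False
        elif run:
            if run < min_consec:              # minimum consecutive operating time
                return False
            run = 0
    return True

print(feasible([1, 1, 1, 0, 1, 1, 0, 0]))    # True under the assumed limits
```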

10.
Architectural floor-plan layout design is the activity in which architects and designers conceptually combine design units such as rooms or compartments; at the end of this activity, they deliver precise geometric schemas as solutions to particular problems. More research on this topic is needed to develop productive tools. The authors propose orthogonal compartment placement (OCP) as a new approach to this activity. OCP comprises a problem formulation and a solution method in which qualitative and quantitative knowledge are combined. Topological knowledge underlies human spatial reasoning, and computers can adequately perform repetitive topological reasoning. We believe that OCP is the first approach in CAAD to incorporate a full relational algebra to generate floor-plan layouts. Based on block algebra (BA) and constraint satisfaction (CS), OCP can generate candidate solutions that correspond to distinct topological options. The analysis of a case study using a prototype tool is included.
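A minimal sketch of the block-algebra idea: a 2-D placement relation decomposes into Allen-style interval relations applied independently on each axis. Only a few of Allen's thirteen relations are spelled out, and the rectangles are toy values.

```python
def allen(a, b):
    """A few of Allen's interval relations between intervals a and b."""
    (s1, e1), (s2, e2) = a, b
    if e1 < s2: return "before"
    if e1 == s2: return "meets"
    if s1 == s2 and e1 == e2: return "equals"
    if s1 >= s2 and e1 <= e2: return "during"
    if s1 < s2 < e1 < e2: return "overlaps"
    return "other"   # remaining relations elided in this sketch

def block_relation(rect1, rect2):
    """rect = ((x0, x1), (y0, y1)); a block relation is one interval
    relation per axis, which is what block algebra constrains."""
    return allen(rect1[0], rect2[0]), allen(rect1[1], rect2[1])

# two compartments side by side, sharing the same vertical extent
print(block_relation(((0, 2), (0, 2)), ((3, 5), (0, 2))))  # ('before', 'equals')
```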

11.
Texture classification is an important problem in image analysis. In the present study, an efficient strategy for classifying texture images is introduced and examined within a distributional-statistical framework. Our approach incorporates the multivariate Wald–Wolfowitz test (WW-test), a non-parametric statistical test that measures the similarity between two different sets of multivariate data, utilized here for comparing texture distributions. By summarizing the texture information using standard feature-extraction methodologies, the similarity measure provides a comprehensive, graph-theoretic estimate of the match between different images. The proposed "distributional metric" is shown to handle efficiently the dimensionality of the texture space and the limited sample size drawn from a given image. Experimental results on a typical texture database clearly demonstrate the effectiveness of our approach and its superiority over other well-established metrics of texture-distribution (dis)similarity. In addition, its performance is used to evaluate several approaches to texture representation. Even though the classification results are obtained on grayscale images, a direct extension to color-based ones is straightforward.

Vasileios K. Pothos received the B.Sc. degree in Physics in 2004 and the M.Sc. degree in Electronics and Information Processing in 2006, both from the University of Patras (UoP), Greece. He is currently a Ph.D. candidate in image processing at the Electronics Laboratory in the Department of Physics, UoP, Greece. His main research interests include image processing, pattern recognition, and multimedia databases.

Dr. Christos Theoharatos received the B.Sc. degree in Physics in 1998, the M.Sc. degree in Electronics and Computer Science in 2001, and the Ph.D. degree in Image Processing and Multimedia Retrieval in 2006, all from the University of Patras (UoP), Greece. He has actively participated in several national research projects and is currently a postdoctoral researcher at the Electronics Laboratory (ELLAB), Electronics and Computer Division, Department of Physics, UoP. Since the academic year 2002 he has also taught, at the rank of lecturer, in the Department of Electrical Engineering of the Technological Institute of Patras. His main research interests include pattern recognition, multimedia databases, image processing and computer vision, data mining, and graph theory.

Prof. Evangelos Zygouris received the B.Sc. degree in Physics in 1971 and the Ph.D. degree in Digital Filters and Microprocessors in 1984, both from the University of Patras (UoP), Greece. He is currently an Associate Professor at the Electronics Laboratory (ELLAB), Department of Physics, UoP, where he teaches at both the undergraduate and postgraduate levels. He has published papers on digital signal and image processing, digital system design, speech-coding systems, and real-time processing. His main research interests include digital signal and image processing, DSP system design, micro-controllers, micro-processors, and DSPs using VHDL.

Prof. George Economou received the B.Sc. degree in Physics from the University of Patras (UoP), Greece, in 1976, the M.Sc. degree in Microwaves and Modern Optics from University College London in 1978, and the Ph.D. degree in Fiber Optic Sensor Systems from the University of Patras in 1989. He is currently an Associate Professor at the Electronics Laboratory (ELLAB), Department of Physics, UoP, where he teaches at both the undergraduate and postgraduate levels. He has published papers on non-linear signal and image processing, fuzzy image processing, multimedia databases, data mining, and fiber optic sensors, and has served as a referee for many journals, conferences, and workshops. His main research interests include signal and image processing, computer vision, pattern recognition, and optical signal processing.
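A minimal sketch of the multivariate WW-test statistic as it is usually computed, assuming SciPy: build the minimal spanning tree over the pooled samples and count edges that join points from different samples (few cross-sample edges indicate differing distributions). The Gaussian samples here are placeholders for texture feature sets.

```python
import numpy as np
from scipy.spatial.distance import squareform, pdist
from scipy.sparse.csgraph import minimum_spanning_tree

def ww_cross_edges(X, Y):
    """Count MST edges joining points of different samples (the WW statistic's core)."""
    pooled = np.vstack([X, Y])
    labels = np.array([0] * len(X) + [1] * len(Y))
    mst = minimum_spanning_tree(squareform(pdist(pooled))).tocoo()
    return int(np.sum(labels[mst.row] != labels[mst.col]))

rng = np.random.default_rng(0)
same = ww_cross_edges(rng.normal(0, 1, (50, 4)), rng.normal(0, 1, (50, 4)))
diff = ww_cross_edges(rng.normal(0, 1, (50, 4)), rng.normal(3, 1, (50, 4)))
print(same, diff)   # many cross edges vs. very few
```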

12.
This paper proposes a simple yet effective novel classifier-fusion strategy for multi-class texture classification. The resulting classification framework is...

13.
14.
Data mining techniques often require the solution of optimization problems. Supervised classification, and in particular support vector machines, can be seen as a paradigmatic instance. In this paper, some links between mathematical optimization methods and supervised classification are emphasized. It is shown that many different areas of mathematical optimization play a central role in off-the-shelf supervised classification methods. Moreover, mathematical optimization turns out to be extremely useful for addressing important issues in classification, such as identifying relevant variables, improving the interpretability of classifiers, or dealing with vagueness/noise in the data.
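As one concrete instance of that link, the soft-margin SVM training problem is itself a convex quadratic program:

```latex
\min_{w,\,b,\,\xi}\; \frac{1}{2}\|w\|^2 + C \sum_{i=1}^{n} \xi_i
\quad \text{s.t.}\quad y_i\,(w^\top x_i + b) \ge 1 - \xi_i,\;\; \xi_i \ge 0 .
```

The regularization parameter C trades margin width against training error, one of the optimization levers that mathematical programming contributes to classification.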

15.

This study presents an alternative way of classifying the different productive items of a company. A fuzzy model for the magnitudes involved (demand and cost) is described. This model contrasts with the classic Pareto (ABC) classification, which ranks productive items according to their importance in terms of frequency and costs. Whereas rankings obtained using the classical method are based on information about costs and demand over a past period, the new method allows new fuzzy information about the future to be included, thus allowing stricter control of the fuzzy "A-items" that result from the new classification. Rankings comparing a probabilistic model and its fuzzy counterpart are also provided.
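A minimal sketch contrasting the two ingredients: the classical ABC ranking by annual value, and a triangular fuzzy number defuzzified by its centroid as a stand-in for the fuzzy demand model. Thresholds, item names, and numbers are all invented.

```python
def abc_rank(items):
    """items: {name: (annual_demand, unit_cost)} -> ABC class per item."""
    ranked = sorted(items, key=lambda k: items[k][0] * items[k][1], reverse=True)
    total = sum(d * c for d, c in items.values())
    cum, classes = 0.0, {}
    for name in ranked:
        cum += items[name][0] * items[name][1] / total
        # assumed cut-offs: top 80% of value = A, next 15% = B, rest = C
        classes[name] = "A" if cum <= 0.8 else "B" if cum <= 0.95 else "C"
    return classes

def defuzzify(low, mode, high):
    """Centroid of a triangular fuzzy number, e.g. a fuzzy demand forecast."""
    return (low + mode + high) / 3

items = {"bolt": (1000, 0.1), "motor": (50, 120.0), "panel": (5, 900.0)}
print(abc_rank(items))
print(defuzzify(40, 50, 70))   # fuzzy next-period demand estimate for "motor"
```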

16.
Object and texture classification using higher order statistics
The problem of the detection and classification of deterministic objects and random textures in a noisy scene is discussed. An energy detector is developed in the cumulant domain by exploiting the noise insensitivity of higher order statistics. An efficient implementation of this detector is described, using matched filtering. Its performance is analyzed using asymptotic distributions in a binary hypothesis-testing framework. The object and texture discriminant functions are minimum-distance classifiers in the cumulant domain and can be efficiently implemented using a bank of matched filters. They are immune to additive Gaussian noise and insensitive to object shifts. Important extensions, which can handle object rotation and scaling, are also discussed. An alternative texture classifier is derived from a maximum-likelihood (ML) viewpoint and is statistically efficient at the expense of complexity. The application of these algorithms to the texture-modeling problem is indicated, and consistent parameter estimates are obtained.
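A minimal sketch of a minimum-distance classifier in the cumulant domain, assuming SciPy's k-statistic (unbiased cumulant) estimators; it illustrates the Gaussian-noise immunity claimed above, since cumulants of order greater than two vanish for Gaussian noise. The distributions and sample sizes are toy choices.

```python
import numpy as np
from scipy.stats import kstat

def cumulant_features(x):
    """3rd- and 4th-order cumulants, estimated by unbiased k-statistics."""
    return np.array([kstat(x, 3), kstat(x, 4)])

rng = np.random.default_rng(1)
classes = {"uniform": rng.uniform(-1, 1, 5000), "laplace": rng.laplace(0, 1, 5000)}
prototypes = {k: cumulant_features(v) for k, v in classes.items()}

# a Laplace sample buried in additive Gaussian noise: higher-order cumulants
# of the noise are zero, so the cumulant-domain features survive
test = rng.laplace(0, 1, 2000) + rng.normal(0, 0.5, 2000)
f = cumulant_features(test)
print(min(prototypes, key=lambda k: np.linalg.norm(f - prototypes[k])))  # "laplace"
```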

17.
A new method using fuzzy uncertainty, which measures the uncertainty of uniform surfaces in an image, is proposed for texture analysis. A grey-scale image can be transformed into a fuzzy image by the uncertainty definition. The distribution of memberships in the measured fuzzy image, denoted the fuzzy uncertainty texture spectrum (FUTS), is used as the texture feature for texture analysis. To evaluate the performance of the proposed method, supervised texture classification and rotated-texture classification are applied. Experimental results reveal high classification accuracy and show that the proposed method is a good tool for texture analysis.
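A minimal sketch of the general idea, not the paper's exact membership function: each pixel's membership reflects how uniform its 3x3 neighbourhood is, and the histogram of memberships serves as the spectrum-style feature. The membership scale delta and the bin count are assumptions.

```python
import numpy as np

def fuzzy_uncertainty_spectrum(img, bins=16, delta=32.0):
    """img: 2-D uint8 array; membership ~ uniformity of each 3x3 neighbourhood."""
    img = img.astype(float)
    pad = np.pad(img, 1, mode="edge")
    # mean absolute deviation of each pixel from its 3x3 neighbourhood
    nb = [pad[i:i + img.shape[0], j:j + img.shape[1]]
          for i in range(3) for j in range(3)]
    mad = np.mean([np.abs(n - img) for n in nb], axis=0)
    membership = np.clip(1.0 - mad / delta, 0.0, 1.0)   # 1 = perfectly uniform
    hist, _ = np.histogram(membership, bins=bins, range=(0, 1), density=True)
    return hist                                          # the texture feature

rng = np.random.default_rng(0)
print(fuzzy_uncertainty_spectrum(rng.integers(0, 256, (64, 64))).round(2))
```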

18.
19.
Self-care problems classification is one of the important challenges for occupational therapists. The extent and variety of the disorders involved make the classification process complex and time-consuming. To overcome this challenge, this research proposes an expert model based on a Probabilistic Neural Network (PNN) and a Genetic Algorithm (GA) for classifying the self-care problems of children with physical and motor disabilities. In this model, the PNN is employed as the classifier and the GA is applied for feature selection. The PNN is trained on a standard ICF-CY dataset. Based on ICF-CY, occupational therapists must evaluate many features to diagnose self-care problems, and in their experience these features affect classification to differing degrees. Hence, the GA is employed to select the relevant and important features. Since classification rules are important for occupational therapists, self-care classification rules are additionally extracted using the CART algorithm. The experimental results show that, by using the feature-selection algorithm, the accuracy and time complexity of classification are improved in comparison with other models. The proposed model can classify children's self-care problems with 94.28% accuracy using only 16.5% of all features.
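A minimal sketch of the PNN half, which is essentially a Parzen-window (Gaussian-kernel) classifier; the GA is represented only by an assumed, fixed feature mask, and the data are synthetic rather than the ICF-CY dataset.

```python
import numpy as np

def pnn_predict(X_train, y_train, X_test, sigma=0.5):
    """PNN as a Parzen-window classifier: per-class average Gaussian kernel."""
    preds = []
    classes = np.unique(y_train)
    for x in X_test:
        d2 = np.sum((X_train - x) ** 2, axis=1)
        k = np.exp(-d2 / (2 * sigma ** 2))                  # kernel per pattern unit
        scores = [k[y_train == c].mean() for c in classes]  # summation layer
        preds.append(classes[int(np.argmax(scores))])       # decision layer
    return np.array(preds)

rng = np.random.default_rng(0)
X = rng.normal(0, 1, (60, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
mask = np.array([1, 1, 0, 0], dtype=bool)   # stand-in for a GA-selected feature mask
print(pnn_predict(X[:, mask], y, X[:5, mask]))
```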

20.
Backtracking and random constraint satisfaction
The average running time used by backtracking on random constraint satisfaction problems is studied. This time is polynomial when the ratio of constraints to variables is large, and exponential when the ratio is small. As the number of variables goes to infinity, whether the average time is exponential or polynomial depends on the number of variables per constraint, the number of values per variable, and the probability that a random setting of the variables satisfies a constraint. A method for computing the curve that separates polynomial from exponential time is given, along with several methods for approximating the curve. The version of backtracking studied finds all solutions to a problem, so the running time is exponential whenever the number of solutions per problem is exponential. For small values of the probability, the curve separating exponential from polynomial average running time coincides with the curve separating an exponential average number of solutions from a polynomial number; for larger probabilities the two curves diverge. Random problems similar to those that arise in understanding line drawings with shadows require mildly exponential time when solved by simple backtracking; slightly more sophisticated algorithms (such as constraint propagation combined with backtracking) should be able to solve them rapidly.
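A minimal sketch of the kind of experiment behind such an analysis: simple backtracking that enumerates all solutions of random binary CSPs while counting search nodes, with p as the probability that a random value pair is allowed by a constraint. All parameters are illustrative.

```python
import itertools, random

def count_nodes(n_vars, n_vals, constraints):
    """constraints: {(i, j): set of allowed (vi, vj) pairs, i < j}.
    Returns (search nodes visited, number of solutions found)."""
    nodes = sols = 0
    def extend(assignment):
        nonlocal nodes, sols
        nodes += 1
        i = len(assignment)
        if i == n_vars:
            sols += 1
            return
        for v in range(n_vals):
            if all((assignment[j], v) in allowed
                   for (j, k), allowed in constraints.items() if k == i):
                extend(assignment + [v])
    extend([])
    return nodes, sols

random.seed(0)
n, d, p = 8, 3, 0.6   # variables, values per variable, constraint looseness
cons = {(i, j): {t for t in itertools.product(range(d), repeat=2)
                 if random.random() < p}
        for i in range(n) for j in range(i + 1, n)}
print(count_nodes(n, d, cons))
```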
