Similar Literature
15 similar documents found.
1.
The control state reachability problem is decidable for well-structured infinite-state systems like (Lossy) Petri Nets, Vector Addition Systems, and broadcast protocols. An abstract algorithm that solves the problem is the backward reachability algorithm of [1, 21]. The algorithm computes the closure of the predecessor operator with respect to a given upward-closed set of target states. When applied to this class of verification problems, symbolic model checkers based on constraints like [7, 26] suffer from the state explosion problem. In order to tackle this problem, in [13] we introduced a new data structure, called covering sharing trees, to represent in a compact way collections of infinite sets of system configurations. In this paper, we will study the theoretical complexity of the operations over covering sharing trees needed in symbolic model checking. We will also discuss several optimizations that can be used when dealing with Petri Nets. Among them, in [14] we introduced a new heuristic rule based on structural properties of Petri Nets that can be used to efficiently prune the search during symbolic backward exploration. The combination of these techniques allowed us to turn the abstract algorithm of [1, 21] into a practical method. We have evaluated the method on several finite-state and infinite-state examples taken from the literature [2, 18, 20, 30]. In this paper, we will compare the results we obtained in our experiments with those obtained using other finite-state and infinite-state verification tools.
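Below is a minimal sketch, for illustration only, of the abstract backward reachability loop described above for a Petri net / vector addition system, with upward-closed sets represented by their finite sets of minimal markings; it does not use the paper's covering sharing trees or structural pruning heuristics, and the toy net and target marking are made up.

```python
# Hedged sketch: backward reachability over upward-closed sets of markings,
# represented by their minimal elements (not the paper's covering sharing trees).
# A transition is a pair (pre, delta): it is enabled at m when m >= pre
# componentwise, and firing it yields m + delta.

def covers(a, b):
    """True if marking a covers marking b componentwise (a >= b)."""
    return all(x >= y for x, y in zip(a, b))

def minimize(markings):
    """Keep only the minimal markings (a finite basis of an upward-closed set)."""
    basis = []
    for m in markings:
        if any(covers(m, b) for b in basis):
            continue                                  # m is already covered
        basis = [b for b in basis if not covers(b, m)] + [m]
    return basis

def pre_min(m, pre, delta):
    """Minimal predecessor of the upward closure of m via one transition:
    m' >= pre (enabled) and m' + delta >= m, i.e. m' >= max(pre, m - delta)."""
    return tuple(max(p, t - d) for p, t, d in zip(pre, m, delta))

def backward_reach(targets, transitions):
    """Closure of the predecessor operator w.r.t. an upward-closed target set."""
    basis = minimize([tuple(t) for t in targets])
    while True:
        preds = [pre_min(m, pre, delta) for m in basis for pre, delta in transitions]
        merged = minimize(basis + preds)
        if set(merged) == set(basis):                 # fixpoint reached
            return basis
        basis = merged

# Toy 2-place net (hypothetical): one transition moves a token from p0 to p1;
# the target "p1 holds at least 2 tokens" is the upward closure of (0, 2).
print(backward_reach([(0, 2)], [((1, 0), (-1, 1))]))
```

Checking whether a given initial marking can reach the target then reduces to testing whether it covers one of the returned minimal markings.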

2.
SAR image denoising and enhancement based on local image geometric structure
Lu Dan, Tang Ping, Guo Tong. Application Research of Computers, 2009, 26(12): 4841-4843
This paper studies anisotropic diffusion filtering for SAR image denoising based on the local geometric structure of the image. It first reviews the PM model, the Weickert model, and Tschumperlé's trace-based model of anisotropic diffusion filtering, and points out that the trace-based model performs oriented diffusion according to the local geometric structure of the image, with the degree of diffusion determined by a diffusivity function, so the diffusion process is controllable and intuitively meaningful. Then, following the principles for constructing diffusion coefficients, a new diffusivity function that supports image enhancement and allows the diffusion amplitude to be adjusted is constructed and applied to SAR image denoising. Experimental results show that this function not only effectively suppresses speckle noise but also preserves and enhances edge details, achieving satisfactory results.
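As a point of reference for the models reviewed above, here is a minimal numpy sketch of the classical Perona–Malik (PM) diffusion step; the paper's trace-based model and its new adjustable diffusivity function are not reproduced, and the exponential diffusivity and the synthetic speckled image below are only illustrative assumptions.

```python
import numpy as np

def pm_diffusion(img, n_iter=30, kappa=30.0, dt=0.2):
    """Classical Perona-Malik anisotropic diffusion (the PM model, not the
    paper's new diffusivity): kappa sets the edge threshold, dt the time step."""
    u = img.astype(float).copy()
    g = lambda d: np.exp(-(d / kappa) ** 2)   # diffusivity: small across strong edges
    for _ in range(n_iter):
        # differences toward the four neighbours (wrap-around boundaries, for brevity)
        d1 = np.roll(u, -1, axis=0) - u
        d2 = np.roll(u,  1, axis=0) - u
        d3 = np.roll(u, -1, axis=1) - u
        d4 = np.roll(u,  1, axis=1) - u
        u += dt * (g(d1) * d1 + g(d2) * d2 + g(d3) * d3 + g(d4) * d4)
    return u

# usage on a synthetic speckled "SAR-like" image
noisy = np.random.rayleigh(scale=10.0, size=(128, 128))
denoised = pm_diffusion(noisy)
```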

3.
A dataset of spectral signatures (leaf level) of tropical dry forest trees and lianas and an airborne hyperspectral image (crown level) are used to test three hyperspectral data reduction techniques (principal component analysis, forward feature selection, and wavelet energy feature vectors) along with pattern recognition classifiers to discriminate between the spectral signatures of lianas and trees. It was found that, at the leaf level, the forward waveband selection method had the best results, followed by the wavelet energy feature vector and a form of principal component analysis. For the same dataset, our results indicate that none of the pattern recognition classifiers performed best across all reduction techniques, and that none of the parametric classifiers had the overall lowest training and testing errors. At the crown level, in addition to higher testing error rates (7%), it was found that there was no optimal data reduction technique. The significant wavebands were also found to differ between the leaf and crown levels. At the leaf level, the visible region of the spectrum was the most important for discriminating between lianas and trees, whereas at the crown level the shortwave infrared was also important in addition to the visible and near infrared.

4.
Recently, there has been increasing development of positioning technology, which enables us to collect large-scale trajectory data for moving objects. Efficient processing and analysis of massive trajectory data has thus become an emerging and challenging task for both researchers and practitioners. Therefore, in this paper, we propose an efficient data processing framework for mining massive trajectory data. This framework includes three modules: (1) a data distribution module, (2) a data transformation module, and (3) a high-performance I/O module. Specifically, we first design a two-step consistent hashing algorithm, which takes into account load balancing, data locality, and scalability, for the data distribution module. In the data transformation module, we present a parallel strategy for a linear referencing algorithm with reduced subtask coupling, easily implemented parallelization, and low communication cost. Moreover, we propose a compression-aware I/O module to improve processing efficiency. Finally, we conduct a comprehensive performance evaluation on a synthetic dataset (1.114 TB) and a real-world taxi GPS dataset (578 GB). The experimental results demonstrate the advantages of the proposed framework.
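As an illustration of the data distribution idea, the sketch below shows plain consistent hashing with virtual nodes for routing trajectory records to workers; the paper's two-step variant (which additionally balances load, locality, and scalability) is not reproduced, and the worker names, replica count, and record key are made up.

```python
import bisect
import hashlib

class ConsistentHashRing:
    """Plain consistent hashing with virtual nodes (not the paper's two-step variant)."""

    def __init__(self, nodes, replicas=64):
        self.replicas = replicas
        self._keys = []     # sorted hash positions on the ring
        self._ring = {}     # hash position -> physical node
        for node in nodes:
            self.add(node)

    def _hash(self, key):
        return int(hashlib.md5(key.encode()).hexdigest(), 16)

    def add(self, node):
        for i in range(self.replicas):
            h = self._hash(f"{node}#{i}")   # virtual node
            bisect.insort(self._keys, h)
            self._ring[h] = node

    def get(self, key):
        """Return the worker responsible for a trajectory record key."""
        h = self._hash(key)
        idx = bisect.bisect(self._keys, h) % len(self._keys)
        return self._ring[self._keys[idx]]

# usage: route a (hypothetical) trajectory point by its object id and timestamp
ring = ConsistentHashRing(["worker-0", "worker-1", "worker-2"])
print(ring.get("taxi_10086:2013-09-01T08:00:00"))
```

Adding or removing a worker only remaps the keys that fall between neighbouring ring positions, which is what makes the scheme attractive for scalable trajectory data distribution.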

5.
Yu Ge, Feng Shan. Journal of Computer Applications, 2016, 36(6): 1645-1649
To address the application of scheduling algorithms that guarantee the temporal consistency of real-time data objects in soft real-time database systems, a statistics-based optimized deferrable scheduling (SDS-OPT) algorithm is proposed. First, existing algorithms are analyzed and compared in terms of schedulability, quality of service (QoS), and workload, and the necessity of optimizing them is pointed out. Then, the steepest descent method is used to raise the baseline value for screening job execution times, which increases the number of schedulable jobs of real-time update transactions and thus maximizes the QoS of temporal consistency for real-time data objects. Finally, the performance of the proposed algorithm is compared with that of existing algorithms in terms of workload and QoS. Simulation results show that, compared with the existing deferrable scheduling with fixed priority (DS-FP) algorithm and the statistical, non-deterministic deferrable scheduling (DS-PS) algorithm, the proposed algorithm guarantees the temporal consistency of real-time data objects while reducing the workload and noticeably improving QoS.

6.
Pressure mapping smart textile is a new type of sensing modality that transforms the pressure distribution over a surface into a digital "image" or "video". It has rich application scenarios in Human Activity Recognition (HAR), because all human activities are linked with force changes over certain surfaces. To speed up its application exploration, we propose a toolkit named LwTool for the data processing, which includes: (a) a feature library containing 1830 ready-to-use temporal and spatial features, and (b) a hierarchical feature selection framework that automatically picks out the best features for a new application from the feature library. As real-time processing capability is important for instant user feedback, we emphasize not only good recognition results but also reduced time cost when selecting features. Our library and algorithms are validated on Smart-Toy and Smart-Bedsheet applications: an 89.7% accuracy for Smart-Toy and an 83.8% accuracy for Smart-Bedsheet can be achieved (10-fold cross-validation) using our feature library. Adopting the feature selection algorithm, the processing speed is increased by more than 3 times while maintaining high accuracy for both applications. We believe our method can be a general and powerful toolkit for building real-time recognition software systems for pressure mapping smart textiles.
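The sketch below illustrates the general idea of forward feature selection with cross-validation over a precomputed feature matrix; it is not LwTool's hierarchical framework or its 1830-feature library, and the k-NN classifier and the random "pressure frame" features are placeholders.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def greedy_forward_selection(X, y, max_features=10, cv=10):
    """Greedily add the feature that most improves cross-validated accuracy
    (a generic sketch, not LwTool's hierarchical selection)."""
    selected, best_score = [], 0.0
    remaining = list(range(X.shape[1]))
    while remaining and len(selected) < max_features:
        scored = []
        for f in remaining:
            acc = cross_val_score(KNeighborsClassifier(n_neighbors=3),
                                  X[:, selected + [f]], y, cv=cv).mean()
            scored.append((acc, f))
        acc, f = max(scored)
        if acc <= best_score:        # no remaining feature helps: stop early
            break
        best_score = acc
        selected.append(f)
        remaining.remove(f)
    return selected, best_score

# usage with made-up features extracted from pressure frames
X = np.random.rand(200, 50)          # rows: samples, cols: candidate features
y = np.random.randint(0, 4, 200)     # activity labels
print(greedy_forward_selection(X, y, max_features=5, cv=5))
```

Keeping only a handful of features is also what makes the per-frame processing cheap enough for real-time feedback.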

7.
Labelling the lines of a planar line drawing of a 3-D object in a way that reflects the geometric properties of the object is a much studied problem in computer vision, considered to be an important step towards understanding the object from its 2-D drawing. Combinatorially, the labellability problem is a Constraint Satisfaction Problem and has been shown to be NP-complete even for images of polyhedral scenes. In this paper, we examine scenes that consist of a set of objects each obtained by rotating a polygon around an arbitrary axis. The objects are allowed to arbitrarily intersect or overlay. We show that for these scenes, there is a sequential linear-time labelling algorithm. Moreover, we show that the algorithm has a fast parallel version that executes in O(log^3 n) time on an Exclusive-Read Exclusive-Write Parallel Random Access Machine with O(n^3 / log^3 n) processors. The algorithm not only answers the decision problem of labellability, but also produces a legal labelling, if there is one. This parallel algorithm should be contrasted with the techniques of dealing with special cases of the constraint satisfaction problem. These techniques employ an effective, but inherently sequential, relaxation procedure in order to restrict the domains of the variables. This research was partially supported by the European Community ESPRIT Basic Research Program under contracts 7141 (project ALCOM II) and 6019 (project Insight II).

8.
Data analysis techniques have been traditionally conceived to cope with data described in terms of numeric vectors. The reason behind this fact is that numeric vectors have a well-defined and clear geometric interpretation, which facilitates the analysis from the mathematical viewpoint. However, state-of-the-art research on current topics of fundamental importance, such as smart grids, networks of dynamical systems, biochemical and biophysical systems, intelligent trading systems, multimedia content-based retrieval systems, and social network analysis, deals with structured and non-conventional information characterizing the data, providing richer and hence more complex patterns to be analyzed. As a consequence, representing patterns by complex (relational) structures and defining suitable, usually non-metric, dissimilarity measures is becoming a consolidated practice in related fields. However, as the data sources become more complex, the capability of judging data quality (or reliability) and the related interpretability issues can be seriously compromised. For this purpose, automated methods able to synthesize relevant information, and at the same time rigorously describe the uncertainty in the available datasets, are very important: information granulation is the key aspect in the analysis of complex data. In this paper, we discuss our general viewpoint on the adoption of information granulation techniques in the general context of soft computing and pattern recognition, conceived as a fundamental approach towards the challenging problem of automatic modeling of complex systems. We focus on the specific setting of processing so-called non-geometric data, which diverges significantly from what has been done so far in the related literature. We highlight the motivations and the founding concepts, and finally we provide a high-level conceptualization of the proposed data analysis framework.

9.
The past decade has seen an increase in the capability of the computer to do cognitive tasks such as understanding natural language or interpreting pictures. The programs doing such processing have much in common with theorem-proving programs, operating system optimizing algorithms, and methods of problem solving in formal grammars. The present tutorial describes an approach to interpreting a ‘3-view’ drawing for the construction of its ‘3-D’ representation.

10.
A single-channel queuing system with a Poisson incoming flow of objects is considered. Each object consists of several spaced requests. A simple ergodicity condition is established. Translated from Kibernetika i Sistemnyi Analiz, No. 5, pp. 8–12, September–October 2007.

11.
A new scheme for the optimization of codebook sizes for Hidden Markov Models (HMMs) and the generation of HMM ensembles is proposed in this paper. In a discrete HMM, the vector quantization procedure and the generated codebook are associated with performance degradation. By using a selected clustering validity index, we show that the optimization of HMM codebook size can be selected without training HMM classifiers. Moreover, the proposed scheme yields multiple optimized HMM classifiers, and each individual HMM is based on a different codebook size. By using these to construct an ensemble of HMM classifiers, this scheme can compensate for the degradation of a discrete HMM.
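A minimal sketch of choosing a discrete-HMM codebook size from a clustering validity index alone, before any HMM training, is given below; the abstract does not state which validity index or quantizer is used, so the Davies–Bouldin index and k-means here are assumptions made for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import davies_bouldin_score

def rank_codebook_sizes(features, candidate_sizes):
    """Score each candidate codebook size with a clustering validity index
    (Davies-Bouldin, lower is better) -- no HMM is trained at this stage."""
    scores = []
    for k in candidate_sizes:
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(features)
        scores.append((davies_bouldin_score(features, labels), k))
    return sorted(scores)            # best (lowest index) first

# usage with made-up observation vectors; each of the top-ranked sizes could
# then seed one discrete HMM of the ensemble
obs = np.random.rand(500, 12)
print(rank_codebook_sizes(obs, candidate_sizes=[16, 32, 64, 128]))
```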

12.
13.
This paper describes a qualitative technique for interpreting graphical data. Given a set of numerical observations regarding the behaviour of a system, its attributes can be determined by plotting the data and qualitatively comparing the shape of the resulting graph with graphs of system behaviour models. Qualitative data modeling incorporates techniques from pattern recognition and qualitative reasoning to characterize observed data, generate hypothetical interpretations, and select models that best fit the shape of the data. Domain-specific knowledge may be used to substantiate or refute the likelihood of hypothesized interpretations. The basic data modeling technique is domain independent and is applicable to a wide range of problems. It is illustrated here in the context of a knowledge-based system for well test interpretation.
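A minimal sketch of the generic shape-matching idea is shown below: each candidate behaviour model is fitted to the observations and the hypotheses are ranked by goodness of fit; the candidate shapes, the least-squares criterion, and the synthetic decay data are illustrative assumptions rather than the paper's qualitative technique.

```python
import numpy as np
from scipy.optimize import curve_fit

# small library of candidate behaviour shapes (illustrative only)
MODELS = {
    "linear":            lambda t, a, b:    a * t + b,
    "exponential decay": lambda t, a, k, c: a * np.exp(-k * t) + c,
    "logarithmic":       lambda t, a, b:    a * np.log(t + 1.0) + b,
}

def rank_hypotheses(t, y):
    """Fit each candidate shape and rank hypotheses by residual sum of squares."""
    ranking = []
    for name, f in MODELS.items():
        try:
            params, _ = curve_fit(f, t, y, maxfev=5000)
            rss = float(np.sum((y - f(t, *params)) ** 2))
            ranking.append((rss, name))
        except RuntimeError:         # fit did not converge: drop this hypothesis
            continue
    return sorted(ranking)

# usage with synthetic decline-curve-like observations
t = np.linspace(0.0, 10.0, 50)
y = 3.0 * np.exp(-0.5 * t) + 1.0 + np.random.normal(0.0, 0.05, t.size)
print(rank_hypotheses(t, y)[0])      # best-matching shape first
```

Domain-specific knowledge (e.g. which shapes are physically plausible for a given test) would then be used to confirm or discard the top-ranked hypotheses, as the abstract describes.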

14.
Researchers report mixed findings on the successful application of information technologies (IT) for knowledge management (KM). The primary difficulty is argued to be the use of information management techniques and concepts to design and develop KM Tools. Also problematic is the existence of a multiplicity of KM technologies, the application and use of which differs across organizations. This paper argues that these problems stem, in part, from the information system field's over-reliance on design concepts from the functionalist paradigm. Hence, our contention that alternative perspectives, which bring into focus issues of ontology and epistemology, need to be brought to bear in order to understand the challenges involved in the design and deployment of IT artefacts in knowledge management systems (KMS). The philosophy of technology, with its emphasis on the primacy of praxis, and which incorporates ontological and epistemological concepts from phenomenology and hermeneutics, is applied to the findings of a participative action research study to illustrate how social actors interpret and understand worldly phenomena and subsequently share their knowledge of the life-world using IT. The outcome of this marriage of situated practical theory and philosophy is a set of design principles to guide the development of a core KM Tool for KMS.

15.
Data I/O has become a major bottleneck in the computational performance of geospatial analysis and modeling. In this study, a parallel GeoTIFF I/O library (pGTIOL) was developed. Through storage mapping and data arrangement techniques, pGTIOL can operate on files in either strip or tile storage mode and read/write any sub-domain of data within a raster dataset. pGTIOL enables asynchronized I/O, which means a process can read/write its own sub-domains of data when necessary without synchronizing with other processes. pGTIOL was integrated into the parallel raster processing library (pRPL). Several pGTIOL-based data I/O functions and options were added to pRPL, while the existing functions of pRPL stay intact. Experiments showed that the integration of pRPL and pGTIOL achieved higher performance than the original pRPL, which uses GDAL as the I/O interface. Therefore, pRPL + pGTIOL enables transparent parallelism for high-performance raster processing with the capability of true parallel I/O of massive raster datasets.
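For illustration of the per-process sub-domain access pattern described above, here is a minimal sketch using GDAL's Python bindings (the I/O interface the original pRPL relied on) together with mpi4py; pGTIOL itself is a C++ library, so this is only conceptual, and the file name and the row-wise decomposition are made up.

```python
from mpi4py import MPI
from osgeo import gdal

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# each process opens the raster independently and reads only its own rows,
# so the read itself needs no synchronization with other processes
ds = gdal.Open("massive_raster.tif")             # hypothetical GeoTIFF
band = ds.GetRasterBand(1)
xsize, ysize = ds.RasterXSize, ds.RasterYSize

# simple row-wise domain decomposition: rank r gets a contiguous block of rows
rows_per_rank = ysize // size
yoff = rank * rows_per_rank
ywin = ysize - yoff if rank == size - 1 else rows_per_rank

block = band.ReadAsArray(0, yoff, xsize, ywin)   # this rank's sub-domain
print(f"rank {rank}: block shape {block.shape}")
```

Run with something like `mpiexec -n 4 python read_blocks.py`; a library such as pGTIOL additionally handles strip/tile storage modes and true parallel writes, which this sketch does not attempt.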
