Similar Documents (20 results)
1.
The creation of links between schemas of published datasets is a key part of the Linked Open Data (LOD) paradigm. The ability to discover these links "on the go" requires that ontology matching techniques achieve good precision and recall within acceptable execution times. In this paper, we add similarity-based and mediator-based ontology matching methods to the AgreementMaker ontology matching system, which aim to efficiently discover high-precision subclass mappings between LOD ontologies. Similarity-based matching methods discover subclass mappings by extrapolating them from a set of high-quality equivalence mappings and from the interpretation of compound concept names. Mediator-based matching methods discover subclass mappings by comparing polysemic lexical annotations of ontology concepts and by considering external web ontologies. Experiments show that, compared with a leading LOD approach, AgreementMaker achieves considerably higher precision and F-measure, at the cost of a slight decrease in recall.
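The compound-name extrapolation idea can be sketched as follows. This is a minimal illustration, not AgreementMaker's actual implementation; the CamelCase splitting rule and the `equivalences` dictionary (head concept → target concept from known equivalence mappings) are hypothetical:

```python
import re

def head_noun(concept):
    # Split a CamelCase compound name and return its head (last) token,
    # e.g. "SportsTeam" -> "Team".
    tokens = re.findall(r'[A-Z][a-z]*', concept)
    return tokens[-1] if tokens else concept

def infer_subclass_mappings(source_concepts, equivalences):
    # If the head noun of a compound source concept participates in a known
    # equivalence mapping, propose source as a subclass of the mapped target.
    mappings = []
    for concept in source_concepts:
        head = head_noun(concept)
        if head != concept and head in equivalences:
            mappings.append((concept, equivalences[head]))
    return mappings
```

For instance, given a known equivalence between Team and Group, the compound concept SportsTeam would be proposed as a subclass of Group.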

2.
All state-of-the-art approaches based on evolutionary algorithms (EA) for addressing the meta-matching problem in ontology alignment require a domain expert to provide a reference alignment (RA) between the two ontologies in advance. Since an RA is very expensive to obtain, especially when the ontologies are large, in this paper we propose to use a Partial Reference Alignment (PRA), built by a clustering-based approach, in place of the RA in the evolutionary process. A problem-specific Memetic Algorithm (MA) is then proposed to address the meta-matching problem by optimizing the aggregation of three basic similarity measures (a syntactic measure, a linguistic measure and a taxonomy-based measure) into a single similarity metric. The experimental results show that using a PRA constructed by our approach leads, in most cases, to higher-quality solutions than using a PRA built by randomly selecting classes from the ontology, and that the solution quality is very close to that of the approach using the full RA, where the precision of the solution is generally high. Compared to state-of-the-art ontology matching systems, our approach obtains more accurate results. Moreover, our approach outperforms the GOAL approach, which is based on a Genetic Algorithm (GA) and an RA, with an average improvement of up to 50.61%. The proposed approach is therefore both effective and efficient.
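The core of such meta-matching is combining basic similarity values into one metric and scoring the resulting alignment against a (partial) reference. A minimal sketch under simplifying assumptions; the function and measure names are illustrative, not the paper's code:

```python
def aggregate_similarity(weights, measures):
    # Convex combination of basic similarity values in [0, 1]; the weights
    # (assumed to sum to 1) are what the memetic algorithm would optimize.
    return sum(weights[name] * value for name, value in measures.items())

def f_measure(found, reference):
    # Precision/recall/F-measure of an alignment against a (partial)
    # reference alignment, both given as sets of (source, target) pairs.
    true_positives = len(found & reference)
    if true_positives == 0:
        return 0.0
    precision = true_positives / len(found)
    recall = true_positives / len(reference)
    return 2 * precision * recall / (precision + recall)
```

With a PRA, only correspondences whose entities are covered by the partial reference would be judged; the F-measure above is the crisp special case.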

3.
The ArchiMate modelling language provides a coherent and holistic view of an enterprise in terms of its products, services, business processes, actors, business units, software applications and more. Yet ArchiMate currently lacks (1) expressivity in modelling an enterprise from a value-exchange perspective, and (2) rigour and guidelines in modelling the business processes that realize the transactions relevant from a value perspective. To address these issues, we show how to connect e\(^{3}\)value, a technique for value modelling, to ArchiMate via transaction patterns from the DEMO methodology. Using ontology alignment techniques, we show a transformation between the meta models underlying e\(^{3}\)value, DEMO and ArchiMate. Furthermore, we present a step-wise approach that shows how this model transformation is achieved. We exemplify the transformation of DEMO and e\(^{3}\)value into ArchiMate by means of a case study in the insurance industry. As a proof of concept, we present a software tool supporting our transformation approach. Finally, we discuss the functionalities and limitations of our approach and analyze its practical applicability.

4.
With the proliferation of sensors, semantic web technologies are becoming closely related to sensor networks. The linking of elements from semantic web technologies with sensor networks is called the semantic sensor web, whose main feature is the use of sensor ontologies. However, owing to the subjectivity of different sensor ontology designers, different sensor ontologies may define the same entities with different names or in different ways, raising the so-called sensor ontology heterogeneity problem. There are many application scenarios where solving the problem of semantic heterogeneity can have a large impact, and it is urgent to provide techniques that enable the processing, interpretation and sharing of data from the sensor web, whose information is organized into different ontological schemes. Although the sensor ontology heterogeneity problem can be effectively solved by Evolutionary Algorithm (EA)-based ontology meta-matching technologies, the drawbacks of traditional EAs, such as premature convergence and long runtimes, seriously hamper their use in practical dynamic applications. To solve this problem, we propose a novel Compact Co-Evolutionary Algorithm (CCEA) to improve alignment quality and reduce runtime. In particular, CCEA works with one better probability vector (PV), \(PV_{better}\), and one worse PV, \(PV_{worse}\), where \(PV_{better}\) mainly focuses on exploitation, which increases the speed of convergence, and \(PV_{worse}\) pays more attention to exploration, which aims at preventing premature convergence. In the experiments, we use Ontology Alignment Evaluation Initiative (OAEI) test cases and two pairs of real sensor ontologies to test the performance of our approach. The experimental results show that the CCEA-based ontology matching approach is both effective and efficient when matching ontologies of various scales and under different heterogeneity situations, and that, compared with state-of-the-art sensor ontology matching systems, it significantly improves alignment quality.
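A compact EA represents its population implicitly as a probability vector (PV) over gene values and shifts the PV toward competition winners. The following is a hedged sketch of one such sampling-and-update step; it shows the classic compact-GA rule only, not CCEA's two-PV co-evolution, and the names are illustrative:

```python
import random

def sample_individual(pv, rng):
    # Draw one bit string from the probability vector.
    return [1 if rng.random() < p else 0 for p in pv]

def pv_update(pv, winner, loser, step):
    # Compact-GA rule: wherever winner and loser disagree, move that
    # gene's probability toward the winner by `step`, clipped to [0, 1].
    return [min(1.0, max(0.0, p + step * (w - l)))
            for p, w, l in zip(pv, winner, loser)]
```

In a two-PV scheme along the lines of the abstract, the exploiting PV would use a larger update step (faster convergence) and the exploring PV a smaller or inverted one (diversity preservation).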

5.
This paper presents a novel hybrid GA-DEA algorithm to solve the multi-objective \(k\)-out-of-\(n\) problem and determine a preferred policy. The proposed algorithm simultaneously maximizes overall system reliability and availability while minimizing system cost and queue length. To meet these objectives, an adaptive hybrid GA-DEA algorithm is developed to identify optimal solutions and improve computational efficiency. A genetic algorithm (GA) is used to simulate a series production line and find the Pareto-optimal solutions, which are different values of \(k\) and \(n\) of the \(k\)-out-of-\(n\) problem. Data envelopment analysis (DEA) is then used to find the best \(k\) and \(n\) among the GA's Pareto solutions. An illustrative example is presented to show the flexibility and effectiveness of the proposed algorithm. The algorithm would help managers identify the preferred policy by considering and investigating various parameters and scenarios in reasonable time. Considering different objectives also yields Pareto-optimal solutions that help decision makers select the preferred solution based on their situation and preferences.
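For reference, when components fail independently with identical reliability \(p\), the reliability of a \(k\)-out-of-\(n\):G system (working iff at least \(k\) of the \(n\) components work) is the binomial tail probability. A small sketch of this standard formula, not the paper's GA-DEA code:

```python
from math import comb

def k_out_of_n_reliability(k, n, p):
    # P(at least k of n i.i.d. components with reliability p are working):
    # the upper tail of a Binomial(n, p) distribution.
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))
```

For example, a 2-out-of-3 system with component reliability 0.9 gives 3·0.9²·0.1 + 0.9³ = 0.972.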

6.
With the rapid development of networking technology, grid computing has emerged as a way to satisfy the increasing demand for computing power from the scientific computing community. Typically, user applications in scientific and enterprise domains are constructed as workflows in which precedence constraints between tasks are defined. Scheduling of workflow applications belongs to the class of NP-hard problems, so meta-heuristic approaches are the preferred option. In this paper, an $\varepsilon$-fuzzy dominance sort based discrete particle swarm optimization ($\varepsilon$-FDPSO) approach is used to solve the workflow scheduling problem in the grid. The $\varepsilon$-FDPSO approach has not previously been applied to grid scheduling. The fuzzy dominance metric, which quantifies the relative fitness of solutions in a multi-objective domain, is used to generate the Pareto-optimal solutions. In addition, the scheme incorporates a fuzzy-based mechanism to determine the best compromise solution. Two scheduling problems are solved for workflow applications. In one, we address two major conflicting objectives, makespan (execution time) and cost, under deadline and budget constraints; in the other, we optimize the makespan, cost and reliability objectives simultaneously in order to incorporate the dynamic characteristics of grid resources. The performance of the approach is compared with other established meta-heuristics, namely the non-dominated sorting genetic algorithm and multi-objective particle swarm optimization. The simulation analysis shows that the solutions obtained with $\varepsilon$-FDPSO deliver better convergence and more uniform spacing among the solutions while keeping the computational overhead limited.
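Underlying such multi-objective schedulers is the Pareto dominance relation; fuzzy dominance is a graded relaxation of the crisp version below. A minimal sketch of crisp dominance and non-dominated front extraction (minimization assumed):

```python
def dominates(a, b):
    # a dominates b (minimization): no worse in every objective
    # and strictly better in at least one.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    # Keep only the non-dominated objective vectors.
    return [p for p in points if not any(dominates(q, p) for q in points)]
```

For a bi-objective schedule (makespan, cost), the front returned here is the trade-off set from which a compromise solution is then picked.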

7.
The research presented in this article focuses on the development of a multi-objective optimization algorithm based on the differential evolution (DE) concept combined with Mamdani-type fuzzy logic controllers (FLCs) and $K$-medoids clustering. The FLCs are used for adaptive control of the DE parameters; $K$-medoids clustering enables the algorithm to perform a more guided search by evolving neighboring vectors, i.e., vectors that belong to the same cluster. A modified version of the $DE/best/1/bin$ algorithm is adopted as the core search component of the multi-objective optimizer. The FLCs use Pareto dominance and cluster-related information as input in order to adapt the algorithmic parameters dynamically. The proposed optimization algorithm is tested on a number of problems from the multi-objective optimization literature in order to investigate the effect of clustering and parameter adaptation on algorithmic performance under various conditions, e.g., problems of high dimensionality, problems with non-convex Pareto fronts, and problems with discontinuous Pareto fronts. A detailed performance comparison between the proposed algorithm and state-of-the-art multi-objective optimizers is also presented.
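For reference, DE/best/1/bin builds a mutant from the current best vector plus a scaled random difference, then applies binomial crossover. A plain, non-adaptive sketch of one generation of trial vectors, without the FLC parameter control or clustering described above:

```python
import random

def de_best_1_bin(population, fitness, F=0.5, CR=0.9, rng=None):
    # One trial vector per target: mutant = best + F * (r1 - r2),
    # then binomial crossover at rate CR (jrand forces one mutant gene).
    rng = rng or random.Random(0)
    best = min(population, key=fitness)
    trials = []
    for target in population:
        r1, r2 = rng.sample(population, 2)
        mutant = [b + F * (x - y) for b, x, y in zip(best, r1, r2)]
        jrand = rng.randrange(len(target))
        trial = [m if (rng.random() < CR or j == jrand) else t
                 for j, (m, t) in enumerate(zip(mutant, target))]
        trials.append(trial)
    return trials
```

In the adaptive variant, `F` and `CR` would be set per generation by the fuzzy controllers, and `r1`, `r2` would be drawn from the target's cluster rather than the whole population.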

8.
The Meteor metric for automatic evaluation of machine translation
The Meteor automatic metric for machine translation evaluation, originally developed and released in 2004, was designed with the explicit goal of producing sentence-level scores that correlate well with human judgments of translation quality. Several key design decisions were incorporated into Meteor in support of this goal. In contrast with IBM's Bleu, which uses only precision-based features, Meteor uses and emphasizes recall in addition to precision, a property that several metric evaluations have confirmed to be critical for high correlation with human judgments. Meteor also addresses the problem of reference translation variability by utilizing flexible word matching, allowing morphological variants and synonyms to be taken into account as legitimate correspondences. Furthermore, the feature ingredients within Meteor are parameterized, allowing the metric's free parameters to be tuned in search of values that yield optimal correlation with human judgments. Optimal parameters can be tuned separately for different types of human judgments and for different languages. We discuss the initial design of the Meteor metric, subsequent improvements, and performance in several independent evaluations in recent years.
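Meteor's emphasis on recall is realized through a parameterized harmonic mean of unigram precision and recall. The sketch below follows the published F-mean form, with a tunable α weighting precision (α close to 1 down-weights precision, i.e., favors recall); treat the default value as illustrative rather than Meteor's tuned setting:

```python
def meteor_fmean(matches, hypothesis_len, reference_len, alpha=0.9):
    # Unigram precision and recall from the count of matched words,
    # combined as P*R / (alpha*P + (1 - alpha)*R); alpha=0.9 strongly
    # favors recall, in the spirit of Meteor's original 10PR/(R + 9P).
    if matches == 0:
        return 0.0
    precision = matches / hypothesis_len
    recall = matches / reference_len
    return precision * recall / (alpha * precision + (1 - alpha) * recall)
```

The full metric additionally applies a fragmentation penalty for word-order differences, which this sketch omits.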

9.
In this paper, we consider multi-objective evolutionary algorithms for the Vertex Cover problem in the context of parameterized complexity. We consider two different measures for the problem. The first is a very natural multi-objective measure for use with evolutionary algorithms and takes into account the number of chosen vertices and the number of edges that remain uncovered. The second fitness function is based on a linear programming formulation and proves to give better results. We point out that both approaches lead to a kernelization of the Vertex Cover problem. Based on this, we show that evolutionary algorithms solve the Vertex Cover problem efficiently if the size of a minimum vertex cover is not too large; that is, the expected runtime is bounded by \(O(f(\mathrm{OPT}) \cdot n^{c})\), where \(c\) is a constant and \(f\) a function that depends only on \(\mathrm{OPT}\). This shows that evolutionary algorithms are randomized fixed-parameter tractable algorithms for the Vertex Cover problem.
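The first (natural) measure can be written down directly: a candidate is a bit string over the vertices, and its fitness is the pair (cover size, uncovered edges), both to be minimized. A minimal sketch:

```python
def vertex_cover_fitness(selected, edges):
    # selected: 0/1 list indexed by vertex; edges: list of (u, v) pairs.
    # Returns (number of chosen vertices, number of edges left uncovered);
    # a valid cover has uncovered == 0.
    chosen = sum(selected)
    uncovered = sum(1 for u, v in edges if not selected[u] and not selected[v])
    return chosen, uncovered
```

On the path 0-1-2, choosing only vertex 1 covers both edges, so its fitness is (1, 0).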

10.
Forms are our gates to the Web. They enable us to access the deep content of Web sites. Automatic form understanding provides applications, ranging from crawlers and meta-search engines to service integrators, with a key to this content. Yet it has received little attention other than as a component in specific applications such as crawlers or meta-search engines. No comprehensive approach to form understanding exists, let alone one that produces rich models for semantic services or integration with linked open data. In this paper, we present OPAL, the first comprehensive approach to form understanding and integration. We identify form labeling and form interpretation as the two main tasks involved in form understanding. On both problems, OPAL advances the state of the art: for form labeling, it combines features from the text, structure, and visual rendering of a Web page. In extensive experiments on the ICQ and TEL-8 benchmarks and a set of 200 modern Web forms, OPAL outperforms previous approaches for form labeling by a significant margin. For form interpretation, OPAL uses a schema (or ontology) of forms in a given domain. Thanks to this domain schema, it is able to produce nearly perfect (>97% accuracy in the evaluation domains) form interpretations. Yet the effort to produce a domain schema is very low, as we provide a datalog-based template language that eases the specification of such schemata, together with a methodology for deriving a domain schema largely automatically from an existing domain ontology. We demonstrate the value of OPAL's form interpretations through a lightweight form integration system that successfully translates and distributes master queries to hundreds of forms with no error, yet is implemented with only a handful of translation rules.

11.
A non-Hermitian quantum optimization algorithm is created and used to find the ground state of an antiferromagnetic Ising chain. We demonstrate analytically and numerically (for up to $N=1,024$ spins) that our approach leads to a significant reduction in the annealing time, which becomes proportional to $\ln N$; this is much less than the time (proportional to $N^2$) required for quantum annealing based on the corresponding Hermitian algorithm. We propose to use this approach to achieve a similar speed-up for NP-complete problems by using classical computers in combination with quantum algorithms.

12.
In this paper, we introduce a new problem termed query reverse engineering (QRE). Given a database \(D\) and a result table \(T\)—the output of some known or unknown query \(Q\) on \(D\)—the goal of QRE is to reverse-engineer a query \(Q'\) such that the output of \(Q'\) on \(D\) (denoted by \(Q'(D)\)) is equal to \(T\) (i.e., to \(Q(D)\)). The QRE problem has useful applications in database usability, data analysis, and data security. In this work, we propose a data-driven approach, TALOS (Tree-based classifier with At Least One Semantics), that is based on a novel dynamic data classification formulation, and we extend the approach to efficiently support the three key dimensions of the QRE problem: whether the input query is known or unknown, support for different query fragments, and support for multiple database versions.
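The acceptance condition of QRE is simply result-set equality. A toy sketch with queries modeled as row predicates rather than SQL (the names are illustrative, not TALOS itself); rows are compared as multisets, since SQL tables may contain duplicates:

```python
from collections import Counter

def run_query(predicate, database):
    # Toy "query" evaluation: keep the rows (tuples) satisfying the predicate.
    return [row for row in database if predicate(row)]

def is_valid_rewrite(candidate, database, target_table):
    # Q' is a valid reverse-engineered query iff Q'(D) equals T as a multiset.
    return Counter(run_query(candidate, database)) == Counter(target_table)
```

The hard part of QRE, of course, is searching the space of candidate queries; this sketch only captures the success test.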

13.
The Rigorous Examination of Reactive Systems (RERS) Challenges provide a forum for experimental evaluation based on specifically synthesized benchmark suites. In this paper, we report on our 'brute-force attack' on the RERS 2012 and 2013 Challenges. We connected the RERS problems to two state-of-the-art explicit-state model checkers: LTSmin and Spin. Apart from an effective compression of the state vector, we did not analyze the source code of the problems. Our brute-force approach was successful: it won both editions of the RERS Challenge.

14.
A procedure is developed for the design of reinforced concrete footings subjected to vertical, concentric column loads that satisfies both structural requirements and geotechnical limit states, using a hybrid Big Bang-Big Crunch (BB-BC) algorithm. The objectives of the optimization are to minimize cost, CO$_2$ emissions, and the weighted aggregate of cost and CO$_2$. Cost is based on the materials and labor required for the construction of reinforced concrete footings, while CO$_2$ emissions are associated with the extraction and transportation of raw materials; the processing, manufacturing, and fabrication of products; and the emissions of equipment involved in the construction process. The cost and CO$_2$ objective functions are based on weighted values and are subject to the bending-moment, shear-force, and reinforcing-detail requirements specified by the American Concrete Institute (ACI 318-11), as well as soil bearing and displacement limits. Two sets of design examples are presented: low-cost and low-CO$_2$-emission designs based solely on geotechnical considerations, and designs that also satisfy the ACI 318-11 code for structural concrete. A multi-objective optimization is applied to cost and CO$_2$ emissions. Results are presented that demonstrate the effects of applied load, soil properties, allowable settlement, and concrete strength on the designs.

15.
This paper addresses the problem of estimating temporal synchronization in mobile sensor networks by using image sequence analysis of the corresponding scene dynamics. Existing methods are frequently based on adaptations of techniques originally designed for wired networks with static topologies, or on solutions specially designed for ad hoc wireless sensor networks that suffer from high energy consumption and low scalability in the number of sensors. In contrast, this work proposes a novel approach that reduces the problem of synchronizing a general number $N$ of sensors to the robust estimation of a single line in ${\mathbb {R}}^{N+1}$. This line captures all temporal relations between the sensors and can be computed without any prior knowledge of these relations. It is assumed that (1) the network's mobile sensors cross the field of view of a stationary calibrated camera that operates at a constant frame rate, and (2) the sensor trajectories are estimated with limited error at a constant sampling rate, both in the world coordinate system and in the camera's image plane. Experimental results with real-world and synthetic scenarios demonstrate that our method can successfully determine the temporal alignment in mobile sensor networks.

16.
17.
Evolutionary algorithms are widely used to solve multi-objective optimization problems effectively by performing a global search over the solution space to find better solutions. Hybrid evolutionary algorithms have been introduced to enhance the quality of the solutions obtained. One such hybrid algorithm is the memetic algorithm with preferential local search using adaptive weights (MAPLS-AW) (Bhuvana and Aravindan in Soft Comput, doi: 10.1007/s00500-015-1593-9, 2015). MAPLS-AW, a variant of the NSGA-II algorithm, recognizes the elite solutions of the population and gives them preference for local search during the evolution. This paper proposes a termination scheme derived from the features of MAPLS-AW. The objective of the proposed scheme is to detect convergence of the population without compromising the quality of the solutions generated by MAPLS-AW. The proposed termination scheme consists of five stopping measures, two of which are newly proposed in this paper to predict the convergence of the population. An experimental study has been carried out to analyze the performance of the proposed termination scheme and to compare it with existing termination schemes. Several constrained and unconstrained multi-objective benchmark test problems are used for this comparison. Additionally, a real-time application, economic emission and load dispatch, has also been used to check the performance of the proposed scheme. The results show that the proposed scheme identifies convergence of the population much earlier than existing stopping schemes, without compromising the quality of the solutions.

18.
An increasing number of methods for background subtraction use Robust PCA to identify sparse foreground objects. While many algorithms use the \(\ell_1\)-norm as a convex relaxation of the ideal sparsifying function, we approach the problem with a smoothed \(\ell_p\) quasi-norm and present pROST, a method for robust online subspace tracking. The algorithm is based on alternating minimization on manifolds. Implemented on a graphics processing unit, it achieves real-time performance at a resolution of \(160 \times 120\). Experimental results on a state-of-the-art benchmark for background subtraction on real-world video data indicate that the method succeeds in a broad variety of background subtraction scenarios and outperforms competing approaches when video quality is deteriorated by camera jitter.
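A common way to smooth the \(\ell_p\) quasi-norm (0 < p < 1) is to add a small constant inside the power, giving a differentiable sparsity surrogate; the exact smoothing used by pROST may differ, so treat this as an illustrative form:

```python
def smoothed_lp(x, p=0.5, mu=1e-2):
    # sum_i (x_i^2 + mu)^(p/2): approaches the ell_p quasi-norm as mu -> 0,
    # but is smooth at zero, which enables gradient-based subspace tracking.
    return sum((xi * xi + mu) ** (p / 2) for xi in x)
```

Unlike the \(\ell_1\)-norm, this penalty prefers one large coefficient over several small ones of equal energy, which is what makes it more aggressively sparsifying.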

19.
20.
In this paper we offer an efficient controller synthesis algorithm for assume-guarantee specifications of the form $\varphi_1 \wedge \varphi_2 \wedge \cdots \wedge \varphi_n \rightarrow \psi_1 \wedge \psi_2 \wedge \cdots \wedge \psi_m$. Here, $\{\varphi_i, \psi_j\}$ are all safety-MTL$_{0,\infty}$ properties, where the sub-formulas $\{\varphi_i\}$ specify assumptions on the environment and the sub-formulas $\{\psi_j\}$ specify requirements to be guaranteed by the controller. Our synthesis method exploits the engine of Uppaal-Tiga and a novel translation of safety- and co-safety-MTL$_{0,\infty}$ properties into under-approximating, deterministic timed automata. Our approach avoids determinization of Büchi automata, which is the main obstacle to the practical applicability of controller synthesis for linear-time specifications. The experiments demonstrate that the chosen specification formalism is expressive enough to specify complex behaviors. The proposed approach is sound but not complete; nevertheless, it successfully produced solutions for all the experiments. Additionally, we compared our tool with Acacia+ and Unbeast, two state-of-the-art LTL synthesis tools, and our tool demonstrated better timing results when both tools were applied to analogous specifications.
