Similar Documents
20 similar documents found.
3.
We study the security of authenticated encryption based on a stream cipher and a universal hash function. We consider ChaCha20-Poly1305 and the generic constructions proposed by Sarkar, which comprise 14 AEAD (authenticated encryption with associated data) schemes and 3 DAEAD (deterministic AEAD) schemes. In this paper, we analyze the integrity of these schemes both in the standard INT-CTXT (integrity of ciphertext) notion and in the RUP (releasing unverified plaintext) setting, captured by the INT-RUP notion. We present INT-CTXT attacks against 3 of the 14 AEAD schemes and 1 of the 3 DAEAD schemes. We then show INT-RUP attacks against 1 of the 14 AEAD schemes and the 2 remaining DAEAD schemes. Next, we consider ChaCha20-Poly1305 and show that it is provably secure in the INT-RUP notion. Finally, we show that the remaining 10 AEAD schemes are provably secure in the INT-RUP notion.
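
For context, here is a minimal sketch of ChaCha20-Poly1305 used as an AEAD, via the Python cryptography package; this is illustrative usage only, not code from the paper:

```python
# Minimal ChaCha20-Poly1305 AEAD sketch using the "cryptography" package.
# Illustrative only; not code from the paper under discussion.
import os
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305

key = ChaCha20Poly1305.generate_key()              # 256-bit key
aead = ChaCha20Poly1305(key)
nonce = os.urandom(12)                             # 96-bit nonce, unique per key
aad = b"header"                                    # authenticated, not encrypted
ct = aead.encrypt(nonce, b"secret message", aad)   # ciphertext || 16-byte tag
assert aead.decrypt(nonce, ct, aad) == b"secret message"
```

Tampering with the ciphertext or the associated data makes decrypt raise InvalidTag, which is exactly the integrity property that the INT-CTXT and INT-RUP notions formalize.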

4.
Ontology-based data-centric systems support open-world reasoning. For these systems, the Web Ontology Language (OWL) and Semantic Web Rule Language (SWRL) are therefore not suitable for expressing integrity constraints based on the closed-world assumption, so integrating the open-world assumption of OWL/SWRL with closed-world integrity constraint checking becomes unavoidable. SPARQL, recommended by the World Wide Web Consortium (W3C), is a query language for RDF graphs, and many studies have shown that it is a strong candidate for closed-world constraint checking in ontology-based data-centric applications. Accordingly, several approaches transform integrity constraints into SPARQL queries; some of them can express the knowledge base only partially when performing indirect transformations, while others are limited to a platform-specific implementation. To address these issues, this paper presents a flexible and formal methodology that employs Model-Driven Engineering (MDE) to model closed-world integrity constraints for open-world reasoning. The proposed approach offers semantic validation of data by expressing integrity constraints at both the model level and the code level. Moreover, straightforward transformations from OWL/SWRL to SPARQL can be performed. Finally, the methodology is demonstrated via a real-world case study of water observation data.
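
As a sketch of the kind of closed-world check SPARQL makes possible, consider the following rdflib query; the data file, namespace, and property names are invented for illustration:

```python
# Closed-world integrity check with SPARQL over RDF, using rdflib.
# The ontology terms and file name are hypothetical.
from rdflib import Graph

g = Graph()
g.parse("observations.ttl", format="turtle")   # hypothetical data file

# Constraint: every Observation must have a result value.
violations = g.query("""
    PREFIX ex: <http://example.org/water#>
    SELECT ?obs WHERE {
        ?obs a ex:Observation .
        FILTER NOT EXISTS { ?obs ex:hasResult ?value . }
    }
""")
for row in violations:
    print("Integrity violation: observation without a result:", row.obs)
```

Under OWL's open-world semantics a missing ex:hasResult triple proves nothing; the SPARQL query, by contrast, treats absence as a violation, which is the closed-world reading that integrity constraints require.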

5.
Semantics preserving SPARQL-to-SQL translation
Most existing RDF stores, which serve as metadata repositories on the Semantic Web, use an RDBMS as a backend to manage RDF data. This motivates us to study the problem of translating SPARQL queries into equivalent SQL queries, which can then be optimized and evaluated by the relational query engine, with their results returned as SPARQL query solutions. The main contributions of our research are: (i) we formalize a relational-algebra-based semantics of SPARQL, which bridges the gap between the SPARQL and SQL query languages, and prove that our semantics is equivalent to the mapping-based semantics of SPARQL; (ii) based on this semantics, we propose the first provably semantics-preserving SPARQL-to-SQL translation for SPARQL triple patterns, basic graph patterns, optional graph patterns, alternative graph patterns, and value constraints; (iii) our translation algorithm is generic and can be directly applied to existing RDBMS-based RDF stores; and (iv) we outline a number of simplifications for the SPARQL-to-SQL translation to generate simpler and more efficient SQL queries, and extend our semantics and translation to support the bag semantics of SPARQL query solutions. An experimental study showed that the proposed generic translation can serve as a good alternative to existing schema-dependent translations in terms of efficient query evaluation and/or ensured query-result correctness.
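
A toy sketch of the core idea behind such a translation, mapping a single SPARQL triple pattern onto a generic triples(subject, predicate, object) table; the schema and helper are invented, and joins between patterns, which the paper's algorithm handles, are ignored:

```python
# Toy SPARQL-triple-pattern-to-SQL translation over a "triples" table.
# Illustrative only; real translations handle joins, OPTIONAL, UNION, etc.
def triple_pattern_to_sql(subject, predicate, obj, alias="t0"):
    """Each position is a variable ('?x') or a constant; constants filter."""
    conditions = []
    for col, term in (("subject", subject), ("predicate", predicate), ("object", obj)):
        if not term.startswith("?"):
            conditions.append(f"{alias}.{col} = '{term}'")
    where = " AND ".join(conditions) or "1=1"
    return f"SELECT * FROM triples {alias} WHERE {where}"

# ?person rdf:type foaf:Person
print(triple_pattern_to_sql("?person", "rdf:type", "foaf:Person"))
```

Translating a basic graph pattern then amounts to joining one such SELECT per triple pattern on the columns bound to shared variables.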

6.
Design of an RDF/S-based database semantic query system for P2P networks
We propose a prototype system for semantic database querying in a P2P network environment based on the RDF/S data model; it can effectively run semantic queries over heterogeneous ODBC-based data sources. Data content is first described with RDF/S as RDF/S schema fragments, the fragments are then encoded, and the encodings are hashed into a DHT, after which the Chord protocol can be used to locate the target nodes.
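
A rough sketch of the hashing step the abstract describes, with a toy identifier space and no real Chord routing; the fragment encoding is simplified to a plain string:

```python
# Hash encoded RDF/S schema fragments onto a toy Chord ring.
# Illustrative only: real Chord maintains finger tables and routes lookups.
import hashlib

RING_BITS = 16                        # identifier space of 2**16 positions

def chord_id(fragment: str) -> int:
    """Hash an encoded schema fragment to a ring position."""
    digest = hashlib.sha1(fragment.encode("utf-8")).digest()
    return int.from_bytes(digest, "big") % (2 ** RING_BITS)

def successor(key: int, node_ids: list) -> int:
    """The node responsible for a key is its successor on the ring."""
    nodes = sorted(node_ids)
    return next((n for n in nodes if n >= key), nodes[0])  # wrap around

fragment = "ex:Sensor rdfs:subClassOf ex:Device"
key = chord_id(fragment)
print("fragment key:", key, "-> node:", successor(key, [512, 20000, 41000]))
```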

7.
P2P computing has gained increasing attention lately, since it provides the means for realizing computing systems that scale to very large numbers of participating peers while ensuring high autonomy and fault tolerance. Peer Data Management Systems (PDMS) have been proposed to support sophisticated facilities for exchanging, querying, and integrating (semi-)structured data hosted by peers. In this paper, we are interested in routing graph queries in a very large PDMS, where peers advertise their local bases using fragments of community RDF/S schemas (i.e., views). We introduce an original encoding for these fragments in order to efficiently check whether a peer view is subsumed by a query. We rely on this encoding to design an RDF/S view lookup service featuring a stateful and a stateless execution over a DHT-based P2P infrastructure. We finally evaluate our system experimentally to demonstrate its scalability for very large P2P networks and arbitrary RDF/S schema fragments, and to estimate the number of routing hops required by the two versions of our lookup service. Work done while T. Dalamagas was a postdoc researcher at NTUA.
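
The following toy sketch conveys the idea of checking view subsumption as containment, encoding each fragment as a bitmask over a fixed vocabulary; the paper's actual encoding is more refined:

```python
# Encode RDF/S schema fragments as bitmasks; subsumption = containment.
# Vocabulary and fragments are invented for illustration.
def encode(fragment_terms, vocabulary):
    """Encode a fragment as a bitmask over a fixed vocabulary."""
    mask = 0
    for term in fragment_terms:
        mask |= 1 << vocabulary[term]
    return mask

vocab = {"Device": 0, "Sensor": 1, "hasReading": 2, "locatedIn": 3}
view = encode({"Sensor", "hasReading"}, vocab)
query = encode({"Device", "Sensor", "hasReading"}, vocab)

# The view is subsumed by the query iff all its terms occur in the query.
print("subsumed:", view & query == view)
```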

9.
The notion of context appears in computer science, as well as in several other disciplines, in various forms. In this paper, we present a general framework for representing the notion of context in information modeling. First, we define a context as a set of objects, within which each object has a set of names and possibly a reference: the reference of the object is another context which “hides” detailed information about the object. Then, we introduce the possibility of structuring the contents of a context through the traditional abstraction mechanisms, i.e., classification, generalization, and attribution. We show that, depending on the application, our notion of context can be used as an independent abstraction mechanism, in either an alternative or a complementary capacity with respect to the traditional abstraction mechanisms. We also study the interactions between contextualization and the traditional abstraction mechanisms, as well as the constraints that govern such interactions. Finally, we present a theory for contextualized information bases. The theory includes a set of validity constraints, a model theory, and a set of sound and complete inference rules. We show that our core theory can be easily extended to support embedding of particular information models in our contextualization framework.
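
A toy rendering of this data model, assuming nothing beyond what the abstract states (objects carrying name sets and an optional reference to a hidden inner context):

```python
# Contexts as sets of named objects, where an object's reference is another
# context hiding its details. Class design is illustrative only.
from dataclasses import dataclass, field

@dataclass
class Obj:
    names: set
    reference: "Context | None" = None   # inner context with the details

@dataclass
class Context:
    objects: list = field(default_factory=list)

# "engine"/"motor" is visible here; its internals live in a nested context.
engine_details = Context([Obj({"piston"}), Obj({"crankshaft"})])
car_ctx = Context([Obj({"engine", "motor"}, reference=engine_details)])
print([o.names for o in car_ctx.objects])
```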

10.
This paper deals with database updates. More precisely, we focus on addition and deletion operations when transition constraints are expressed on the database. In the first section, we present an overview of work in the fields of belief revision, knowledge base and database updates. We claim that database update semantics is a formula-based (or syntactical) one. Furthermore, we pay attention to the notion of transition constraints, introduced in the database domain many years ago in order to constrain state changes. In the second section, we present the formalism we think necessary to express transition constraints and reason with them: a particular modal formalism that allows us to reason with both the current state of the database and its next state. In Sections 3 and 4, we characterize the database state that follows from an addition or a deletion, taking transition constraints into account. We emphasize a notion of minimal change that extends the classical notion of minimal change on finite bases. We show that, when no transition constraint is expressed, the semantics we give to addition (resp. deletion) is a maxichoice one. We also focus on another particular case of transition constraints that could allow us to computationally generate the next database state. Then we discuss the problem of extending these cases to the general one. © 1994 John Wiley & Sons, Inc.
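
As a sketch of maxichoice-style minimal change on a finite base; the consistency check is stubbed out, and transition constraints, which the paper adds, are omitted:

```python
# Maxichoice addition: keep a maximal subset of the old base consistent with
# the new fact. Toy consistency check; a real system would call a prover.
from itertools import combinations

def consistent(formulas):
    """Toy checker: 'p' and 'not p' together are inconsistent."""
    return not any(("not " + f) in formulas for f in formulas)

def add_with_minimal_change(base, new_fact):
    for size in range(len(base), -1, -1):          # prefer largest subsets
        for subset in combinations(sorted(base), size):
            candidate = frozenset(subset) | {new_fact}
            if consistent(candidate):
                return set(candidate)              # one maxichoice result
    return {new_fact}

print(add_with_minimal_change({"p", "q"}, "not p"))   # {'q', 'not p'}
```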

12.
Although the importance of models continuously grows in software development, common development approaches are poorly equipped to integrate automatic management of model integrity into the development process. Integrity constraints are critically important for ensuring the coherence of models during evolution, preventing manipulations that would violate the constraints defined on a model. This paper proposes an integrity framework in the context of model-driven architecture that achieves sufficient structural coverage at a higher level of program representation than machine code. Our framework propagates modifications from a platform-independent specification to the corresponding test template model while preserving consistency and integrity constraints after system evolution. To examine the efficiency of the proposed framework, a quantitative analysis plan is evaluated on two experimental case studies. In addition, we propose coverage criteria for integrity regression testing (IRT), derived from logic coverage criteria, that apply different conceptual levels of testing to the formulation of integrity requirements. The defined IRT criteria reduce the inherent complexity and cost of verifying complex design changes in regression testing while retaining fault-detection capability with respect to the changes. The framework supports IRT in a formal way, addressing a number of blind spots in model integrity management and some limiting factors of incremental maintenance and retesting. It satisfies several valuable quality attributes in software testing, such as safety percentage, precision, fault-detection performance, measurable coverage level, and generality.

13.
Over the past few years there has been considerable progress in methods to systematically analyse the complexity of constraint satisfaction problems with specified constraint types. One very powerful theoretical development in this area links the complexity of a set of constraints to a corresponding set of algebraic operations, known as polymorphisms. In this paper we extend the analysis of complexity to the more general framework of combinatorial optimisation problems expressed using various forms of soft constraints. We launch a systematic investigation of the complexity of these problems by extending the notion of a polymorphism to a more general algebraic operation, which we call a multimorphism. We show that many tractable sets of soft constraints, both established and novel, can be characterised by the presence of particular multimorphisms. We also show that a simple set of NP-hard constraints has very restricted multimorphisms. Finally, we use the notion of multimorphism to give a complete classification of complexity for the Boolean case, which extends several earlier classification results for particular special cases.
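
For orientation, a standard formulation (in our notation, not quoted from the paper): a list ⟨f_1, …, f_k⟩ of k-ary operations on a domain D is a multimorphism of a cost function φ : D^n → Q≥0 ∪ {∞} if applying the operations coordinate-wise to any k assignments does not increase the total cost:

```latex
% Multimorphism condition, with each f_i applied coordinate-wise:
\sum_{i=1}^{k} \varphi\bigl(f_i(x^1,\dots,x^k)\bigr)
  \;\le\; \sum_{i=1}^{k} \varphi(x^i)
  \qquad \text{for all } x^1,\dots,x^k \in D^n .
```

When φ takes only the values 0 and ∞ (crisp constraints), the condition says precisely that each f_i is a polymorphism of the underlying relation.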

14.
The use of Laplacian eigenfunctions is ubiquitous in a wide range of computer graphics and geometry processing applications. In particular, Laplacian eigenbases allow generalizing the classical Fourier analysis to manifolds. A key drawback of such bases is their inherently global nature, as the Laplacian eigenfunctions carry geometric and topological structure of the entire manifold. In this paper, we introduce a new framework for local spectral shape analysis. We show how to efficiently construct localized orthogonal bases by solving an optimization problem that in turn can be posed as the eigendecomposition of a new operator obtained by modifying the standard Laplacian. We study the theoretical and computational aspects of the proposed framework and showcase our new construction on the classical problems of shape approximation and correspondence. We obtain significant improvements compared to classical Laplacian eigenbases as well as other alternatives for constructing localized bases.
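
One way to make this concrete, in the spirit of (though not identical to) the paper's construction: add a potential that penalizes energy outside a region of interest to the graph Laplacian, and take the low eigenvectors of the modified operator:

```python
# Localized spectral basis via a modified Laplacian: L + mu * diag(1 - mask).
# A hedged sketch of the general idea, not the paper's exact operator.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

def localized_basis(L, region_mask, mu=100.0, k=10):
    """L: sparse graph Laplacian; region_mask: 1.0 inside region of interest."""
    penalty = mu * sp.diags(1.0 - region_mask)     # push energy into the region
    vals, vecs = eigsh(L + penalty, k=k, sigma=0, which="LM")  # smallest eigs
    return vecs                                    # columns: localized basis

# Toy path-graph Laplacian on 100 vertices; region = first 30 vertices.
A = sp.diags([np.ones(99), np.ones(99)], [-1, 1])
L = sp.diags(np.asarray(A.sum(axis=1)).ravel()) - A
mask = np.zeros(100)
mask[:30] = 1.0
print(localized_basis(L.tocsc(), mask).shape)      # (100, 10)
```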

16.
In this paper, we present a computational framework for automatic generation of provably correct control laws for planar robots in polygonal environments. Using polygon triangulation and discrete abstractions, we map continuous motion planning and control problems, specified in terms of triangles, to computationally inexpensive problems on finite-state transition systems. In this framework, discrete planning algorithms in complex environments can be seamlessly linked to automatic generation of feedback control laws for robots with underactuation constraints and control bounds. We focus on fully actuated kinematic robots with velocity bounds and (underactuated) unicycles with forward and turning speed bounds.
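
The discrete-abstraction step can be illustrated with a toy planner over the dual adjacency graph of a triangulation; the continuous feedback controllers that realize each triangle-to-triangle transition are beyond this sketch:

```python
# Plan a triangle-to-triangle path with BFS on the dual adjacency graph.
# Triangles are vertex-index triples; controllers are out of scope here.
from collections import deque

def triangle_graph(triangles):
    """Two triangles are adjacent iff they share an edge (two vertices)."""
    adj = {i: set() for i in range(len(triangles))}
    for i, t1 in enumerate(triangles):
        for j in range(i + 1, len(triangles)):
            if len(set(t1) & set(triangles[j])) == 2:
                adj[i].add(j)
                adj[j].add(i)
    return adj

def plan(adj, start, goal):
    parent, frontier = {start: None}, deque([start])
    while frontier:
        node = frontier.popleft()
        if node == goal:
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        for nxt in adj[node] - set(parent):
            parent[nxt] = node
            frontier.append(nxt)
    return None

tris = [(0, 1, 2), (1, 2, 3), (2, 3, 4), (3, 4, 5)]   # a strip of triangles
print(plan(triangle_graph(tris), 0, 3))               # [0, 1, 2, 3]
```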

17.
Software maintenance and evolution is a lengthy and expensive phase in the life cycle of a software system. In this paper we focus on the change propagation problem: given a primary change made to meet a new or changed requirement, what additional, secondary changes are needed? We propose a novel agent-oriented approach that works by repairing violations of desired consistency rules in a design model. Such consistency constraints are specified using the Object Constraint Language (OCL) and the Unified Modelling Language (UML) metamodel, which form the key inputs to our change propagation framework. The underlying change propagation mechanism of our framework is based on the well-known Belief-Desire-Intention (BDI) agent architecture. Our approach represents change options for repairing inconsistencies as event-triggered plans, as is done in BDI agent platforms. This naturally reflects the cascading nature of change propagation, where each change (primary or secondary) can require further changes to be made. We also propose a new method for generating repair plans from OCL consistency constraints. Furthermore, a given inconsistency will typically have a number of repair plans that could be used to restore consistency, and we propose a mechanism for semi-automatically selecting between alternative repair plans. This mechanism, based on a notion of cost, takes into account cascades (where fixing the violation of one constraint breaks another constraint) and synergies between constraints (where fixing the violation of one constraint also fixes another violated constraint). Finally, we report on an evaluation of the approach, covering effectiveness, efficiency and scalability.
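
A toy sketch of cost-based selection among alternative repair plans in the spirit of this mechanism; the cost model and the numbers are invented:

```python
# Pick the repair plan with the lowest effective cost, accounting for
# cascades (new violations introduced) and synergies (extra violations fixed).
from dataclasses import dataclass

@dataclass
class RepairPlan:
    name: str
    base_cost: float
    breaks: int     # further violations this plan introduces (cascade)
    fixes: int      # violated constraints this plan repairs (synergy)

def effective_cost(plan, cascade_penalty=2.0, synergy_credit=1.0):
    return (plan.base_cost
            + cascade_penalty * plan.breaks
            - synergy_credit * (plan.fixes - 1))

plans = [
    RepairPlan("rename attribute", base_cost=1.0, breaks=2, fixes=1),
    RepairPlan("retype association", base_cost=3.0, breaks=0, fixes=2),
]
print("selected:", min(plans, key=effective_cost).name)   # retype association
```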

18.
Because of limited server and network capacities for streaming applications, multimedia proxies are commonly used to cache multimedia objects so that, by accessing nearby proxies, clients can enjoy a smaller start-up latency and receive better quality-of-service (QoS) guarantees, for example reduced packet loss and delay jitter for their requests. However, the use of multimedia proxies increases the risk that multimedia data are exposed to unauthorized access by intruders. In this paper, we present a framework for implementing a secure multimedia proxy system for audio and video streaming applications. The framework employs a notion of asymmetric reversible parametric sequences (ARPS) to provide the following security properties: i) data confidentiality during transmission, ii) end-to-end data confidentiality, iii) data confidentiality against proxy intruders, and iv) data confidentiality against member collusion. Our framework is grounded on a multi-key RSA technique such that system resilience against attacks is provably strong under standard computational assumptions. One important feature of the proposed scheme is that clients need to perform only a single decryption operation to recover the original data, even though the data packets may have been encrypted by multiple proxies along the delivery path. We also propose the use of a set of encryption configuration parameters (ECP) to trade off proxy encryption throughput against the presentation quality of the audio/video obtained by unauthorized parties. Implementation results show that we can simultaneously achieve high encryption throughput and extremely low video quality (in terms of peak signal-to-noise ratio and visual quality of decoded video frames) for unauthorized access.
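
A toy illustration, with tiny insecure numbers, of the multi-key RSA idea the framework builds on: two proxies encrypt in sequence with exponents e1 and e2, and the client removes both layers with one decryption exponent d satisfying e1·e2·d ≡ 1 (mod φ(n)):

```python
# Multi-key RSA toy: layered proxy encryptions, single client decryption.
# Parameters are tiny and insecure, purely for illustration.
p, q = 61, 53
n, phi = p * q, (p - 1) * (q - 1)

e1, e2 = 17, 7                 # proxy exponents, coprime to phi
d = pow(e1 * e2, -1, phi)      # single client decryption exponent

m = 42
c1 = pow(m, e1, n)             # first proxy encrypts
c2 = pow(c1, e2, n)            # second proxy re-encrypts
assert pow(c2, d, n) == m      # one decryption removes both layers
print("recovered:", pow(c2, d, n))
```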

19.
Until now, quality assessment of requirements has focused on traditional up-front requirements. In contrast, just-in-time (JIT) requirements are by definition incomplete, unspecific, and possibly ambiguous when initially specified, implying a different notion of "correctness." We analyze how the quality of JIT requirements should be assessed, based on the literature on both traditional and JIT requirements. From that analysis, we have designed a quality framework for JIT requirements and instantiated it for feature requests in open-source projects; we also indicate how the framework can be instantiated for other types of JIT requirements. We performed an initial evaluation of the framework for feature requests with eight practitioners from the Dutch agile community, receiving overall positive feedback. Subsequently, we used the framework to assess 550 feature requests originating from three open-source software systems (Netbeans, ArgoUML and Mylyn Tasks), obtaining a view of feature-request quality in the three projects. The value of our framework is threefold: (1) it gives an overview of quality criteria applicable to feature requests (at creation time or just in time); (2) it serves as a structured basis for teams that need to assess the quality of their JIT requirements; and (3) it provides a way to gain insight into the quality of JIT requirements in existing projects.

20.
We introduce a computable framework for Lebesgue's measure and integration theory in the spirit of domain theory. For an effectively given second countable locally compact Hausdorff space and an effectively given finite Borel measure on the space, we define a recursive measurable set, which extends the corresponding notion due to Šanin for the Lebesgue measure on the real line. We also introduce the stronger notion of a computable measurable set, where a measurable set is approximated from inside and outside by sequences of closed and open subsets, respectively. The more refined property of computable measurable sets gives rise to the idea of partial measurable subsets, which naturally form a domain for measurable subsets. We then introduce interval-valued measurable functions and develop the notion of recursive and computable measurable functions using interval-valued simple functions. This leads us to interval versions of the main results in classical measure theory. The Lebesgue integral is shown to be a continuous operator on the domain of interval-valued measurable functions, and the interval-valued Lebesgue integral provides a computable framework for integration.
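
A purely pedagogical numeric analogue of the inner/outer approximation idea: bound an integral between lower and upper step functions on a uniform partition, valid here because the integrand is monotone on each subinterval:

```python
# Interval bounds for an integral via lower/upper step functions.
# Pedagogical analogue only; the paper's framework is far more general.
def integral_interval(f, a, b, n=1000):
    """(lower, upper) bounds for f monotone on each partition subinterval."""
    h = (b - a) / n
    xs = [a + i * h for i in range(n + 1)]
    lo = sum(min(f(xs[i]), f(xs[i + 1])) * h for i in range(n))
    hi = sum(max(f(xs[i]), f(xs[i + 1])) * h for i in range(n))
    return lo, hi

lo, hi = integral_interval(lambda x: x * x, 0.0, 1.0)
print(f"integral of x^2 on [0,1] lies in [{lo:.4f}, {hi:.4f}]")  # contains 1/3
```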
