Similar Articles (20 results)
1.

Formal verification is becoming increasingly important in the field of wireless sensor networks (WSNs). Event-B, a general-purpose formal method, is the latest incarnation of the B Method: it is a proof-based approach with a formal notation and a refinement technique for modeling and verifying systems. Refinement enables implementation-level features to be proven correct with respect to an abstract specification of the system. This paper presents an initial attempt to model and verify the consistency and correctness of a WSN's operation across its different layers. Several formal models are introduced for this type of network. First, coloured Petri nets are used to elaborate the network-layer models; each model is then detailed in the Event-B formalism, and the proofs are carried out on the RODIN platform, an integrated development framework for Event-B.
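To give a flavour of the refinement proofs discharged on the RODIN platform, the standard Event-B guard-strengthening and invariant-preservation obligations take roughly the form below. This is a generic sketch of Event-B proof obligations, not a formula from the paper's WSN models; I is the abstract invariant, J the gluing invariant, G and H the abstract and concrete guards, and BA_A, BA_C the abstract and concrete before-after predicates.

    \mathrm{(GRD)}\quad I(v) \land J(v,w) \land H(w) \;\Rightarrow\; G(v)
    \mathrm{(INV)}\quad I(v) \land J(v,w) \land H(w) \land \mathit{BA}_A(v,v') \land \mathit{BA}_C(w,w') \;\Rightarrow\; J(v',w')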


2.
Analog and Mixed Signal (AMS) designs can be formally modeled as hybrid systems [45], so formal verification techniques applicable to hybrid systems can be deployed to verify them. An extension of one such formal verification approach is proposed for AMS designs [31]. In this approach, formal verification (FV) is carried out on an AMS block using simulation traces from SPICE, a simulator widely used in the design and verification of analog and AMS blocks. A broader implication of this approach is the ability to carry out hierarchical verification using relevant simulation traces obtained at different abstraction levels of a design, when modeled on appropriate platforms. This enables a seamless transition of design and verification artifacts from the highest level of abstraction down to the transistor-level implementation of any AMS design, and a corresponding increase in confidence in the correctness of the final implementation. The proposed approach has been demonstrated on several AMS design blocks. For each design, its formal model and the proposed computational techniques have been incorporated into CheckMate [11], an FV tool for hybrid systems based on MATLAB and the Simulink/Stateflow framework from MathWorks. A further justification of the approach is the reduction in verification time observed for different specifications in each design.
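As a rough illustration of checking a specification against simulation traces (the property, signal names, and thresholds below are hypothetical, not CheckMate's actual API), the sketch scans a sampled SPICE-style transient trace and checks a simple settling-time requirement on an output voltage.

    #include <stdio.h>
    #include <math.h>

    /* One sample of a (hypothetical) exported SPICE transient trace. */
    typedef struct { double t; double vout; } sample_t;

    /* Settling-time property: after t_settle seconds, vout must stay
     * within tol of v_target. Returns 1 if the trace satisfies it. */
    static int check_settling(const sample_t *trace, size_t n,
                              double t_settle, double v_target, double tol)
    {
        for (size_t i = 0; i < n; i++) {
            if (trace[i].t >= t_settle && fabs(trace[i].vout - v_target) > tol) {
                fprintf(stderr, "violation at t=%g s: vout=%g V\n",
                        trace[i].t, trace[i].vout);
                return 0;
            }
        }
        return 1;
    }

    int main(void)
    {
        /* Tiny hand-made trace standing in for a SPICE simulation dump. */
        sample_t trace[] = {
            {0.0e-6, 0.0}, {1.0e-6, 0.8}, {2.0e-6, 1.15},
            {3.0e-6, 1.02}, {4.0e-6, 0.99}, {5.0e-6, 1.01}
        };
        size_t n = sizeof trace / sizeof trace[0];

        int ok = check_settling(trace, n, 2.5e-6 /* settle by 2.5 us */,
                                1.0 /* target 1 V */, 0.05 /* +/- 50 mV */);
        printf("settling property %s\n", ok ? "holds on this trace" : "FAILS");
        return ok ? 0 : 1;
    }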

3.
It has been advocated by many experts in design verification that the key to successful verification convergence lies in developing the verification plan with adequate formal rigor. Traditionally, the verification plans for simulation and formal property verification (FPV) are developed in different ways, using different formalisms and with different coverage goals. In this paper, we propose a framework in which the gap between formal properties and simulation test points is narrowed by methods for translating one form of specification into the other. This allows us to reuse simulation coverage to facilitate formal verification and to reuse proven formal properties to cover simulation test points. We also propose the use of inline assertions in procedural (possibly randomized) test benches, and show that this facilitates hybrid verification techniques combining simulation and bounded model checking. We propose the use of promising combinations of formal methods presented in our earlier papers to shape a hierarchical verification flow in which simulation and formal methods cover a common design intent specification. The proposed flow is demonstrated with a detailed case study of the ARM AMBA verification benchmark. We believe that the methods presented in this work will stimulate new thought processes and ultimately lead to wider adoption of cohesive coverage management techniques in the design intent validation flow.
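A minimal software analogue of an inline assertion in a randomized procedural test bench (a hypothetical toy arbiter, not the ARM AMBA benchmark itself): random stimulus drives the model, and embedded assertions check a property that could equally be handed to a bounded model checker.

    #include <assert.h>
    #include <stdio.h>
    #include <stdlib.h>

    /* Toy 2-master arbiter: grants the lowest-numbered requesting master. */
    static int arbiter(int req0, int req1)
    {
        if (req0) return 0;
        if (req1) return 1;
        return -1;              /* no grant */
    }

    int main(void)
    {
        srand(1);
        for (int cycle = 0; cycle < 1000; cycle++) {
            int req0 = rand() & 1;      /* randomized stimulus */
            int req1 = rand() & 1;
            int gnt  = arbiter(req0, req1);

            /* Inline assertions: a grant goes only to a requesting master,
             * and some request always implies some grant. The same intent
             * could be stated as a property for bounded model checking. */
            assert(gnt != 0 || req0);
            assert(gnt != 1 || req1);
            assert(!(req0 || req1) || gnt >= 0);
        }
        puts("1000 randomized cycles passed all inline assertions");
        return 0;
    }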

4.
5.
6.
Building on an analysis of general-purpose formal verification methods for software, this paper proposes a formal verification method tailored to the security of cryptographic software. The method uses the ACSL (ANSI/ISO C Specification Language) to formally specify the security requirements of cryptographic software and combines automated proving with interactive, assisted proving, so that it can effectively verify whether a software implementation satisfies cryptographic properties that are critical to security. Taking the software implementation of the RC4 algorithm in the open-source OpenSSL code base as an example, the paper walks through the process and steps of verifying its safety properties; the results demonstrate the effectiveness of the method.
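To make the approach concrete, the following is a hedged sketch of what an ACSL contract on the RC4 key-scheduling routine might look like; the contract (byte-range post-condition, frame condition, loop annotations) is illustrative and is not copied from the OpenSSL sources or from the paper.

    #include <stddef.h>

    /*@ requires \valid(S + (0 .. 255));
      @ requires keylen > 0 && \valid_read(key + (0 .. keylen - 1));
      @ assigns S[0 .. 255];
      @ ensures \forall integer k; 0 <= k <= 255 ==> 0 <= S[k] <= 255;
      @*/
    void rc4_ksa(unsigned char S[256], const unsigned char *key, size_t keylen)
    {
        /*@ loop invariant 0 <= i <= 256;
          @ loop invariant \forall integer k; 0 <= k < i ==> S[k] == k;
          @ loop assigns i, S[0 .. 255];
          @ loop variant 256 - i;
          @*/
        for (int i = 0; i < 256; i++)
            S[i] = (unsigned char)i;        /* identity initialization */

        unsigned char j = 0;
        /*@ loop invariant 0 <= i <= 256;
          @ loop assigns i, j, S[0 .. 255];
          @ loop variant 256 - i;
          @*/
        for (int i = 0; i < 256; i++) {
            j = (unsigned char)(j + S[i] + key[i % keylen]);
            unsigned char tmp = S[i];       /* swap S[i] and S[j] */
            S[i] = S[j];
            S[j] = tmp;
        }
    }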

7.
Electrocardiogram (ECG) bio-sensors are used to detect and transmit the electrical activity of the heart over a period of time. The ECGs are processed by a signal-processing unit inside the bio-sensor to detect and identify heart-related medical conditions. Testing and verification of bio-sensors is of paramount importance, as faults in these critical systems may lead to loss of life. However, due to the complexity of these systems, traditional testing methods may not be adequate to provide a high level of assurance about the correctness of their operation. In this paper, we propose a verification framework in which we use simulation and formal methods to validate the correctness of the ECG bio-sensor. The framework is used to verify the ECG bio-sensor at several levels of abstraction, including the signal-processing level, where the ECG signal attributes, the signal-processing algorithm (its filters and its detection and analysis stages), and the alarm-triggering logic are modeled.
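As a toy illustration of the signal-processing level being verified (the detector, thresholds, and sampling rate below are hypothetical, not the paper's bio-sensor model), the sketch runs a naive threshold R-peak detector over a synthetic ECG window and asserts that the derived heart rate stays in a plausible physiological range.

    #include <assert.h>
    #include <stdio.h>

    #define FS_HZ 250.0     /* assumed sampling rate */

    /* Count R-peaks with a naive threshold + local-maximum rule. */
    static int count_r_peaks(const double *ecg, int n, double threshold)
    {
        int peaks = 0;
        for (int i = 1; i + 1 < n; i++)
            if (ecg[i] > threshold && ecg[i] >= ecg[i-1] && ecg[i] > ecg[i+1])
                peaks++;
        return peaks;
    }

    int main(void)
    {
        /* 3-second synthetic window with 4 exaggerated R-peaks. */
        enum { N = 750 };
        double ecg[N] = {0};
        for (int k = 0; k < 4; k++)
            ecg[100 + 187 * k] = 1.2;       /* peaks roughly 0.75 s apart */

        int peaks = count_r_peaks(ecg, N, 0.6);
        double bpm = peaks * 60.0 / (N / FS_HZ);

        /* Property checked both in simulation and, in the paper's setting,
         * as a formal obligation: the reported rate is physiologically sane,
         * otherwise the alarm path must be triggered instead. */
        assert(bpm > 20.0 && bpm < 250.0);
        printf("detected %d peaks, %.0f bpm\n", peaks, bpm);
        return 0;
    }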

8.
In the past years, much of the research into hardware reverse engineering has focused on the abstraction of gate-level netlists to a human-readable form. However, none of the proposed methods consider a realistic reverse engineering scenario, in which the netlist is physically extracted from a chip. This paper analyzes the impact of errors caused by this extraction, and by the later partitioning of the netlist, on the ability to identify the functionality. Current formal-verification-based methods that compare against golden models are incapable of dealing with such erroneous netlists. Two methods built on the idea that structural similarity implies functional similarity solve this problem: the first new approach uses fuzzy structural similarity matching to compare the structural characteristics of an unknown design against designs in a golden-model library; the second new approach proposes a method for inexact graph matching using fuzzy graph isomorphisms, based on the functionalities of the gates used within the design. In addition, past attacks on obfuscation methods such as logic locking have required access to an activated chip to compare the obfuscated netlist against a functionally equivalent model. The proposed methods can find a golden model without an activated chip, so attacks can occur even before production and activation of the chip. Experiments show that for simple logic locking the approaches identify a suitable golden model in more than 80% of all cases. For realistic error percentages, both approaches match more than 90% of designs correctly. This is an important first step for hardware reverse engineering methods beyond formal-verification-based equivalence matching.
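A minimal sketch of the idea that structural similarity implies functional similarity (the gate classes, counts, and library below are made up, not the authors' matcher): each design is reduced to a histogram of gate types, and the unknown, error-laden netlist is matched to the golden-model library entry with the highest cosine similarity.

    #include <math.h>
    #include <stdio.h>

    #define NUM_GATE_TYPES 6    /* AND, OR, XOR, NOT, MUX, DFF (illustrative) */

    typedef struct {
        const char *name;
        double hist[NUM_GATE_TYPES];    /* gate-type counts */
    } design_t;

    static double cosine_sim(const double *a, const double *b, int n)
    {
        double dot = 0, na = 0, nb = 0;
        for (int i = 0; i < n; i++) { dot += a[i]*b[i]; na += a[i]*a[i]; nb += b[i]*b[i]; }
        return (na == 0 || nb == 0) ? 0.0 : dot / (sqrt(na) * sqrt(nb));
    }

    int main(void)
    {
        /* Golden-model library (made-up counts). */
        design_t library[] = {
            { "4-bit adder",   { 8, 4, 8, 0, 0, 0 } },
            { "8-bit counter", { 6, 2, 0, 8, 0, 8 } },
            { "2:1 mux tree",  { 0, 0, 0, 4, 7, 0 } },
        };
        /* Unknown netlist extracted from the chip; extraction errors mean
         * its counts only roughly match any library entry. */
        double unknown[NUM_GATE_TYPES] = { 7, 5, 9, 1, 0, 0 };

        int best = -1; double best_sim = -1.0;
        for (int i = 0; i < 3; i++) {
            double s = cosine_sim(unknown, library[i].hist, NUM_GATE_TYPES);
            printf("%-14s similarity %.3f\n", library[i].name, s);
            if (s > best_sim) { best_sim = s; best = i; }
        }
        printf("best golden-model candidate: %s\n", library[best].name);
        return 0;
    }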

9.
董杨鑫  郑建宏 《电子质量》2007,22(10):53-56
Verification plays a crucial role in the SoC design process and directly affects the overall cost and quality of the chip. This paper first reviews the characteristics of verification techniques commonly used in industry today, including simulation, static verification, formal verification, and physical verification, and then uses examples to discuss the key techniques in SoC design verification: reuse, constrained-random verification, self-checking, and formal assertion-based verification.
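A software-level caricature of constrained-random, self-checking verification (the design under test is hypothetical and not taken from the article): stimulus is randomized under a constraint that biases it toward a corner case, and a golden reference model checks every response automatically.

    #include <assert.h>
    #include <stdio.h>
    #include <stdlib.h>

    /* Design under test: saturating 8-bit adder (stands in for an RTL block). */
    static unsigned dut_sat_add(unsigned a, unsigned b)
    {
        unsigned s = a + b;
        return s > 255 ? 255 : s;
    }

    /* Golden reference model used by the self-checking test bench. */
    static unsigned ref_sat_add(unsigned a, unsigned b)
    {
        unsigned s = a + b;
        return s > 255 ? 255 : s;
    }

    int main(void)
    {
        srand(42);
        for (int i = 0; i < 10000; i++) {
            /* Constrained-random stimulus: operands biased toward the
             * saturation corner (both >= 128 in half of the tests). */
            unsigned a = rand() % 256, b = rand() % 256;
            if (i % 2 == 0) { a |= 0x80; b |= 0x80; }

            /* Self-checking: every DUT response is compared to the model. */
            assert(dut_sat_add(a, b) == ref_sat_add(a, b));
        }
        puts("10000 constrained-random tests self-checked OK");
        return 0;
    }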

10.
We describe a compositional framework, together with its supporting toolset, for hardware/software co-design. Our framework integrates a formal approach into a traditional design flow. The formal approach is based on Interval Temporal Logic and its executable subset, Tempura. Refinement is the key element in our framework because it derives the software and hardware parts of the implementation from a single formal specification of the system, while preserving all properties of the system specification. During refinement, simulation is used to choose the appropriate refinement rules, which are applied automatically in the HOL system. The framework is illustrated with two case studies. The work presented is part of a UK collaborative research project between the Software Technology Research Laboratory at De Montfort University and the Oxford University Computing Laboratory.

11.
The advent of new 65 nm/90 nm VLSI technology and SoC design methodologies has brought an explosive growth in the complexity of modern electronic circuits. As a result, functional verification has become the major bottleneck in any digital design flow, and new methods for easier, faster, and more reusable verification are required. This paper proposes a verification methodology (VeriSC2) that guides the implementation of working testbenches during hierarchical decomposition and refinement of the design, even before the RTL implementation starts. The approach uses the SystemC Verification Library (SCV) in a tool capable of automatically generating testbench templates. A case study from an MPEG-4 decoder design is used to show the effectiveness of the approach.

12.
13.
14.
15.
The scope of e-business standards today goes beyond the level of infrastructure and incorporates broader issues concerning business processes. e-Business standards are increasingly developed via open initiatives, in order to counter lethargic formal standardization efforts. While previous research has identified factors that motivate industry standards, little is known about the processes by which a standard emerges. Using an inductive, process-centric research approach, we examine the process of standards development for the ebXML standard. Results indicate that standards development involves requirements analysis, design, internal validation, and external validation; however, the sequence of activities differs between business-process-focused standards and technology-focused standards. For business-process-focused standards, the inherent uncertainty causes the process to iterate and focus on requirements refinement; technology-focused standards exhibit a higher level of structure and tend to have lower levels of iteration, which are design focused. These findings allow us to align standards development with strategic decision making, a departure from the typical "development" orientation used in the standards literature; this holds especially for business-process-centric standards. Finally, this study reveals the importance of "openness" as a contributor to the iterations evidenced in the process patterns of standards development.

16.
Programs that implement computer communications protocols can exhibit extremely complicated behavior, and neither informal reasoning nor testing is reliable enough to establish their correctness. In this paper we discuss the application of modular program verification techniques to protocols. This approach is more reliable than informal reasoning but, in contrast to formal reasoning based on finite-state models, the complexity of the proof need not grow unmanageably as the size of the program increases. Certain tools of concurrent program verification that are especially useful for protocols are presented: history variables that record sequences of input and output values, temporal logic for expressing properties that must hold in a future system state (such as eventual receipt of a message), and module specification and composition rules. The use of these techniques is illustrated by verifying two data transfer protocols from the literature: the alternating bit protocol and a protocol proposed by Stenning.
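As a small concrete companion to the discussion (a simplified simulation, not the paper's proof), the sketch below models the alternating bit protocol over a lossy channel and uses history variables, the sequences of sent and delivered messages, to assert loss-free, in-order delivery.

    #include <assert.h>
    #include <stdio.h>
    #include <stdlib.h>

    #define N_MSGS 20

    int main(void)
    {
        srand(7);

        int sent_hist[N_MSGS], recv_hist[N_MSGS];   /* history variables */
        int n_sent = 0, n_recv = 0;

        int send_bit = 0, expect_bit = 0;
        int next_msg = 0;

        while (next_msg < N_MSGS) {
            int msg = 100 + next_msg;               /* payload to transmit */

            /* Sender transmits (msg, send_bit); the frame may be lost. */
            if (rand() % 3 == 0)
                continue;                           /* frame lost: retransmit */

            /* Receiver accepts only if the tag matches the expected bit,
             * so retransmitted duplicates are silently discarded. */
            if (send_bit == expect_bit) {
                recv_hist[n_recv++] = msg;
                expect_bit ^= 1;
            }

            /* The acknowledgement carrying the tag may also be lost. */
            if (rand() % 3 == 0)
                continue;                           /* ack lost: retransmit */

            /* Sender sees a matching ack: log the send, flip bit, move on. */
            sent_hist[n_sent++] = msg;
            send_bit ^= 1;
            next_msg++;
        }

        /* Safety property over the histories: the receiver got exactly
         * the sent sequence, in order and without duplication. */
        assert(n_recv == n_sent);
        for (int i = 0; i < n_sent; i++)
            assert(recv_hist[i] == sent_hist[i]);

        printf("delivered %d messages in order over a lossy channel\n", n_recv);
        return 0;
    }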

17.
Analyzing encryption protocols using formal verification techniques
An approach to analyzing encryption protocols using machine-aided formal verification techniques is presented. The properties that the protocol should preserve are expressed as state invariants, and the theorems that must be proved to guarantee that the cryptographic facility satisfies the invariants are automatically generated by the verification system. A formal specification of an example system is presented, and several weaknesses that were revealed by attempting to formally verify and test the specification are discussed.
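A hedged, illustrative sketch of the idea of expressing properties as state invariants (the key-management state below is hypothetical, not the system specified in the paper): after every operation on a toy cryptographic facility, an invariant checker confirms that a key never leaves the facility except wrapped under the master key.

    #include <assert.h>
    #include <stdio.h>
    #include <string.h>

    #define MAX_KEYS 8

    /* State of a toy cryptographic facility: a table of session keys,
     * each flagged as wrapped under the master key and/or exported. */
    typedef struct {
        int in_use[MAX_KEYS];
        int wrapped_under_master[MAX_KEYS];
        int exported[MAX_KEYS];             /* key value left the facility */
    } facility_t;

    /* State invariant (a verification condition in the paper's setting):
     * any exported key must be wrapped under the master key. */
    static void check_invariant(const facility_t *f)
    {
        for (int i = 0; i < MAX_KEYS; i++)
            if (f->in_use[i] && f->exported[i])
                assert(f->wrapped_under_master[i]);
    }

    static void generate_key(facility_t *f, int slot)
    {
        f->in_use[slot] = 1;
        f->wrapped_under_master[slot] = 1;  /* wrap on creation */
        f->exported[slot] = 0;
        check_invariant(f);
    }

    static void export_key(facility_t *f, int slot)
    {
        f->exported[slot] = 1;              /* only the wrapped form leaves */
        check_invariant(f);
    }

    int main(void)
    {
        facility_t f;
        memset(&f, 0, sizeof f);
        generate_key(&f, 0);
        export_key(&f, 0);
        puts("all operations preserved the key-exposure invariant");
        return 0;
    }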

18.
19.
In this paper, an approach is introduced based on differential operators to construct wavelet-like basis functions. Given a differential operator L with rational transfer function, elementary building blocks are obtained that are shifted replicates of the Green's function of L. It is shown that these can be used to specify a sequence of embedded spline spaces that admit a hierarchical exponential B-spline representation. The corresponding B-splines are entirely specified by their poles and zeros; they are compactly supported, have an explicit analytical form, and generate multiresolution Riesz bases. Moreover, they satisfy generalized refinement equations with a scale-dependent filter and lead to a representation that is dense in L₂. This allows us to specify a corresponding family of semi-orthogonal exponential spline wavelets, which provides a major extension of earlier polynomial spline constructions. These wavelets are completely characterized, and it is proven that they satisfy the following remarkable properties: 1) they are orthogonal across scales and generate Riesz bases at each resolution level; 2) they yield unconditional bases of L₂, either compactly supported (B-spline type) or with exponential decay (orthogonal or dual type); 3) they have N vanishing exponential moments, where N is the order of the differential operator; 4) they behave like multiresolution versions of the operator L from which they are derived; and 5) their order of approximation is (N-M), where N and M give the number of poles and zeros, respectively. Last but not least, the new wavelet-like decompositions are as computationally efficient as the classical ones. They are computed using an adapted version of Mallat's filter bank algorithm, where the filters depend on the decomposition level.
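As a worked special case, not taken from the paper and restricted to the first-order operator L = D - alpha*I, the causal Green's function, the resulting exponential B-spline, and its scale-dependent two-scale relation read:

    \rho_\alpha(t) = e^{\alpha t}\,\mathbf{1}_{\{t \ge 0\}},
    \qquad
    \beta_\alpha(t) = \rho_\alpha(t) - e^{\alpha}\,\rho_\alpha(t-1) = e^{\alpha t}\,\mathbf{1}_{[0,1)}(t),
    \qquad
    \beta_\alpha(t/2) = \beta_{\alpha/2}(t) + e^{\alpha/2}\,\beta_{\alpha/2}(t-1).

The last identity illustrates what a refinement equation with a scale-dependent filter means here: the two-scale coefficients (1 and e^{alpha/2}) and the spline parameter itself change with the resolution level.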

20.
Verification of a design based on model checking requires the identification of a set of formal properties manually derived from the specification of the design under verification (DUV). Such a set may include too few or too many properties. This paper proposes the use of a functional ATPG to identify missing properties and to remove unnecessary ones. In particular, the paper refines, extends, and compares with other symbolic approaches a methodology to estimate the completeness of formal properties, which exploits a functional fault model and a functional ATPG. Moreover, the same fault model and ATPG are used to address the opposite problem of identifying useless properties, that is, properties that are logical consequences of others. Logical consequence between properties is generally examined using theorem proving, which may require a large amount of time and space. By contrast, the paper proposes a faster approach that analyzes logical consequence by observing a property's capability of revealing functional faults. The joint use of the methodologies makes it possible to optimize the set of properties used across the verification sessions needed to check all design phases of an incremental design flow.
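A minimal sketch of the fault-based redundancy check (the fault lists are invented, and this is not the authors' ATPG): each property is summarized by the set of functional faults it reveals, and a property whose fault set is contained in another's is flagged as a candidate logical consequence, to be confirmed or discarded by the full methodology.

    #include <stdio.h>
    #include <stdint.h>

    #define NUM_PROPS 3

    /* Fault-detection signature of each property: bit i set means the
     * property reveals functional fault i (illustrative data only). */
    static const uint32_t detects[NUM_PROPS] = {
        0x0000F0FFu,    /* P0 */
        0x000000FFu,    /* P1: detects a subset of P0's faults */
        0x000F0F00u,    /* P2 */
    };

    int main(void)
    {
        for (int i = 0; i < NUM_PROPS; i++)
            for (int j = 0; j < NUM_PROPS; j++) {
                if (i == j) continue;
                /* If every fault detected by Pj is also detected by Pi,
                 * Pj is a candidate logical consequence of Pi. */
                if ((detects[j] & ~detects[i]) == 0)
                    printf("P%d is a candidate consequence of P%d "
                           "(may be dropped from the property set)\n", j, i);
            }
        return 0;
    }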
