1.
Slicing is a program analysis technique that can be used to reduce the size of a model and avoid state space explosion in model checking. In this work a static slicing technique is proposed for reducing Rebeca models with respect to a property. For applying the actor-based slicing techniques, the Rebeca control flow graph (RCFG) and the Rebeca dependence graph (RDG) are introduced. We propose two different approaches for constructing the RDG, where each approach can be more effective under certain conditions. As static slicing usually produces large slices, two other slicing-based reduction techniques, step-wise slicing and bounded slicing, are proposed as simple novel ideas. Step-wise slicing first generates slices that overapproximate the behavior of the original model and then refines them, and bounded slicing is based on the semantics of nondeterministic assignments in Rebeca. We also propose a static slicing algorithm for deadlock detection (in the absence of any particular property). The efficiency of these techniques is demonstrated by applying them to several case studies included in this paper. Similar techniques can be applied to other actor-based languages.
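The core of such a reduction is reachability over a dependence graph. The sketch below shows that step in isolation, as a minimal backward slice over an explicitly given dependence relation; the graph, node names, and slicing criterion are hypothetical, and this is not the paper's full RDG construction.

```python
# Minimal backward-slicing sketch: collect every node that the slicing
# criterion (the statements mentioned in the property) transitively depends on.
from collections import deque

def backward_slice(dependences, criterion):
    """dependences: dict node -> set of nodes it depends on (data/control deps).
    criterion: set of nodes relevant to the property being checked."""
    in_slice = set(criterion)
    worklist = deque(criterion)
    while worklist:
        node = worklist.popleft()
        for dep in dependences.get(node, ()):
            if dep not in in_slice:
                in_slice.add(dep)
                worklist.append(dep)
    return in_slice

# Hypothetical toy graph: statement "s4" (used in the property) depends on s2,
# which depends on s1; s3 is unrelated and would be sliced away.
deps = {"s4": {"s2"}, "s2": {"s1"}, "s3": {"s1"}}
print(sorted(backward_slice(deps, {"s4"})))   # ['s1', 's2', 's4']
```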
2.
The actor-based language Timed Rebeca was introduced to model distributed and asynchronous systems with timing constraints and message-passing communication. A toolset was developed for automated translation of Timed Rebeca models to Erlang. The translated code can be executed using a timed extension of McErlang for model checking and simulation. In this work, we add a new toolset that provides statistical model checking of Timed Rebeca models. Using statistical model checking, we are now able to verify larger models against safety properties than with McErlang model checking. We examine the typical case studies of elevators and a ticket service to show the efficiency of statistical model checking and the applicability of our toolset.
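As a rough illustration of the statistical model checking step, the sketch below estimates the probability that a safety property holds by Monte Carlo simulation; the simulator, the "bad state" condition, and the sample size are stand-ins, not the Timed Rebeca toolset itself.

```python
# Monte Carlo estimation of P(safety property holds) over bounded runs.
import random

def simulate_once(horizon):
    # Hypothetical stand-in for one bounded run of a model:
    # returns True if the safety property held on this run.
    state = 0
    for _ in range(horizon):
        state += random.choice([-1, 1])
        if state < -5:           # made-up "bad state"
            return False
    return True

def estimate_safety(num_samples=10_000, horizon=100):
    ok = sum(simulate_once(horizon) for _ in range(num_samples))
    return ok / num_samples      # fraction of runs with no violation

if __name__ == "__main__":
    random.seed(0)
    print(f"estimated P(safety holds) = {estimate_safety():.3f}")
```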
3.
Monojit Choudhury Rahul Saraf Vijit Jain Animesh Mukherjee Sudeshna Sarkar Anupam Basu 《International Journal on Document Analysis and Recognition》2007,10(3-4):157-174
Language usage over computer-mediated discourses, such as chats, emails and SMS texts, significantly differs from the standard form of the language and is referred to as texting language (TL). The presence of intentional misspellings significantly decreases the accuracy of existing spell-checking techniques for TL words. In this work, we formally investigate the nature and types of compressions used in SMS texts, and develop a Hidden Markov Model based word model for TL. The model parameters have been estimated through standard machine learning techniques from a word-aligned SMS and standard English parallel corpus. The accuracy of the model in correcting TL words is 57.7%, which is almost a threefold improvement over the performance of Aspell. The use of a simple bigram language model results in a 35% reduction of the relative word-level error rate.
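The sketch below gives a flavor of such a character-level word model: it scores how plausibly a texting-language token abbreviates each candidate standard word with a small dynamic program (keep / drop / substitute) and picks the best-scoring candidate. The lexicon and the hand-set probabilities are illustrative placeholders, not the paper's learned HMM parameters.

```python
# Toy scorer for TL normalization: align the observed token against each
# candidate word, allowing kept, dropped, or substituted characters.
import math

def word_log_prob(observed, target, p_keep=0.6, p_drop=0.35, p_subst=0.05):
    """Rough log-probability that `target` was typed as `observed`."""
    INF = float("-inf")
    n, m = len(target), len(observed)
    dp = [[INF] * (m + 1) for _ in range(n + 1)]   # dp[i][j]: best score so far
    dp[0][0] = 0.0
    for i in range(n + 1):
        for j in range(m + 1):
            if dp[i][j] == INF:
                continue
            if i < n:                               # drop a target char (compression)
                dp[i + 1][j] = max(dp[i + 1][j], dp[i][j] + math.log(p_drop))
            if i < n and j < m:
                p = p_keep if target[i] == observed[j] else p_subst
                dp[i + 1][j + 1] = max(dp[i + 1][j + 1], dp[i][j] + math.log(p))
    return dp[n][m]

def normalize(token, lexicon):
    return max(lexicon, key=lambda w: word_log_prob(token, w))

print(normalize("tmrw", ["tomorrow", "tram", "term"]))   # -> 'tomorrow'
```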
4.
Software product line engineering enables proactive reuse among a set of related products through explicit modeling of their commonalities and differences. Software product lines are intended to be used over a long period of time; as a result, they evolve due to changes in the requirements. With several individual products in a software family, verification of the entire family may take considerable effort. In this paper we aim to decrease this cost by reducing the number of verified products using static analysis techniques. Furthermore, to reduce model checking costs after product line evolution, we restrict the number of products that must be re-verified by reusing the previous verification results. All proposed techniques are based on static analysis of the product family model with respect to the property and can be automated. To show the effectiveness of these techniques we apply them to a set of case studies and present the results.
5.
Cristian I. Pinzón Juan F. De Paz Martí Navarro Javier Bajo Vicente Julián Juan M. Corchado 《Applied Soft Computing》2011,11(7):4384-4398
Security is a major concern when service environments are implemented. This has led to a variety of specifications and proposals based on soft computing methods to provide the necessary security for these environments. However, most proposed approaches focus only on ensuring confidentiality and integrity, without putting forward mechanisms that ensure the availability of the services and resources offered. A considerable number of attack mechanisms can lead to a web service system crash, so that the web service can no longer grant access to authorized users. This type of attack is the so-called denial of service (DoS) attack, which affects the availability of the services and resources offered. This article presents a novel soft computing-based approach to cope with DoS attacks; unlike existing solutions, our proposal takes into account the different mechanisms that can lead to a DoS attack. Our approach is based on a real-time classifier agent that incorporates a mixture of experts to choose a specific classification technique depending on the features of the attack and the time available to complete the classification. With this scheme it is possible to divide the problem into subproblems, solving the classification of web service requests in a simpler and more effective way, always within a bounded time interval. This research presents a case study to evaluate the effectiveness of the approach, as well as the preliminary results obtained with an initial prototype.
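A minimal sketch of the gating idea follows: pick the most thorough classifier (expert) that fits within the time available for classifying a request. The expert names, time costs, and request features are hypothetical placeholders, not the agent described above.

```python
# Time-bounded selection among classification experts (mixture-of-experts gating).
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Expert:
    name: str
    est_cost_ms: float                      # estimated time to classify
    classify: Callable[[Dict], str]         # returns "legitimate" or "dos"

def fast_heuristic(req):      # cheap expert: header-level checks only
    return "dos" if req["requests_per_sec"] > 500 else "legitimate"

def deep_inspection(req):     # expensive expert: also inspects payload size
    suspicious = req["requests_per_sec"] > 200 and req["payload_kb"] > 1000
    return "dos" if suspicious else "legitimate"

EXPERTS: List[Expert] = [                   # ordered from slow/thorough to fast
    Expert("deep_inspection", est_cost_ms=40.0, classify=deep_inspection),
    Expert("fast_heuristic", est_cost_ms=2.0, classify=fast_heuristic),
]

def classify_within_budget(request: Dict, budget_ms: float) -> str:
    # Gating: choose the most thorough expert that fits in the time budget.
    for expert in EXPERTS:
        if expert.est_cost_ms <= budget_ms:
            return expert.classify(request)
    return "dos"                            # fail safe when no time is left

print(classify_within_budget({"requests_per_sec": 800, "payload_kb": 5}, 5.0))
```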
6.
7.
Edmund Clarke Somesh Jha Will Marrero 《International Journal on Software Tools for Technology Transfer (STTT)》2003,4(2):173-188
In this paper we explore how partial-order reduction can make the task of verifying security protocols more efficient. These reduction techniques have been implemented in our tool Brutus. Partial-order reductions have proved very useful in the domain of model checking reactive systems. These reductions are not directly applicable in our context because of additional complications caused by tracking knowledge of various agents. We present partial-order reductions in the context of verifying security protocols and prove their correctness. Experimental results demonstrating the effectiveness of this reduction technique are also presented.
Published online: 24 January 2003
8.
9.
Viktor Schuppan Armin Biere 《International Journal on Software Tools for Technology Transfer (STTT)》2004,5(2-3):185-204
Two types of temporal properties are usually distinguished: safety and liveness. Recently we have shown how to verify liveness properties of finite state systems using safety checking. In this article we extend the translation scheme to typical combinations of temporal operators. We discuss optimizations that limit the overhead of our translation. Using the notions of predicated diameter and radius we obtain revised bounds for our translation scheme. These notions also give a tight bound on the minimal completeness bound for simple liveness properties. Experimental results show the feasibility of the approach for complex examples. For one example, even an exponential speedup can be observed.
10.
11.
Giuseppe Della Penna Benedetto Intrigila Igor Melatti Enrico Tronci Marisa Venturini Zilli 《International Journal on Software Tools for Technology Transfer (STTT)》2006,8(4-5):397-409
In this paper we present an explicit disk-based verification algorithm for probabilistic systems defining discrete-time/finite-state Markov chains. Given a Markov chain and an integer k (horizon), our algorithm checks whether the probability of reaching an error state in at most k steps is below a given threshold. We present an implementation of our algorithm within a suitable extension of the Murϕ verifier. We call the resulting probabilistic model checker FHP-Murϕ (Finite Horizon Probabilistic Murϕ). We present experimental results comparing FHP-Murϕ with (a finite horizon subset of) PRISM, a state-of-the-art symbolic model checker for Markov chains. Our experimental results show that FHP-Murϕ can handle systems that are out of reach for PRISM, namely those involving arithmetic operations on the state variables (e.g. hybrid systems).
This research has been partially supported by MURST projects MEFISTO and SAHARA.
This paper is a journal version of the conference paper [16].
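Stripped of the disk-based machinery, the finite-horizon check amounts to computing a bounded reachability probability on an explicit Markov chain. The sketch below does this by backward value iteration on a toy three-state chain; the chain, horizon, and threshold are made up for illustration and are not FHP-Murϕ itself.

```python
# P(reach an error state within k steps) on an explicit discrete-time Markov chain.
def bounded_reach_prob(transitions, start, error_states, k):
    """transitions: dict state -> list of (next_state, probability) pairs."""
    # prob[s] = probability of reaching an error state from s within i steps
    prob = {s: (1.0 if s in error_states else 0.0) for s in transitions}
    for _ in range(k):
        new_prob = {}
        for s, succs in transitions.items():
            if s in error_states:
                new_prob[s] = 1.0            # already in error: stays 1
            else:
                new_prob[s] = sum(p * prob[t] for t, p in succs)
        prob = new_prob
    return prob[start]

if __name__ == "__main__":
    # Toy 3-state chain: 0 -> 1 -> error (state 2), with self-loops.
    chain = {
        0: [(0, 0.7), (1, 0.3)],
        1: [(1, 0.5), (2, 0.5)],
        2: [(2, 1.0)],
    }
    p = bounded_reach_prob(chain, start=0, error_states={2}, k=10)
    print(f"P(error within 10 steps) = {p:.4f}", "OK" if p < 0.9 else "violated")
```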
12.
In this paper, we use a hill-climbing attack algorithm based on Bayesian adaptation to test the vulnerability of two face recognition systems to indirect attacks. The attacking technique uses the scores provided by the matcher to adapt a global distribution computed from an independent set of users to the local specificities of the client being attacked. The proposed attack is evaluated on an eigenface-based and a parts-based face verification system using the XM2VTS database. Experimental results demonstrate that the hill-climbing algorithm is very efficient and is able to bypass over 85% of the attacked accounts (for both face recognition systems). The security flaws of the analyzed systems are pointed out and possible countermeasures to avoid them are also proposed.
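A bare-bones version of such a score-based hill-climbing attack (without the Bayesian adaptation of a global distribution) might look like the sketch below: perturb a synthetic template, keep a change only if the matcher score improves, and stop once the verification threshold is crossed. The matcher is a made-up stand-in, not a real face matcher.

```python
# Score-driven hill climbing against a black-box matcher.
import random

def matcher_score(template, enrolled):
    # Hypothetical matcher: higher score = more similar (negative squared
    # distance to the enrolled template, which the attacker cannot see).
    return -sum((t - e) ** 2 for t, e in zip(template, enrolled))

def hill_climb(enrolled, dim=16, threshold=-0.5, max_iters=20_000, step=0.05):
    attack = [0.0] * dim                       # start from a generic template
    best = matcher_score(attack, enrolled)
    for _ in range(max_iters):
        candidate = list(attack)
        i = random.randrange(dim)
        candidate[i] += random.uniform(-step, step)
        score = matcher_score(candidate, enrolled)
        if score > best:                       # keep only improving moves
            attack, best = candidate, score
        if best >= threshold:                  # verification threshold crossed
            return attack, best
    return None, best

random.seed(1)
target = [random.uniform(-1, 1) for _ in range(16)]
result, score = hill_climb(target)
print("bypassed" if result is not None else "failed", f"score={score:.3f}")
```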
13.
Radu Mateescu Pedro T. Monteiro Estelle Dumas Hidde de Jong 《Theoretical Computer Science》2011,412(26):2854-2883
Model checking has proven to be a useful analysis technique not only for concurrent systems, but also for genetic regulatory networks (GRNs). Applications of model checking in systems biology have revealed that temporal logics should be able to capture both branching-time and fairness properties (needed for specifying multistability and oscillation properties, respectively). At the same time, they should have a user-friendly syntax easy to employ by non-experts. In this paper, we define Computation Tree Regular Logic (CTRL), an extension of CTL with regular expressions and fairness operators that attempts to match these criteria. CTRL subsumes both CTL and LTL, and has a reduced set of temporal operators indexed by regular expressions. We also develop a translation of CTRL into Hennessy-Milner Logic with Recursion (HMLR), an equational variant of the modal μ-calculus. This has allowed us to obtain an on-the-fly model checker with diagnostic for CTRL by directly reusing the verification technology available in the CADP toolbox. We illustrate the application of the CTRL model checker by analyzing the GRN controlling the carbon starvation response of Escherichia coli.
14.
15.
This paper describes the application of two abstraction techniques, namely dead variable reduction and path reduction, to microcontroller binary code in order to tackle the state-explosion problem in model checking. These abstraction techniques are based on static analyses, which have to cope with the peculiarities of binary code such as hardware dependencies, interrupts, recursion, and globally accessible memory locations. An interprocedural static analysis framework is presented that handles these peculiarities. Based on this framework, extensions of dead variable reduction and path reduction are detailed. A case study using several microcontroller programs is presented in order to demonstrate the efficiency of the described abstraction techniques.
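Dead variable reduction rests on a standard backward liveness analysis; the sketch below computes live-in sets for a tiny three-address program, so that variables that are not live at a point could be ignored when comparing states. The instruction format is a simplification for illustration, not the paper's binary-code analysis.

```python
# Backward liveness fixed point over a small control-flow graph.
def live_variables(instrs):
    """instrs: list of (defined_vars, used_vars, successor_indices).
    Returns the live-in set for each instruction."""
    n = len(instrs)
    live_in = [set() for _ in range(n)]
    changed = True
    while changed:
        changed = False
        for i in reversed(range(n)):
            defs, uses, succs = instrs[i]
            live_out = set().union(*(live_in[s] for s in succs)) if succs else set()
            new_in = set(uses) | (live_out - set(defs))
            if new_in != live_in[i]:
                live_in[i] = new_in
                changed = True
    return live_in

# Hypothetical snippet: 0: a = input; 1: b = a + 1; 2: output(b); 3: c = b * 2
# The variable c is defined but never used, so it is never live.
program = [
    ({"a"}, set(), [1]),
    ({"b"}, {"a"}, [2]),
    (set(), {"b"}, [3]),
    ({"c"}, {"b"}, []),
]
for idx, live in enumerate(live_variables(program)):
    print(idx, sorted(live))
```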
16.
As society develops and changes, language use is also constantly evolving. In order to gain a solid understanding of word usage in primary and secondary school Uyghur language textbooks, this study takes those textbooks as its research object and investigates their overall word usage. It describes the corpus used in the study, introduces the statistics system, and analyzes word usage in the textbooks, including the total number of word tokens, the total number of word types, and the total number of stem types; it then discusses and analyzes word frequency, word types, and word-type coverage.
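The sketch below illustrates the kind of counts involved (word tokens, word types, stem types, and type coverage) on a whitespace-tokenized text, with a placeholder stemmer; the actual study works on Uyghur textbook text with proper morphological analysis, which this does not attempt.

```python
# Token / type / stem counting over a toy corpus.
from collections import Counter

def toy_stem(word):
    # Placeholder stemmer: lowercases and strips a few illustrative endings.
    w = word.lower()
    for suffix in ("lar", "ler", "s"):
        if w.endswith(suffix) and len(w) > len(suffix) + 2:
            return w[: -len(suffix)]
    return w

def corpus_statistics(text):
    tokens = text.split()
    token_count = len(tokens)                      # total word tokens
    type_counts = Counter(t.lower() for t in tokens)
    stem_counts = Counter(toy_stem(t) for t in tokens)
    # Coverage: fraction of all tokens accounted for by the 100 most common types.
    top_100 = sum(c for _, c in type_counts.most_common(100))
    return {
        "tokens": token_count,
        "word_types": len(type_counts),
        "stem_types": len(stem_counts),
        "top100_coverage": top_100 / token_count if token_count else 0.0,
    }

print(corpus_statistics("kitab kitablar oqu oquydu kitab"))
```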
17.
The increasing complexity of information and telecommunications systems and networks is reaching a level beyond human ability, mainly from the security assessment viewpoint. Methodologies currently proposed for managing and assuring security requirements fall short of industrial and societal expectations. The statistics about vulnerabilities and attacks show that the security, reliability and availability objectives are not reached and that the general threat situation is getting worse. With the deployment of Next Generation Networks – NGNs, the complexity of networks, considering their architecture, speed and amount of connections, will increase exponentially. There are several proposals for the network and security architectures of NGNs, but current vulnerability, threat and risk analysis methods do not appear adequate to evaluate them. Appropriate analysis methods should have some additional new characteristics, mainly regarding their adaptation to the continuous evolution of the NGNs. In addition, the application of security countermeasures will require technological improvements, which will demand further security analyses. This paper evaluates the current vulnerability, threat and risk analysis methods from the point of view of the new security requirements of NGNs. Then, the paper proposes to use autonomic and self-adaptive systems/applications for assuring the security of NGNs.
18.
Unified Modeling Language (UML) is the standard modeling language for object-oriented system development. Despite its status as a standard, UML has a fuzzy formal specification and a weak theoretical foundation. Semiotics, the study of signs, provides a good theoretical foundation for UML research because the graphical notations (or visual signs) of UML are subject to the principles of signs. In our research, we use semiotics to study the effectiveness of graphical notations in UML. We hypothesized that the use of iconic signs as UML graphical notations leads to representations that are more accurately interpreted and that arouse fewer connotations than the use of symbolic signs. An open-ended survey was used to test these hypotheses. The results support our propositions that iconic UML graphical notations are more accurately interpreted by subjects and that the number of connotations is lower for iconic UML graphical notations than for symbolic UML graphical notations. The results have both theoretical and practical significance. This study illustrates the usefulness of semiotics as a theoretical underpinning in analyzing, evaluating, and comparing graphical notations for modeling constructs. The results of this research also suggest ways and means of enhancing the graphical notations of UML modeling constructs.
19.
丁德武 《计算机与应用化学》2015,32(3):376-378
Centrality analysis helps to identify important nodes in complex networks and has been widely applied in the study of metabolic networks. A variety of centrality measures have been proposed, but how to combine them in a sound way remains a serious challenge. In this paper we use principal component analysis (PCA) to integrate multiple centrality measures. We first briefly introduce the basic concepts and principles of PCA, then construct the giant strong component of the human metabolic network and compute the centrality values of its metabolites under ten centrality measures as samples for PCA. Finally, taking the first principal component as an example, we show that PCA can reasonably integrate multiple centrality measures for metabolic network research.
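The sketch below reproduces the overall recipe on a random toy network with three centrality measures instead of ten: standardize the per-node centrality values, project them onto the first principal component, and rank nodes by that score. The graph and the chosen measures are illustrative, not the human metabolic network of the paper.

```python
# Integrating several centrality measures with PCA.
import networkx as nx
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

G = nx.gnp_random_graph(50, 0.1, seed=42, directed=True)

measures = {
    "in_degree": nx.in_degree_centrality(G),
    "out_degree": nx.out_degree_centrality(G),
    "betweenness": nx.betweenness_centrality(G),
}

nodes = list(G.nodes())
X = np.array([[measures[m][n] for m in measures] for n in nodes])

# Standardise each centrality, then project onto the first principal component.
# (The sign of a principal component is arbitrary; flip it if the ranking
# should run the other way.)
pc1 = PCA(n_components=1).fit_transform(StandardScaler().fit_transform(X)).ravel()

ranking = sorted(zip(nodes, pc1), key=lambda t: t[1], reverse=True)
print("top 5 nodes by integrated centrality:", [n for n, _ in ranking[:5]])
```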
20.
The stochastic dynamics of biochemical reaction networks can be modeled using a number of succinct formalisms, all of whose semantics are expressed as Continuous Time Markov Chains (CTMCs). While some kinetic parameters for such models can be measured experimentally, most are estimated either by fitting to experimental data or by performing ad hoc, often manual, search procedures. We consider an alternative strategy to the problem, and introduce algorithms for automatically synthesizing the set of all kinetic parameters such that the model satisfies a given high-level behavioral specification. Our algorithms, which integrate statistical model checking and abstraction refinement, can also report the infeasibility of the model if no such combination of parameters exists. Behavioral specifications can be given in any finitely monitorable logic for stochastic systems, including the probabilistic and bounded fragments of linear and metric temporal logics. The correctness of our algorithms is established using a novel combination of arguments based on survey sampling and uniform continuity. We prove that the probability of a measurable set of paths is uniformly and jointly continuous with respect to the kinetic parameters. Under a suitable technical condition, we also show that the unbiased statistical estimator for the probability of a measurable set of paths is monotonic in the parameter space. We apply our algorithms to two benchmark models of biochemical signaling, and demonstrate that they can efficiently find parameter regimes satisfying a given high-level behavioral specification. In particular, we show that our algorithms can synthesize up to 6 parameters simultaneously, which is more than that reported by any other synthesis algorithm for stochastic systems. Moreover, when parameter estimation is desired, as opposed to synthesis, we show that our approach can scale to even higher-dimensional spaces, by identifying the single parameter combination that maximizes the probability of the behavior being true in an 11-dimensional system.
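As a rough illustration of the synthesis loop, the sketch below sweeps a grid over a single kinetic rate of a one-reaction degradation model, estimates by Gillespie simulation the probability that a bounded property holds, and keeps the rates whose estimate meets a required probability. The model, grid, and thresholds are illustrative placeholders, not the paper's benchmark systems or its abstraction-refinement machinery.

```python
# Grid-based parameter synthesis with Monte Carlo estimation on a toy CTMC.
import random

def simulate_holds(rate, x0=50, t_max=5.0, target_level=10):
    """One Gillespie run of the degradation reaction X -(rate)-> 0.
    Property: X drops below target_level within t_max time units."""
    x, t = x0, 0.0
    while x > 0 and t < t_max:
        propensity = rate * x
        t += random.expovariate(propensity)      # time to next degradation event
        if t >= t_max:
            break
        x -= 1
        if x < target_level:
            return True
    return x < target_level

def estimate(rate, samples=2000):
    return sum(simulate_holds(rate) for _ in range(samples)) / samples

if __name__ == "__main__":
    random.seed(0)
    feasible = []
    for rate in [0.1, 0.2, 0.4, 0.8, 1.6]:       # candidate parameter grid
        p = estimate(rate)
        print(f"rate={rate:4.1f}  P(property) = {p:.3f}")
        if p >= 0.9:                              # required probability bound
            feasible.append(rate)
    print("synthesised parameter values:", feasible)
```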