In incremental software development (ISD), functionality is delivered in increments and requirements keep evolving across iterations. Requirements evolution involves the addition of new dependencies and conflicts among functional and non-functional requirements, along with changes in priorities and dependency weights. This, in turn, demands refactoring the order in which system components are developed so as to minimize the impact of these changes. Neglecting non-functional constraints in the software development process exposes it to risks that may accumulate across several iterations. In this work, we propose a risk management framework for ISD processes that estimates the project's risk exposure when functional features are frozen while their associations with non-functional requirements are ignored. The framework also proposes suitable risk reduction strategies that work in tandem with its risk assessment module, and we provide a tool interface for the framework.
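The abstract does not spell out how exposure is computed; as an illustrative convention (our assumption, following the usual definition of risk exposure, not the authors' stated formula), the frozen feature set could be scored as

\[ RE \;=\; \sum_{i=1}^{n} p_i \, l_i , \]

where the sum runs over the \(n\) neglected associations between functional and non-functional requirements, \(p_i\) is the likelihood that association \(i\) causes a failure in a later iteration, and \(l_i\) is the loss incurred if it does.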
This paper presents an automated and compositional procedure to solve the substitutability problem in the context of evolving software systems. Our solution contributes two
techniques for checking correctness of software upgrades: (1) a technique based on the simultaneous use of over- and under-approximations obtained via existential and universal abstractions; (2) a dynamic assume-guarantee reasoning algorithm in which previously generated component assumptions are reused and altered on the fly to prove or disprove global safety properties of the updated system. When upgrades are found to be non-substitutable, our solution
generates constructive feedback to developers showing how to improve the components. The substitutability approach has been
implemented and validated in the ComFoRT reasoning framework, and we report encouraging results on an industrial benchmark.
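To make the assume-guarantee step concrete, the following is a minimal, runnable sketch, not ComFoRT's implementation: all names are illustrative, components are modeled as finite sets of traces over a shared alphabet (so parallel composition reduces to set intersection), and the classical rule, if \(M_1 \parallel A \models P\) and \(M_2 \models A\) then \(M_1 \parallel M_2 \models P\), becomes two containment checks. Reusing the assumption generated for the old component against its upgrade mirrors the dynamic reuse described above.

    # Toy assume-guarantee check over finite trace sets (illustrative only,
    # not the ComFoRT implementation). With a shared alphabet, parallel
    # composition of trace sets is intersection and "satisfies" is containment.

    def satisfies(component, prop):
        return component <= prop          # every trace of the component is allowed

    def compose(c1, c2):
        return c1 & c2                    # synchronous composition, same alphabet

    def assume_guarantee(m1, m2, assumption, prop):
        """Sound rule: (m1 || A |= P) and (m2 |= A) implies (m1 || m2 |= P)."""
        return satisfies(compose(m1, assumption), prop) and satisfies(m2, assumption)

    P      = {("req", "ack"), ("req",)}   # safety property: the allowed traces
    m1     = {("req", "ack"), ("req", "nak")}
    A      = {("req", "ack"), ("req",)}   # assumption generated for the old system
    m2_old = {("req", "ack")}
    m2_new = {("req", "nak")}             # upgraded component

    assert assume_guarantee(m1, m2_old, A, P)   # old system verified
    print(assume_guarantee(m1, m2_new, A, P))   # False: the assumption fails

When the second premise fails for the upgraded component, as on the last line, the assumption must be weakened or regenerated; that is precisely the point at which the dynamic algorithm alters it on the fly.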
This is an extended version of a paper, Dynamic Component Substitutability Analysis, published in the Proceedings of the Formal Methods 2005 Conference, Lecture Notes in Computer Science, vol. 3582, by the
same authors. This research was sponsored by the National Science Foundation under grant nos. CNS-0411152, CCF-0429120, CCR-0121547,
and CCR-0098072, the Semiconductor Research Corporation under grant no. TJ-1366, the US Army Research Office under grant no.
DAAD19-01-1-0485, the Office of Naval Research under grant no. N00014-01-1-0796, the ICAST project and the Predictable Assembly
from Certifiable Components (PACC) initiative at the Software Engineering Institute, Carnegie Mellon University. The views
and conclusions contained in this document are those of the authors and should not be interpreted as representing the official
policies, either expressed or implied, of any sponsoring institution, the US government or any other entity.
We present a framework, called AIR, for verifying safety properties of assembly language programs via software model checking. AIR extends the applicability of predicate abstraction and counterexample-guided abstraction refinement to the automated verification of low-level software. By working at the assembly level, AIR allows verification of programs for which source code is unavailable, such as legacy and COTS software, as well as programs that use features, such as pointers, structures, and object orientation, that are problematic for source-level software verification tools. In addition, AIR makes no assumptions about the underlying compiler technology. We have implemented a prototype of AIR and present encouraging results on several non-trivial examples.
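Since the abstract names predicate abstraction with counterexample-guided abstraction refinement as the core machinery, here is a toy, self-contained CEGAR loop over an explicit-state system. It only mirrors the abstract-check-refine cycle; it is not AIR's treatment of assembly code, and the system, the initial predicate, and the naive frontier-splitting refinement are all our own illustrative choices.

    # Toy CEGAR loop over a tiny explicit-state system (illustrative only).
    STATES = set(range(8))
    INIT, BAD = {0}, {5}

    def step(s):                        # concrete relation: s -> s + 2 (mod 8),
        return {(s + 2) % 8}            # so only even states are reachable from 0

    def alpha(s, preds):                # abstraction: vector of predicate values
        return tuple(p(s) for p in preds)

    def abstract_cex(preds):
        """BFS over the existential abstraction; abstract path to BAD, or None."""
        cells, trans = {}, {}
        for s in STATES:
            a = alpha(s, preds)
            cells.setdefault(a, set()).add(s)
            for t in step(s):
                trans.setdefault(a, set()).add(alpha(t, preds))
        seen = {alpha(s, preds) for s in INIT}
        queue = [[a] for a in seen]
        while queue:
            path = queue.pop(0)
            if cells[path[-1]] & BAD:
                return path
            for a in trans.get(path[-1], ()):
                if a not in seen:
                    seen.add(a)
                    queue.append(path + [a])
        return None

    def replay(path, preds):
        """Concrete frontiers along an abstract path; stops if one empties."""
        front = [{s for s in INIT if alpha(s, preds) == path[0]}]
        for a in path[1:]:
            nxt = {t for s in front[-1] for t in step(s) if alpha(t, preds) == a}
            if not nxt:
                break
            front.append(nxt)
        return front

    preds = [lambda s: s >= 4]                   # deliberately coarse start
    while True:
        path = abstract_cex(preds)
        if path is None:
            print("verified")
            break
        front = replay(path, preds)
        if len(front) == len(path) and front[-1] & BAD:
            print("real counterexample:", path)
            break
        cell = frozenset(front[-1])              # last informative frontier
        preds.append(lambda s, c=cell: s in c)   # naive refinement: split on it

On this example the bad state 5 is odd and only even states are reachable, so after a few rounds of refinement the loop prints "verified".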
This paper presents a new token-based mutual exclusion algorithm for distributed systems. Earlier token-based mutual exclusion algorithms for distributed environments typically assume an inverted tree topology; in a wireless setup, such a stable, hierarchical topology is unrealistic due to frequent link failures. The proposed token-based algorithm works for processes with assigned priorities on any directed graph topology, with or without cycles. Despite honoring process priorities, the algorithm ensures liveness for token requests from low-priority processes, and it keeps control message traffic reasonably low. Simulation results exhibit the performance of the proposed algorithm under varied contexts and compare it with other recent mutual exclusion algorithms such as FAPP (Fairness Algorithm for Priority Process).
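The abstract's key liveness claim, that low-priority processes are not starved of the token, can be illustrated with a small sketch. This is not the paper's algorithm (whose contribution is routing the token over an arbitrary directed graph); the TokenManager below and its aging rule are hypothetical, and only show one standard way to reconcile priorities with liveness.

    # Illustrative priority-with-aging token granting (names and the aging
    # rule are our assumptions, not the paper's protocol).
    import heapq
    import itertools

    class TokenManager:
        def __init__(self, aging=1):
            self.aging = aging            # priority boost per grant a request waits
            self.clock = itertools.count()
            self.queue = []               # min-heap of (-effective_priority, seq, pid)

        def request(self, pid, priority):
            heapq.heappush(self.queue, (-priority, next(self.clock), pid))

        def grant(self):
            if not self.queue:
                return None
            _, _, pid = heapq.heappop(self.queue)
            # age every waiting request so it eventually outranks newcomers
            self.queue = [(n - self.aging, s, p) for (n, s, p) in self.queue]
            heapq.heapify(self.queue)
            return pid

    mgr = TokenManager()
    mgr.request("low", priority=1)
    for i in range(3):
        mgr.request("high%d" % i, priority=10)
    print([mgr.grant() for _ in range(4)])   # ['high0', 'high1', 'high2', 'low']

Because each grant raises the effective priority of everything still waiting, a low-priority request is eventually served even under a continuous stream of high-priority arrivals.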
Standard laboratory ageing methods for bitumen account only for thermo-oxidation during the service life of a pavement; the effect of high-energy cosmic radiation in the field is not simulated in these procedures. The aim of the present work is to compare laboratory-simulated short-term bitumen ageing (the rolling thin film oven test) with ageing produced by short exposures of bitumen samples to ultraviolet and gamma radiation. The influence of these ageing agents on the thermal properties and rheological performance of pristine and modified bitumen binders is evaluated in this study. The thermal behavior of the various aged bitumens is characterized by both isothermal and non-isothermal thermogravimetric analysis. These thermoanalytical investigations yield the thermal stabilities and activation energies of the binders, and the kinetic information is used to predict the lifetime of the materials. Modified bituminous binders are found to be more resistant to heat and radiation. Rheological tests with a dynamic shear rheometer examine the effects of ageing in terms of bitumen oxidation and polymer phase degradation, which have major consequences for high-temperature rutting and low-temperature cracking; the type of modifier proves to be of decisive importance. Creep and recovery tests show that the structure-time dependency of pristine aged bitumen is influenced more by stress and temperature than that of modified aged bitumens. The study reveals that an elastomeric modifier protects the bituminous binder better than a plastic modifier or a nano filler. Finally, a fair correlation is established between standard RTFO ageing and radiation ageing.
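The abstract does not reproduce the kinetic model behind the lifetime prediction; the usual basis for such estimates (our gloss, not the authors' stated procedure) is the Arrhenius rate law

\[ k(T) = A \exp\!\left(-\frac{E_a}{RT}\right), \]

under which the time to reach a fixed level of degradation scales as \(t_f \propto \exp(E_a/RT)\), so a binder with a higher activation energy \(E_a\) degrades more slowly at a given service temperature \(T\).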
The thermal analysis of chemical vapor transport (CVT)-grown \(\hbox {CuInSe}_{2}\) single crystals was carried out by recording thermogravimetric (TG), differential thermogravimetric (DTG) and differential thermal analysis (DTA) curves. All three thermo-curves were recorded simultaneously by a thermal analyzer, from ambient temperature to 1080 K, in an inert nitrogen atmosphere, at four heating rates: 5 K \(\cdot \,\hbox {min}^{-1}\), 10 K \(\cdot \,\hbox {min}^{-1}\), 15 K \(\cdot \,\hbox {min}^{-1}\) and 20 K \(\cdot \,\hbox {min}^{-1}\). The TG curves showed negligible mass loss from ambient temperature to 600 K, indicating that the sample material is thermally stable in this range; from 601 K up to 1080 K the sample showed continuous mass loss. The DTG curves showed two peaks in the range 601 K to 1080 K. The corresponding DTA showed an initial minor exothermic response, followed by an endothermic response up to nearly 750 K, and an exothermic response above that. The initial exothermic response is attributed to absorbed water converting to water vapor, while the endothermic response indicates absorption of heat by the sample up to nearly 950 K; above roughly 950 K the exothermic response is due to decomposition of the sample material. The absorption of heat in the endothermic region is substantiated by the corresponding weight loss in TG. The thermal kinetic parameters of the CVT-grown \(\hbox {CuInSe}_{2}\) single crystals were determined using the non-mechanistic Kissinger relation, and the resulting kinetic parameters support the observations from the thermo-curves.
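For reference, the non-mechanistic Kissinger relation mentioned here extracts the kinetic parameters from the shift of the peak temperature \(T_p\) with heating rate \(\beta\):

\[ \ln\!\left(\frac{\beta}{T_{p}^{2}}\right) = \ln\!\left(\frac{AR}{E_{a}}\right) - \frac{E_{a}}{R\,T_{p}}, \]

so plotting \(\ln(\beta/T_p^{2})\) against \(1/T_p\) for the four heating rates yields a straight line of slope \(-E_a/R\), from which the activation energy \(E_a\) and the pre-exponential factor \(A\) follow (\(R\) is the gas constant).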
We present a framework for model checking concurrent software systems which incorporates both states and events. In contrast to other state/event approaches, ours also integrates two powerful verification techniques: counterexample-guided abstraction
refinement and compositional reasoning. Our specification language is a state/event extension of linear temporal logic, and
allows us to express many properties of software in a concise and intuitive manner. We show how standard automata-theoretic
LTL model checking algorithms can be ported to our framework at no extra cost, enabling us to directly benefit from the large
body of research on efficient LTL verification.
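For illustration (this example is ours, not taken from the paper's case studies), a single state/event formula can mention an action and a state predicate together. With \(\mathit{send}\) an event and \(\mathit{empty}\) a proposition over states,

\[ \mathbf{G}\,\big(\mathit{send} \;\Rightarrow\; \mathbf{F}\,\mathit{empty}\big) \]

states that every send action is eventually followed by a state in which the buffer is empty; a purely state-based encoding would need extra variables to record the occurrence of \(\mathit{send}\), while a purely event-based one would need extra events to expose the buffer contents.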
We also present an algorithm to detect deadlocks in concurrent message-passing programs. Deadlock-freedom is not only an
important and desirable property in its own right, but is also a prerequisite for the soundness of our model checking algorithm.
Even though deadlock is inherently non-compositional and is not preserved by classical abstractions, our iterative algorithm
employs both (non-standard) abstractions and compositional reasoning to alleviate the state-space explosion problem. The resulting
framework differs in key respects from other instances of the counterexample-guided abstraction refinement paradigm found
in the literature.
We have implemented this work in the MAGIC verification tool for concurrent C programs and performed tests on a broad set
of benchmarks. Our experiments show that this new approach not only eases the writing of specifications, but also yields important
gains both in space and in time during verification. In certain cases, we even encountered specifications that could not be
verified using traditional pure event-based or state-based approaches, but became tractable within our state/event framework.
We also recorded substantial reductions in time and memory consumption when performing deadlock-freedom checks with our new
abstractions. Finally, we report two bugs (including a deadlock) in the source code of Micro-C/OS versions 2.0 and 2.7, which
we discovered during our experiments.
This research was sponsored by the National Science Foundation (NSF) under grants no. CCR-9803774 and CCR-0121547, the Office
of Naval Research (ONR) and the Naval Research Laboratory (NRL) under contract no. N00014-01-1-0796, the Army Research Office
(ARO) under contract no. DAAD19-01-1-0485, and was conducted as part of the Predictable Assembly from Certifiable Components
(PACC) project at the Software Engineering Institute (SEI).
This article combines and builds upon the papers (CCO+04) and (CCOS04).
Received December 2004
Revised July 2005
Accepted July 2005 by Eerke A. Boiten, John Derrick, Graeme Smith and Ian Hayes
This paper proposes a new method for image binarization that uses an iterative partitioning approach. The proposed method has been tested on the binarization of both document and graphic images. Quantitative comparisons with other standard methods reveal that the proposed approach outperforms existing, widely used binarization techniques in terms of binarization accuracy, and the experimental results further establish its superiority, especially for degraded documents and graphic images. The proposed algorithm is well suited to a multi-core processing environment, since after the initial partitioning it can be split into multiple parallel units of execution.
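As a sketch of the iterative-partitioning idea (the quadrant split, the contrast test, and the per-block mean threshold below are our assumptions, not the paper's exact rules):

    # Schematic iterative-partitioning binarization (illustrative only).
    import numpy as np

    def binarize(img, min_size=32, contrast=30):
        """Recursively quarter the image; threshold a block locally once its
        dynamic range is small or it reaches min_size, else keep splitting."""
        out = np.zeros(img.shape, dtype=np.uint8)

        def visit(y0, y1, x0, x1):
            block = img[y0:y1, x0:x1]
            small = (y1 - y0 <= min_size) or (x1 - x0 <= min_size)
            if small or int(block.max()) - int(block.min()) < contrast:
                t = block.mean()                     # local threshold for this block
                out[y0:y1, x0:x1] = np.where(block > t, 255, 0)
                return
            ym, xm = (y0 + y1) // 2, (x0 + x1) // 2  # split into four quadrants;
            visit(y0, ym, x0, xm); visit(y0, ym, xm, x1)   # quadrants are independent,
            visit(ym, y1, x0, xm); visit(ym, y1, xm, x1)   # hence easy to parallelize

        visit(0, img.shape[0], 0, img.shape[1])
        return out

    page = (np.random.rand(256, 256) * 255).astype(np.uint8)
    result = binarize(page)
    print(result.shape, result.dtype)

Because each quadrant is independent once split, the recursive calls can be dispatched to separate cores, which is the parallelism the abstract points out.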