Similar Articles
16 similar articles found (search time: 15 ms)
1.
Error flow analysis and testing techniques focus on the introduction of errors through code faults into data states of an executing program, and their subsequent cancellation or propagation to output. The goals and limitations of several error flow techniques are discussed, including mutation analysis, fault-based testing, PIE analysis, and dynamic impact analysis. The attributes desired of a good error flow technique are proposed, and a model called dynamic error flow analysis (DEFA) is described that embodies many of these attributes. A testing strategy is proposed that uses DEFA information to select an optimal set of test paths and to quantify the results of successful testing. An experiment is presented that illustrates this testing strategy. In this experiment, the proposed testing strategy outperforms mutation testing in catching arbitrary data state errors.
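As a rough illustration of the mutation-testing baseline this abstract compares against (not the paper's own DEFA method), a test suite "kills" a mutant when some test input distinguishes the mutant's output from the original's. The function, mutants, and inputs below are invented for the sketch:

```python
# Minimal mutation-testing sketch: mutate an operator in a function and
# check whether the test inputs detect (kill) each mutant.
def original(a, b):
    return a + b

# Hypothetical mutants produced by simple operator replacement.
mutants = [
    lambda a, b: a - b,   # '+' -> '-'
    lambda a, b: a * b,   # '+' -> '*'
    lambda a, b: a + b,   # equivalent mutant (no behavioural change)
]

test_inputs = [(1, 2), (0, 5), (-3, 3)]

def is_killed(mutant):
    """A mutant is killed if any test input exposes a differing output."""
    return any(mutant(a, b) != original(a, b) for a, b in test_inputs)

kill_count = sum(is_killed(m) for m in mutants)
mutation_score = kill_count / len(mutants)   # 2 of 3 mutants killed
```

The equivalent mutant in the list shows the classic weakness of mutation testing that error-flow techniques try to sidestep: no test can ever kill it.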

2.
An Aspect-Oriented Programming-based approach to the development of software components for fault detection in automatic measurement systems is proposed. Faults are handled by means of specific software units, the “aspects”, in order to better modularize issues transversal to several components. As a case study, this approach was applied to the design of the fault detection software inside a flexible framework for magnetic measurements, developed at the European Organization for Nuclear Research (CERN). Experimental results of software modularity and performance measurements, comparing aspect- and object-oriented solutions in rotating-coil tests on superconducting magnets, are reported.
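The aspect-based separation the abstract describes can be loosely mimicked in Python with a decorator that weaves fault-detection logic around measurement functions. This is only an analogy sketch: the function name, fault condition, and limit are invented, and the paper's actual framework is not shown here:

```python
import functools

# Sketch of aspect-like modularization: the fault-detection concern lives
# in one decorator ("aspect") instead of being scattered across components.
def fault_detection(limit):
    def aspect(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            value = func(*args, **kwargs)
            if abs(value) > limit:            # hypothetical fault condition
                raise ValueError(f"fault in {func.__name__}: {value}")
            return value
        return wrapper
    return aspect

@fault_detection(limit=10.0)
def read_coil_voltage():
    return 3.2   # stand-in for a real sensor reading
```

Swapping or tuning the fault policy then touches only the aspect, which is the modularity benefit the abstract measures.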

3.
The problem of estimating the weight coefficients of attributes used in intelligent decision-support systems is considered. A procedure is proposed to estimate the relative significance of attributes based on special relative attribute-importance measures. The properties of the procedure are proved and illustrative examples are given. The study was sponsored by the Russian Foundation for Basic Research (projects No. 07-01-00452 and No. 09-01-99014-r_ofi). Translated from Kibernetika i Sistemnyi Analiz, No. 3, pp. 127–135, May–June 2009.

4.
An approach is proposed to the solution of formalized problems of assessing the activity that produces and maintains software systems (SSs). Such assessment is realized through expert evaluations (“expertises”) that form a new assessment process, adequate to the needs and specifics of the activity, with an environment common to the expertises. The following mathematical apparatus is elaborated for expertises: a framework (target functions and executing mechanisms); a model and methods of the assessment process (formalisms for improving the quality and reusing the results of expertises); and tools for integrating the apparatus into software development management processes. The approach is theoretically justified, and prospects for its further development are described. Translated from Kibernetika i Sistemnyi Analiz, No. 4, pp. 151–168, July–August 2009.

5.
In this paper, we present an effective approach for grouping text lines in online handwritten Japanese documents by combining temporal and spatial information. With decision functions optimized by supervised learning, the approach has few artificial parameters and utilizes little prior knowledge. First, the strokes in the document are grouped into text line strings according to off-stroke distances. Each text line string, which may contain multiple lines, is segmented by optimizing a cost function trained by the minimum classification error (MCE) method. At the temporal merge stage, over-segmented text lines (caused by stroke classification errors) are merged with a support vector machine (SVM) classifier for making merge/non-merge decisions. Last, a spatial merge module corrects the segmentation errors caused by delayed strokes. Misclassified text/non-text strokes (stroke type classification precedes text line grouping) can be corrected at the temporal merge stage. To evaluate the performance of text line grouping, we provide a set of performance metrics for evaluation from multiple aspects. In experiments on a large number of free-form documents in the Tokyo University of Agriculture and Technology (TUAT) Kondate database, the proposed approach achieves an entity detection metric (EDM) rate of 0.8992 and an edit-distance rate (EDR) of 0.1114. For the grouping of pure text strokes, the performance reaches an EDM of 0.9591 and an EDR of 0.0669.
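The first stage the abstract names, grouping strokes into text-line strings by off-stroke distance, can be sketched with a simple threshold rule. The coordinates and the threshold below are invented; the paper trains its decision functions rather than fixing a threshold by hand:

```python
import math

# Sketch: group consecutive strokes whenever the off-stroke gap (end of one
# stroke to start of the next) is below a threshold. Data is invented.
def off_stroke_distance(prev_stroke, next_stroke):
    (x1, y1), (x2, y2) = prev_stroke[-1], next_stroke[0]
    return math.hypot(x2 - x1, y2 - y1)

def group_strokes(strokes, threshold=50.0):
    groups = [[strokes[0]]]
    for prev, cur in zip(strokes, strokes[1:]):
        if off_stroke_distance(prev, cur) <= threshold:
            groups[-1].append(cur)
        else:
            groups.append([cur])
    return groups

strokes = [[(0, 0), (10, 0)], [(15, 0), (25, 0)], [(200, 50), (210, 50)]]
line_strings = group_strokes(strokes)   # the large gap starts a new group
```

Over-segmentation from such a crude rule is exactly what the paper's later SVM-based temporal merge stage would repair.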

6.
ContextThe way global software development (GSD) activities are managed impacts knowledge transactions between team members. The first is captured in governance decisions, and the latter in a transactive memory system (TMS), a shared cognitive system for encoding, storing and retrieving knowledge between members of a group.ObjectiveWe seek to identify how different governance decisions (such as business strategy, team configuration, task allocation) affect the structure of transactive memory systems as well as the processes developed within those systems.MethodWe use both a quantitative and a qualitative approach. We collect quantitative data through an online survey to identify transactive memory systems. We analyze transactive memory structures using social network analysis techniques and we build a latent variable model to measure transactive memory processes. We further support and triangulate our results by means of interviews, which also help us examine the GSD governance modes of the participating projects. We analyze governance modes, as set of decisions based on three aspects; business strategy, team structure and composition, and task allocation.ResultsOur results suggest that different governance decisions have a different impact on transactive memory systems. Offshore insourcing as a business strategy, for instance, creates tightly-connected clusters, which in turn leads to better developed transactive memory processes. We also find that within the composition and structure of GSD teams, there are boundary spanners (formal or informal) who have a better overview of the network’s activities and become central members within their network. An interesting mapping between task allocation and the composition of the network core suggests that the way tasks are allocated among distributed teams is an indicator of where expertise resides.ConclusionWe present an analytical method to examine GSD governance decisions and their effect on transactive memory systems. 
Our method can be used from both practitioners and researchers as a “cause and effect” tool for improving collaboration of global software teams.  相似文献   

7.
An evolutionary approach to deception in multi-agent systems
Understanding issues of trust and deception is key to designing robust, reliable multi-agent systems. This paper builds on previous work which examined the use of auctions as a model for exploring the concept of deception in such systems. We have previously described two forms of deceptive behaviour which can occur in a simulated repeated English auction. The first of these types of deception involves sniping, or late bidding, which not only allows an agent to conceal its true valuation for an item, but also potentially allows it to win an item for which it may not possess the highest valuation. The second deceptive strategy involves the placing of false bids which are designed to reduce an opponent’s potential profit. In this work we examine the potential shortcomings of those two strategies and investigate whether or not their individual strengths can be combined to produce a successful hybrid deceptive strategy.
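The two deceptive behaviours the abstract names can be sketched as toy bidding rules in a repeated English auction. The timing window, valuations, and increment are invented; the paper's agents are evolved rather than hand-coded:

```python
# Sketch of the two deceptive bidding behaviours: sniping (bid only at the
# last moment, concealing the true valuation) and false bidding (push the
# price up to shave a rival's profit). All parameters are invented.
def snipe_bid(current_price, valuation, time_left, snipe_window=2):
    """Sniping: stay silent until the closing moments of the auction."""
    if time_left <= snipe_window and current_price < valuation:
        return current_price + 1
    return None   # reveal nothing earlier in the auction

def false_bid(current_price, rival_valuation_estimate):
    """False bidding: raise the price, stopping safely below the
    estimated rival valuation so the deceiver never wins by accident."""
    if current_price < rival_valuation_estimate - 1:
        return current_price + 1
    return None
```

A hybrid strategy of the kind the paper investigates would switch between these two rules depending on the auction phase.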

8.
Recent breakthroughs in computing technology have created a set of perplexing new problems for information systems (IS) professionals. These revolve around decisions to be made about replacing current systems with newer technology, upgrading existing systems, and migrating to other platforms or environments. Many decision makers must rely on subjective assessments, such as their instincts or the recommendation of vendors rather than on an objective analysis of their information needs and how they can be met by various system alternatives. A model to quantify these issues, providing an objective measure for comparing system alternatives, including migration, would be valuable. Such a model is demonstrated here; it uses the Shannon-Weaver entropy model in conjunction with quality measures to quantify actual and potential system effectiveness.
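The Shannon-Weaver entropy measure at the core of the abstract's model is the standard H = -Σ p·log₂p. The sketch below computes it for two invented usage profiles; how the paper combines entropy with its quality measures is not reproduced here:

```python
import math

def shannon_entropy(probs):
    """H = -sum(p * log2(p)), the Shannon-Weaver information measure."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical probability profiles over states of two system alternatives:
# the uniform profile carries maximum uncertainty (entropy).
legacy_profile = [0.25, 0.25, 0.25, 0.25]
upgraded_profile = [0.7, 0.1, 0.1, 0.1]

h_legacy = shannon_entropy(legacy_profile)      # exactly 2.0 bits
h_upgraded = shannon_entropy(upgraded_profile)  # lower: more predictable
```

Comparing such scalar measures across alternatives is what lets the model rank replace/upgrade/migrate options objectively.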

9.
Esterel is a formally-defined language designed for programming reactive systems; namely, those that maintain a permanent interaction with their environment. The AT&T 5ESS® telephone switching system is an example of a reactive system. We describe an implementation in Esterel of one feature of a 5ESS switch; this implementation has been tested in the 5ESS switch simulator. Furthermore, it has been formally verified that this implementation satisfies some safety properties stated by 5ESS software developers. Our experience indicates that Esterel is suitable for programming industrial-strength reactive systems, and affords significant advantages in software development over more traditional programming languages used in industrial settings. An earlier version of this paper appeared in the Proceedings of the Workshop on Industrial-Strength Formal Specification Techniques, Boca Raton, Florida, 1995. The author is currently supported by a Fulbright fellowship from Spain's Ministry of Science and Education. The work described here was performed while the author was visiting AT&T Bell Laboratories.

10.
A large proportion of the requirements on embedded real-time systems stems from the extra-functional dimensions of time and space determinism, dependability, safety and security, and it is addressed at the software level. The adoption of a sound software architecture provides crucial aid in conveniently apportioning the relevant development concerns. This paper takes a software-centered interpretation of the ISO 42010 notion of architecture, enhancing it with a component model that attributes separate concerns to distinct design views. The component boundary becomes the border between functional and extra-functional concerns. The latter are treated as decorations placed on the outside of components, satisfied by implementation artifacts separate from and composable with the implementation of the component internals. The approach was evaluated by industrial users from several domains, with remarkably positive results.

11.
Tracking land cover changes using remotely-sensed data contributes to evaluating to what extent human activities impact the environment. Recent studies have pointed out some limitations of single-date comparisons between years and have emphasized the usefulness of time series. However, less effort has hitherto been dedicated to properly accounting for the temporal dependences typifying the successive images of a time series. An automated change detection method based on a per-object approach and on a probabilistic procedure is proposed here to better cope with this issue. This innovative procedure is applied to a tropical forest environment using high temporal resolution SPOT-VEGETATION time series from 2001 and 2004 in the Brazilian state of Rondônia. The principle of the method is to identify the objects that most deviate from an unchanged reference defined by objective rules. A probabilistic changed-unchanged threshold provides a change map where each object is associated with a likelihood of having changed. This improvement on a binary diagnostic makes the method relevant to meet the requirements of different users, ranging from a comprehensive detection of changes to a detection of the most dramatic changes. According to the threshold value, overall accuracy indices of up to 91% were obtained, with errors involving change omissions for the most part. The isolation of changes within objects was made possible through a segmentation procedure implemented in a temporal context. In addition, the method was formulated so as to differentiate between inter- and intra-annual vegetation dynamics. These technical peculiarities will likely make this analytical framework suitable for detecting changes in environments subject to a strongly marked phenology.
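The tunable changed-unchanged threshold the abstract describes might work like the following sketch, where each object's deviation from the unchanged reference is mapped to a change likelihood and then thresholded. The deviation scores and the logistic mapping are invented and are not the paper's actual probabilistic procedure:

```python
import math

# Sketch: per-object change likelihood from a deviation score, then two
# thresholds illustrating strict vs. comprehensive detection. Invented data.
def change_likelihood(deviation, midpoint=2.0, steepness=1.5):
    return 1.0 / (1.0 + math.exp(-steepness * (deviation - midpoint)))

object_deviations = {"obj1": 0.4, "obj2": 2.1, "obj3": 4.8}
likelihoods = {k: change_likelihood(d) for k, d in object_deviations.items()}

# A strict threshold keeps only the most dramatic changes; a loose one
# detects changes more comprehensively, as the abstract describes.
changed_strict = [k for k, p in likelihoods.items() if p > 0.9]
changed_loose = [k for k, p in likelihoods.items() if p > 0.5]
```

Publishing the likelihood map rather than a binary mask is what lets different users pick their own point on this strict-to-loose spectrum.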

12.
13.
When building a large and complex system, such as a satellite, all sorts of risks have to be managed if it is to be successful. For risks in the design of an artifact, various reliability analysis techniques such as FTA or FMEA have been employed in the engineering domain. However, risks exist in the development process as well, and they can result in a failure of the system. In this paper, we present an approach to discovering risks in the development process by collecting and organizing, at low cost, information produced during development. We describe a prototype system called IDIMS, and show how it can be used to discover risks from e-mail communications between developers. The motivation of our work is to overcome the capture bottleneck problem and to utilize currently wasted information to improve the development process.
Yoshikiyo Kato: He received his B.Eng. (1998) and M.Eng. (2000) degrees in aeronautics and astronautics from the University of Tokyo. From September 1998 to July 1999, he was an exchange student at the Department of Computer Science and Engineering of the University of California, San Diego, where he worked on software engineering tools. From May 2001 to July 2002, he was a research assistant at the National Institute of Informatics (Japan). He is currently a Ph.D. student at the Department of Advanced Interdisciplinary Studies of the University of Tokyo. His research interests include knowledge management, CSCW, HCI, and software engineering. He is a member of AAAI and JSAI.
Takahiro Shirakawa: He received his B.Eng. (2000) and M.Eng. (2002) degrees in aeronautics and astronautics from the University of Tokyo. He is currently an assistant examiner at the Japan Patent Office.
Kohei Taketa: He received his B.Eng. (2000) and M.Eng. (2002) degrees in aeronautics and astronautics from the University of Tokyo. He is currently a software engineer at NTT Data Corp.
Koichi Hori, Dr.Eng.: He received B.Eng., M.Eng., and Dr.Eng. degrees in electronic engineering from the University of Tokyo, Japan, in 1979, 1981, and 1984, respectively. In 1984, he joined the National Institute of Japanese Literature, where he developed AI systems for literature studies. Since 1988, he has been with the University of Tokyo, where he is currently a professor with the Department of Advanced Interdisciplinary Studies. From September 1989 to January 1990, he also held a visiting position at the University of Compiègne, France. His current research interests include AI technology for supporting human creative activities, cognitive engineering, and intelligent CAD systems. He is a member of IEEE, ACM, IEICE, IPSJ, JSAI, JSSST, and JCSS.

14.
The problem of evaluating the state of systems that are open with respect to input and output, for the purpose of creating a system for evaluating states when testing complex objects, is considered. Solving the problem will make it possible to eliminate ambiguity in cases where the output parameters are insufficient and where uncontrolled parameters exert an influence. It will also reduce the problem to that of successive evaluation of the states of output-open systems and, subsequently, of the states of input-open systems. The search for a solution is conducted by means of an analysis of the set of states of a finite-automaton model in the set of output variables. An example illustrating practical implementation of the proposed approach is presented.
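The general idea of narrowing down the state of a finite-automaton model from observed outputs can be sketched by pruning candidate states against an output sequence. The transition and output tables below are invented and only illustrate the flavour of the analysis, not the paper's procedure:

```python
# Sketch: eliminate ambiguity about the current state of a finite-automaton
# model by keeping only states consistent with observed outputs. The Moore
# machine below (transitions and per-state outputs) is invented.
transitions = {("s0", "a"): "s1", ("s1", "a"): "s2", ("s2", "a"): "s0"}
outputs = {"s0": 0, "s1": 1, "s2": 1}

def consistent_states(observed_outputs, inputs):
    candidates = set(outputs)                      # any state is possible
    for out, inp in zip(observed_outputs, inputs):
        candidates = {s for s in candidates if outputs[s] == out}
        candidates = {transitions[(s, inp)] for s in candidates}
    return candidates

# Two steps of observation suffice to resolve the ambiguity here.
state_now = consistent_states([1, 1], ["a", "a"])   # a single state remains
```

When the output parameters alone cannot distinguish states (as in the first step above), successive observations shrink the candidate set, which mirrors the successive-evaluation idea in the abstract.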

15.
Open multi-agent systems (MAS) are decentralised and distributed systems that consist of a large number of loosely coupled autonomous agents. In the absence of centralised control they tend to be difficult to manage, especially in an open environment, which is dynamic, complex, distributed and unpredictable. This dynamism and uncertainty in an open environment give rise to unexpected plan failures. In this paper we present an abstract knowledge-based approach for the diagnosis and recovery of plan action failures. Our approach associates a sentinel agent with each problem-solving agent in order to monitor the problem-solving agent’s interactions. The proposed approach also requires the problem-solving agents to be able to report on the status of a plan’s actions. Once an exception is detected, the sentinel agents start an investigation of the suspected agents. The sentinel agents collect information about the status of failed plan abstract actions and knowledge about agents’ mental attitudes regarding any failed plan. The sentinel agent then uses this abstract knowledge and the agents’ mental attitudes to diagnose the underlying cause of the plan failure. The sentinel agent may ask the problem-solving agent to retry its failed plan based on the diagnostic result.

16.
Cellular manufacturing systems (CMS) are used to improve production flexibility and efficiency. They involve the identification of part families and machine cells so that intercellular movement is minimized and the utilization of the machines within a cell is maximized. Previous research has focused mainly on cell formation problems and their variants; however, only a few articles have focused on more practical and complicated problems that simultaneously consider the three critical issues in the CMS-design process, i.e., cell formation, cell layout, and intracellular machine sequence. In this study, a two-stage mathematical programming model is formulated to integrate the three critical issues with the consideration of alternative process routings, operation sequences, and production volume. Next, because of the combinatorial nature of the above model, an efficient tabu search algorithm based on a generalized similarity coefficient is proposed. Computational results from test problems show that our proposed model and solution approach are both effective and efficient. When compared to the mathematical programming approach, which takes more than 112 h (LINGO) and 1139 s (CPLEX) to solve a set of ten test instances, the proposed algorithm can produce optimal solutions for the same set of test instances in less than 12 s.
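A similarity coefficient of the general kind such tabu searches build on can be sketched as a Jaccard-style measure over the parts two machines process. The incidence data below is invented, and the paper's generalized coefficient also accounts for alternative routings, operation sequences, and production volume, which this sketch omits:

```python
# Sketch: Jaccard-style machine similarity from a machine-part incidence
# matrix. Machines that process many common parts are candidates for the
# same cell. The incidence data below is invented.
incidence = {
    "M1": {"P1", "P2", "P3"},
    "M2": {"P2", "P3", "P4"},
    "M3": {"P5"},
}

def similarity(m1, m2):
    """Shared parts divided by all parts either machine processes."""
    common = incidence[m1] & incidence[m2]
    union = incidence[m1] | incidence[m2]
    return len(common) / len(union)

s12 = similarity("M1", "M2")   # 2 shared parts of 4 total -> 0.5
s13 = similarity("M1", "M3")   # no shared parts -> 0.0
```

A cell-formation heuristic would then group M1 with M2 and leave M3 in a separate cell, minimizing intercellular movement as the abstract describes.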
