Similar Documents
20 similar documents found (search time: 15 ms)
1.
The last decade has seen an increasing focus on addressing security during the earliest stages of system development, such as requirements determination. Attack trees and misuse cases are established techniques for representing security threats along with their potential mitigations. Previous work has compared attack trees and misuse cases in two experiments with students. The present paper instead presents an experiment in which industrial practitioners perform the experimental tasks in their workplace. The industrial experiment confirms a central finding from the student experiments: attack trees tend to help identify more threats than misuse cases. It also presents a new result: misuse cases tend to encourage the identification of threats associated with earlier development stages than attack trees do. The two techniques should therefore be considered complementary and used together in practical requirements work.

2.
Progress in microelectronic technology is extremely fast, and it is outstripping designers' ability to make use of the opportunities it creates. The development and application of new, more suitable design methods and tools is therefore very important for the modern system industry. This paper shows the importance of AI search techniques for circuit and system design space exploration, explains what sorts of search techniques are useful for this aim, and discusses the place, role and manner of use of these techniques in circuit and system design. In particular, the paper explains the importance and usage of heuristic search techniques for the automatic construction and selection of the most promising solutions to circuit synthesis problems. The discussion and conclusions of the paper are illustrated with examples of three effective and efficient search algorithms, and with experimental results from their application to two important circuit synthesis problems. The knowledge presented in the paper combines numerous valuable concepts of modern system engineering and artificial intelligence, and forms a basis for further research on, and application of, AI search techniques in the design of complex circuits and systems.

3.
Secure software development should begin at the early stages of the development life cycle. Misuse case modeling is a technique that stems from traditional use case modeling and facilitates the elicitation and modeling of functional security requirements at the requirements phase. Misuse case modeling is an effective vehicle for identifying a large subset of these threats. It is therefore crucial to develop high-quality misuse case models; otherwise the resulting system will be vulnerable to security threats. Templates to describe misuse cases are populated with syntax-free natural language content. The inherent ambiguity of syntax-free natural language, coupled with the crucial role of misuse case models in development, can have a very detrimental effect. This paper proposes a structure that guides misuse case authors towards developing consistent misuse case models. It also presents a process that utilizes this structure to ensure the consistency of misuse case models as they evolve, eliminating potential damage caused by inconsistencies. A tool was developed to provide automation support for the proposed structure and process. The feasibility and application of this approach were demonstrated using two real-world case studies.

4.
Careful simulation-based evaluation plays an important role in the design of file and disk systems. We describe here a particular approach to such evaluations that combines techniques in workload synthesis, file system modeling, and detailed disk behavior modeling. Together, these make feasible the detailed simulation of I/O hardware and file system software. In particular, using the techniques described here is likely to make comparative file system studies more accurate. In addition to these specific contributions, the paper makes two broader points. First, it argues that detailed models are appropriate and necessary in many cases. Second, it demonstrates that detailed models need not be difficult or time consuming to construct or execute.

5.
6.
We address the pose mismatch problem which can occur in face verification systems that have only a single (frontal) face image available for training. In the framework of a Bayesian classifier based on mixtures of Gaussians, the problem is tackled by extending each frontal face model with artificially synthesized models for non-frontal views. The synthesis methods are based on several implementations of maximum likelihood linear regression (MLLR), as well as standard multivariate linear regression (LinReg). All synthesis techniques rely on prior information and learn how face models for the frontal view are related to face models for non-frontal views. The synthesis and extension approach is evaluated by applying it to two face verification systems: a holistic system (based on PCA-derived features) and a local feature system (based on DCT-derived features). Experiments on the FERET database suggest that for the holistic system, the LinReg-based technique is better suited than the MLLR-based techniques; for the local feature system, the results show that synthesis via a new MLLR implementation obtains better performance than synthesis based on traditional MLLR. The results further suggest that extending frontal models considerably reduces errors. It is also shown that the local feature system is less affected by view changes than the holistic system; this can be attributed to the parts-based representation of the face and, due to the classifier based on mixtures of Gaussians, the lack of constraints on spatial relations between the face parts, allowing for deformations and movements of face areas.
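The LinReg-based synthesis can be sketched in a few lines: from prior subjects for whom both frontal and non-frontal model parameters are available, fit a least-squares mapping and apply it to a new subject's frontal model. This is a minimal illustration with made-up dimensions and synthetic data, not the paper's actual feature pipeline.

```python
import numpy as np

def fit_linreg_synthesis(frontal, nonfrontal):
    """Least-squares mapping (with a bias term) from frontal to
    non-frontal model parameters, learned from prior subjects."""
    X = np.hstack([frontal, np.ones((frontal.shape[0], 1))])  # append bias column
    W, *_ = np.linalg.lstsq(X, nonfrontal, rcond=None)
    return W

def synthesize(W, frontal_vec):
    """Predict a non-frontal model for a subject seen only frontally."""
    return np.append(frontal_vec, 1.0) @ W

rng = np.random.default_rng(0)
frontal = rng.normal(size=(50, 8))           # 50 prior subjects, 8-dim models
true_W = rng.normal(size=(8, 8))
nonfrontal = frontal @ true_W + 0.5          # synthetic "side-view" parameters
W = fit_linreg_synthesis(frontal, nonfrontal)
pred = synthesize(W, frontal[0])
print(np.allclose(pred, nonfrontal[0], atol=1e-6))
```

Because the synthetic data are exactly linear, the recovered mapping reproduces the non-frontal parameters; with real face models the fit is of course only approximate.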

7.
Applications of linguistic techniques for use case analysis   (Cited by: 2; self-citations: 2; external citations: 0)
Use cases are effective techniques to express the functional requirements of a system in a very simple and easy-to-learn way. Use cases are mainly composed of natural language (NL) sentences, and the use of NL to describe the behaviour of a system is always a critical point, due to the inherent ambiguities originating from the different possible interpretations of NL sentences. We discuss in this paper the application of analysis techniques based on a linguistic approach to detect, within requirements documents, defects related to such an inherent ambiguity. Starting from the proposed analysis techniques, we will define some metrics that will be used to perform a quality evaluation of requirements documents. Some available automatic tools supporting the linguistic analysis of NL requirements have been used to evaluate an industrial use cases document according to the defined metrics. A discussion on the application of linguistic analysis techniques to support the semantic analysis of use cases is also reported.
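A linguistic ambiguity metric of this kind can be sketched very simply: scan requirement sentences for a lexicon of vagueness indicators and report the fraction flagged. The indicator list and the metric below are illustrative placeholders, not the paper's actual lexicon or defect model.

```python
import re

# Hypothetical lexicon of ambiguity indicators (illustrative only).
VAGUE_TERMS = {"may", "might", "could", "appropriate", "adequate",
               "as needed", "user-friendly", "fast", "easy"}

def ambiguity_rate(requirements):
    """Fraction of requirement sentences containing at least one vague term."""
    flagged = 0
    for sentence in requirements:
        lower = sentence.lower()
        words = set(re.findall(r"[a-z-]+", lower))
        multiword_hit = any(t in lower for t in VAGUE_TERMS if " " in t)
        if words & VAGUE_TERMS or multiword_hit:
            flagged += 1
    return flagged / len(requirements)

reqs = [
    "The system shall log every failed login attempt.",
    "The interface should be user-friendly and fast.",
    "Reports may be exported in an appropriate format.",
]
print(ambiguity_rate(reqs))  # 2 of 3 sentences flagged
```

A real tool would distinguish defect classes (vagueness, optionality, weak verbs) rather than pooling them into one rate, but the counting scheme is the same.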

8.
This paper presents an introduction to and a formal connection between synthesis problems for discrete event systems that have been considered, largely separately, in the two research communities of supervisory control in control engineering and reactive synthesis in computer science. By making this connection mathematically precise in a paper that attempts to be as self-contained as possible, we wish to introduce these two research areas to non-expert readers and at the same time to highlight how they can be bridged in the context of classical synthesis problems. After presenting general introductions to supervisory control theory and reactive synthesis, we provide a novel reduction of the basic supervisory control problem, non-blocking case, to a problem of reactive synthesis with plants and with a maximal permissiveness requirement. The reduction is for fully-observed systems that are controlled by a single supervisor/controller. It complements prior work that has explored problems at the interface of supervisory control and reactive synthesis. The formal bridge constructed in this paper should be a source of inspiration for new lines of investigation that will leverage the power of the synthesis techniques that have been developed in these two areas.

9.
10.
11.
International Journal on Software Tools for Technology Transfer - Runtime enforcement and control system synthesis are two verification techniques that automate the process of transforming an...

12.
The design of safe industrial controllers is one of the most important domains related to Automation Systems research. To support it, synthesis and analysis techniques are available. Among the analysis techniques, two of the most important are Simulation and Formal Verification. In this paper these two techniques are used together in a complementary way. Understanding plant behaviour is essential for obtaining safe industrial system controllers; hence, plant modelling is crucial to the success of these techniques. A two step approach is presented: first, the use of Simulation and, second, the use of Formal Verification of Industrial Systems Specifications. The specification and plant models used for each technique are described. Simulation and Formal Verification results are presented and discussed. The approach presented in the paper can be applied to real industrial systems, and obtain safe controllers for hybrid plants. The Modelica modelling language and Dymola simulation environment are used for Simulation purposes, and Timed Automata formalism and the UPPAAL real-time model-checker are used for Formal Verification purposes.

13.
Several approaches have been proposed for the transition from functional requirements to object-oriented design. In a use case-driven development process, the use cases are important input for the identification of classes and their methods. There is, however, no established, empirically validated technique for the transition from use cases to class diagrams. One recommended technique is to derive classes by analyzing the use cases. It has, nevertheless, been reported that this technique leads to problems, such as the developers missing requirements and mistaking requirements for design. An alternative technique is to identify classes from a textual requirements specification and subsequently apply the use case model to validate the resulting class diagram. This paper describes two controlled experiments conducted to investigate these two approaches to applying use case models in an object-oriented design process. The first experiment was conducted with 53 students as subjects. Half of the subjects used a professional modelling tool; the other half used pen and paper. The second experiment was conducted with 22 professional software developers as subjects, all of whom used one of several modelling tools. The first experiment showed that applying use cases to validate class diagrams constructed from textual requirements led to more complete class diagrams than did the derivation of classes from a use case model. In the second experiment, however, we found no such difference between the two techniques. In both experiments, deriving class diagrams from the use cases led to a better structure of the class diagrams. The results of the experiments therefore show that the technique chosen for the transition from use cases to class diagrams affects the quality of the class diagrams, but also that the effects of the technique depend on the category of developer applying it and on the tool with which it is applied.

14.
Large IT systems are often acquired in a tender process where the customer states the system requirements, a number of suppliers submit their proposals, and the customer selects one of them. Usually the supplier uses an existing system as the basis for his proposal, adapting it more or less to the customer's requirements. This paper is a study of a specific tender process. The customer was a Danish municipality that supplied electrical power, water, gas, garbage collection, etc. for around 100,000 households. The customer wanted a new system for meter inspection, invoicing, planning the meter inspectors' routes, etc. We have studied the requirements, how they were perceived by the suppliers, and how they were intended by the customer. The main finding is that the parties didn't understand each other, although the suppliers sometimes pretended that they did. One consequence was that the business goals pursued by the customer were not properly achieved. Among the causes of this were an excessively democratic elicitation process and an inadequate use of requirement techniques, particularly use cases. There were also issues that the existing requirement techniques couldn't deal with, for instance integration with future systems.

15.
This paper describes a speaker-adaptive HMM-based speech synthesis system. The new system, called "HTS-2007," employs speaker adaptation (CSMAPLR+MAP), feature-space adaptive training, mixed-gender modeling, and full-covariance modeling using CSMAPLR transforms, in addition to several other techniques that have proved effective in our previous systems. Subjective evaluation results show that the new system generates significantly better quality synthetic speech than speaker-dependent approaches with realistic amounts of speech data, and that it bears comparison with speaker-dependent approaches even when large amounts of speech data are available. In addition, a comparison study with several speech synthesis techniques shows the new system is very robust: it is able to build voices from less-than-ideal speech data and synthesize good-quality speech even for out-of-domain sentences.

16.
There are several systems which provide computer support to legal decisions. Perhaps the most significant ones, besides various computerised systems for administration, are information retrieval systems that locate statutes and documents. Other research projects, however, deal with legislation and adjudication, making it possible to use information techniques in making legal decisions. I wish to describe two decision-support programs and to link them to some theoretical findings of my earlier research. What connects these programs is that they provide new information for decisions on the basis of previous similar legal cases; both describe cases with the help of criteria and use diverse artificial intelligence methods for different types of criteria. The first of the two programs aims to support decisions of insurance specialists by assessing the measure of the compensation for immaterial damage. The result is given by the combination of a neural network, based upon previous judicial cases, and an expert system. The neural network gives the first assessment of the sum of compensation, while the expert system refines the network's output. The other program can be used by judges and lawyers in the course of preparing a decision. Studying cases of road accidents, we find that fuzzy logic methods can help to approximate decisions actually given by judges. In this way, the process of decision making by courts and lawyers receives an additional piece of information, obtained by comparing the seriousness of the actual case with that of previous cases.
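The two-stage architecture described for the first program (a learned estimator producing a first assessment, refined by expert-system rules) can be sketched as follows. The linear scorer stands in for the neural network, and the rules, feature names and multipliers are entirely hypothetical, chosen only to show how the stages combine.

```python
def base_estimate(features, weights, bias):
    """Stand-in for the neural network: a linear scorer giving a
    first compensation estimate from case criteria (hypothetical)."""
    return sum(w * x for w, x in zip(weights, features)) + bias

def refine(estimate, case):
    """Stand-in for the expert system: rules adjusting the first estimate."""
    if case.get("permanent_injury"):
        estimate *= 1.5        # hypothetical aggravating rule
    if case.get("contributory_negligence"):
        estimate *= 0.8        # hypothetical mitigating rule
    return round(estimate, 2)

case = {"features": [3.0, 1.0], "permanent_injury": True,
        "contributory_negligence": False}
first = base_estimate(case["features"], weights=[1000.0, 500.0], bias=2000.0)
print(refine(first, case))  # 1.5 * (3000 + 500 + 2000) = 8250.0
```

The design point is the division of labour: the statistical stage generalises from past cases, while the rule stage encodes explicit legal knowledge that the data may not cover.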

17.
This paper describes ‘Goal Structuring Notation’ (GSN), a graphical notation that can be used to structure and present an argument justifying some aspect of system performance. The design of a fault-detecting processor pair is examined to determine the extent to which it is indeed ‘fault-detecting’. It is argued that for complex systems, difficulties with assessment arise not so much from a lack of analysis techniques, but from the need to integrate the results from many diverse analyses into a coherent and compelling argument. It is suggested that GSN provides a framework in which such an argument can be made.

18.
Selecting good views of high-dimensional data using class consistency   (Cited by: 2; self-citations: 0; external citations: 2)
Many visualization techniques involve mapping high-dimensional data spaces to lower-dimensional views. Unfortunately, mapping a high-dimensional data space into a scatterplot involves a loss of information; or, even worse, it can give a misleading picture of valuable structure in higher dimensions. In this paper, we propose class consistency as a measure of the quality of the mapping. Class consistency enforces the constraint that classes of n-D data are shown clearly in 2-D scatterplots. We propose two quantitative measures of class consistency, one based on the distance to the class's center of gravity, and another based on the entropies of the spatial distributions of classes. We performed an experiment where users choose good views, and show that class consistency has good precision and recall. We also evaluate both consistency measures over a range of data sets and show that these measures are efficient and robust.
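The centroid-based variant of such a consistency measure is easy to sketch: score a 2-D view by the fraction of points that lie closer to their own class centroid than to any other. This is a minimal illustration of the idea, not the paper's exact formulation.

```python
import numpy as np

def centroid_consistency(points2d, labels):
    """Centroid-distance class consistency of a 2-D view: the fraction of
    points whose nearest class centroid is their own class."""
    labels = np.asarray(labels)
    classes = np.unique(labels)
    centroids = np.array([points2d[labels == c].mean(axis=0) for c in classes])
    # Pairwise distances from every point to every class centroid.
    d = np.linalg.norm(points2d[:, None, :] - centroids[None, :, :], axis=2)
    nearest = classes[np.argmin(d, axis=1)]
    return float(np.mean(nearest == labels))

rng = np.random.default_rng(1)
a = rng.normal([0, 0], 0.3, size=(40, 2))   # compact class around (0, 0)
b = rng.normal([3, 3], 0.3, size=(40, 2))   # compact class around (3, 3)
pts = np.vstack([a, b])
lbl = np.array([0] * 40 + [1] * 40)
print(centroid_consistency(pts, lbl))  # well-separated classes score 1.0
```

A view selector would compute this score for many candidate 2-D projections of the n-D data and keep the highest-scoring ones.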

19.
Petri nets (PNs) are frequently used in modeling, designing, and analyzing concurrent systems. A problem with PNs, in the general case, is that they require high computational complexity to analyze their properties, such as reachability, liveness, and boundedness. To avoid this problem, synthesis techniques for constructing large PNs are presented. Using these techniques, the behavior of the constructed PN can be determined by local analysis that uses known properties of the given nets. Thus, the high computational complexity of global analysis is bypassed. A synthesis technique that explores dependency relations in PNs is presented. It synthesizes large PNs by combining smaller PNs of arbitrary topology structures, and the combination is verified efficiently by dependency analysis. A large system based on a PN can be built up by repeated applications of the technique.
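The firing rule underlying all such PN analyses can be stated in a few lines: a transition is enabled when each input place holds enough tokens, and firing it consumes those tokens and produces tokens in the output places. The tiny two-place net below is a hypothetical example, not one from the paper.

```python
def enabled(marking, pre):
    """A transition is enabled when every input place holds enough tokens."""
    return all(marking[p] >= n for p, n in pre.items())

def fire(marking, pre, post):
    """Fire an enabled transition: consume from input places,
    produce in output places; returns the new marking."""
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# Hypothetical two-place fragment: a resource moving from free to busy.
m0 = {"free": 1, "busy": 0}
t_start = ({"free": 1}, {"busy": 1})   # (pre, post) of transition t_start
assert enabled(m0, t_start[0])
m1 = fire(m0, *t_start)
print(m1)  # {'free': 0, 'busy': 1}
```

Global analyses such as reachability enumerate the markings produced by this rule; the synthesis techniques in the abstract avoid that enumeration by composing nets whose local properties are already known.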

20.
In this paper, new techniques for deformed-image motion estimation and compensation using variable-size block matching are proposed, which can be applied to an image sequence compression system or a moving object recognition system. Motion estimation and compensation techniques have been successfully applied in the area of image sequence coding. Many research papers on improving the performance of these techniques have been published, proposing many directions that can all lead to better performance than the conventional techniques. Among them, generalized block matching and variable-size block matching have been successfully applied in reducing the data rate of the compensation error and of the motion information, respectively. These two algorithms have their merits, but also suffer from drawbacks; moreover, reducing the data rate of the compensation error sometimes increases the data rate of the motion information, or vice versa. Based on these two algorithms, we propose and examine several algorithms which are effective in reducing the data rate. We then incorporate these algorithms into a system in which they work together, overcoming their individual disadvantages while keeping their merits. The proposed system can optimally balance the data rate between the two aspects (i.e., compensation error and motion information). Experimental results show that the proposed system outperforms the conventional techniques. Since we propose a recovery operation which tries to recover incorrect motion vectors from the global motion, the proposed system can also be applied to moving object recognition in image sequences.
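The core operation that all these variants build on is block matching itself: for a block in the current frame, exhaustively search a window in the reference frame for the candidate minimizing the sum of absolute differences (SAD). The fixed-size full search below is a baseline sketch; the paper's variable-size and generalized variants refine this basic step.

```python
import numpy as np

def block_match(ref, cur, bx, by, bsize, search=4):
    """Full-search block matching: return the motion vector (dx, dy)
    minimizing SAD between a block of the current frame and candidate
    blocks in the reference frame, plus the best SAD."""
    block = cur[by:by + bsize, bx:bx + bsize].astype(int)
    best, best_sad = (0, 0), None
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = by + dy, bx + dx
            if y < 0 or x < 0 or y + bsize > ref.shape[0] or x + bsize > ref.shape[1]:
                continue  # candidate block falls outside the frame
            cand = ref[y:y + bsize, x:x + bsize].astype(int)
            sad = int(np.abs(block - cand).sum())
            if best_sad is None or sad < best_sad:
                best, best_sad = (dx, dy), sad
    return best, best_sad

rng = np.random.default_rng(2)
ref = rng.integers(0, 256, size=(32, 32))
cur = np.roll(ref, shift=(2, 3), axis=(0, 1))  # frame shifted down 2, right 3
mv, sad = block_match(ref, cur, bx=12, by=12, bsize=8)
print(mv, sad)  # recovered motion vector (-3, -2) with SAD 0
```

A variable-size scheme would split blocks with high residual SAD into smaller blocks (more motion vectors, less compensation error) and merge uniform regions into larger ones, which is exactly the rate trade-off the abstract describes.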


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号