Similar Documents
20 similar documents retrieved (search time: 15 ms)
1.
The Unified Modeling Language (UML) has been widely accepted as a standard for modeling software systems from various perspectives. The intuitive notations of UML diagrams greatly improve communication among developers. However, the lack of a formal semantics makes it difficult to automate analysis and verification. This paper offers a graphical yet formal approach to specifying the behavioral semantics of statechart diagrams using graph transformation techniques. It supports many advanced features of statecharts, such as composite states, firing priority, history, junction, and choice. In our approach, a graph grammar is derived automatically from a state machine to summarize the hierarchy of states. Based on the graph grammar, the execution of a set of non-conflicting state transitions is interpreted as a sequence of graph transformations. This facilitates verifying a design model against system requirements. To demonstrate our approach, we present a case study on a toll-gate system.
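
The sketch below is only an illustration of the general idea (firing one statechart transition as a rewrite of an attributed state graph), not the paper's graph grammar; the class and function names (State, fire) and the toy toll-gate states are invented.

```python
# Illustrative sketch only: a state hierarchy as a graph, with one transition
# firing interpreted as a rewrite of the "active" marking. Names are hypothetical.

class State:
    def __init__(self, name, parent=None):
        self.name, self.parent = name, parent

    def ancestors(self):
        node, chain = self, []
        while node:
            chain.append(node)
            node = node.parent
        return chain  # innermost state first

def fire(active, transitions, event):
    """Rewrite step: if a transition matches the active state and event,
    exit states up to the common ancestor and enter the target."""
    for (source, trigger, target) in transitions:
        if source is active and trigger == event:
            common = set(source.ancestors()) & set(target.ancestors())
            exited = [s.name for s in source.ancestors() if s not in common]
            entered = [s.name for s in reversed(target.ancestors()) if s not in common]
            print("exit", exited, "-> enter", entered)
            return target        # new active configuration
    return active                # no enabled transition: graph unchanged

# Toy toll-gate-like example: composite state "Operating" with two substates.
root = State("Root"); operating = State("Operating", root)
idle = State("Idle", operating); open_ = State("Open", operating)
transitions = [(idle, "vehicle_paid", open_), (open_, "vehicle_passed", idle)]
active = fire(idle, transitions, "vehicle_paid")   # Idle -> Open
```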

2.
In recent years, the influence of design patterns on software quality has attracted increasing attention in software engineering, as design patterns encapsulate valuable knowledge for resolving design problems and, more importantly, for improving design quality. As the paradigm continues to grow in popularity, a systematic and objective approach to verifying the design of a pattern becomes increasingly important. The intent section of a design pattern states the problem the pattern is meant to resolve, and the solution section describes the structural model that addresses it. When the problem in the intent is a quality problem, the structural model should provide a solution that improves the relevant quality attribute. In this work we provide an approach, based on an object-oriented quality model, to validate whether a design pattern is well designed, i.e., whether the proposed structural model really resolves the quality problems described in the intent. We propose a validation approach to help pattern developers check that a design pattern is well designed. In addition, a quantitative method is proposed to measure the effectiveness of the quality improvement of a design pattern, so that pattern users can determine which design patterns are applicable to meet their functional and quality requirements.

3.
In this paper, we describe a technique for designing UML-based software models for MPSoC architectures, focusing on the development of the platform-specific model of embedded software. To develop the platform-specific model, we define a process for the design of UML-based software models and suggest an algorithm with precise actions to map the model to the MPSoC architecture. To support our design process, we implemented our approach in an integrated tool. Using the tool, we applied our design technique to a target system. We believe that our technique provides several benefits, such as improving the parallelism of tasks and fast, valid mapping of software models to hardware architecture.

4.
The goal of this paper is to investigate the relation between object-oriented design choices and defects in software systems, with focus on a real-time telecommunication domain. The design choices are measured using the widely accepted metrics suite proposed by Chidamber and Kemerer for object-oriented languages [S.R. Chidamber, C.F. Kemerer, A metrics suite for object oriented design, IEEE Transactions on Software Engineering 20 (6) (1994) 476-493]. This paper reports the results of an extensive case study, which strongly reinforces earlier, mainly anecdotal, evidence that design aspects related to communication between classes can be used as indicators of the most defect-prone classes. Statistical models applicable to non-normally distributed count data are used, such as Poisson regression, negative binomial regression, and zero-inflated negative binomial regression. The performance of the models is assessed using correlations, dispersion coefficients and Alberg diagrams. The zero-inflated negative binomial regression model based on response for a class shows the best overall ability to describe the variability of the number of defects in classes.
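
As a hedged sketch of the three count models mentioned above, the snippet below fits them to synthetic class-level data (CK-style metrics vs. defect counts), assuming statsmodels (0.9 or later) with its count_model module; the metric names and data are invented, and the fits may emit convergence warnings on such toy data.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

rng = np.random.default_rng(0)
n = 500
cbo = rng.poisson(5, n)            # coupling between objects (synthetic)
rfc = rng.poisson(20, n)           # response for a class (synthetic)
defects = rng.poisson(0.05 * rfc)  # defect counts, with many zeros

X = sm.add_constant(np.column_stack([cbo, rfc]))

poisson_fit = sm.Poisson(defects, X).fit(disp=False)
negbin_fit = sm.NegativeBinomial(defects, X).fit(disp=False)
# Zero-inflation part modeled with an intercept only (exog_infl).
zinb_fit = ZeroInflatedNegativeBinomialP(
    defects, X, exog_infl=np.ones((n, 1))).fit(disp=False, maxiter=200)

for name, fit in [("Poisson", poisson_fit),
                  ("NegBin", negbin_fit),
                  ("ZINB", zinb_fit)]:
    print(name, "AIC:", round(fit.aic, 1))   # compare model fit, as the study does
```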

5.
Packages are important high-level organizational units for large object-oriented systems. Package-level metrics characterize attributes of packages such as size, complexity, and coupling. There is a need for empirical evidence to support collecting these metrics and using them as early indicators of important external software quality attributes. In this paper, three suites of package-level metrics (Martin, MOOD and CK) are evaluated and compared empirically in predicting the number of pre-release faults and the number of post-release faults in packages. Eclipse, one of the largest open source systems, is used as a case study. The results indicate that prediction models based on the Martin suite are more accurate than those based on the MOOD and CK suites across releases of Eclipse.
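
For reference, the Martin package metrics mentioned above follow well-known formulas (instability I = Ce / (Ca + Ce), abstractness A, and normalized distance from the main sequence D = |A + I - 1|); the sketch below computes them from simple per-package counts, with hypothetical input values.

```python
# Hedged sketch of the Martin package metrics (not the paper's tooling).
def martin_metrics(ca, ce, abstract_classes, total_classes):
    instability = ce / (ca + ce) if (ca + ce) else 0.0          # I = Ce / (Ca + Ce)
    abstractness = abstract_classes / total_classes if total_classes else 0.0
    distance = abs(abstractness + instability - 1.0)            # D = |A + I - 1|
    return {"Ca": ca, "Ce": ce, "I": instability, "A": abstractness, "D": distance}

# Example: a package with 8 incoming and 2 outgoing dependencies, no abstract classes.
print(martin_metrics(ca=8, ce=2, abstract_classes=0, total_classes=12))
```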

6.
This paper presents a quantitative framework for early prediction of resource usage and load in distributed real-time systems (DRTS). The prediction is based on an analysis of UML 2.0 sequence diagrams, augmented with timing information, to extract timed control-flow information. It aims to improve the early predictability of a DRTS by offering a systematic approach to predict, at the design phase, system behavior at each time instant during its execution. Since behavioral models such as sequence diagrams are available in early design phases of the software life cycle, the framework enables resource analysis at a stage when design decisions are still easy to change. Though we provide a general framework, we use network traffic as an example resource type to illustrate how the approach is applied. We also indicate how usage and load analysis of other types of resources (e.g., CPU and memory) can be performed in a similar fashion. A case study illustrates the feasibility of the approach.
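
The sketch below is only a simplified illustration of the network-traffic case, not the paper's framework: it treats the timed control flow as a list of (timestamp, message size) pairs taken from a sequence diagram and aggregates them into per-interval traffic; the function name and numbers are invented.

```python
from collections import defaultdict

def traffic_profile(messages, interval_ms=100):
    """Aggregate message sizes into fixed-width time buckets (bytes per interval)."""
    load = defaultdict(int)
    for t_ms, size in messages:
        load[t_ms // interval_ms] += size
    return dict(sorted(load.items()))

# Hypothetical messages extracted from a timed sequence diagram: (time in ms, bytes).
messages = [(10, 512), (40, 128), (130, 2048), (150, 256), (420, 1024)]
for bucket, size in traffic_profile(messages).items():
    print(f"[{bucket * 100}-{bucket * 100 + 99} ms] {size} bytes")
```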

7.
This paper proposes a method for dynamically constructing dependency models for distributed applications. The method obtains prior knowledge for modeling by actively perturbing the system, and then builds a dependency model of the relationships among the components of a distributed application based on Bayesian network construction techniques. Unlike traditional passive modeling techniques, this active approach does not require detailed knowledge of the system in advance: it deploys probes in the runtime environment to capture and measure the system's responses to the injected perturbations, and uses machine learning to identify the dynamic invocation dependencies among components, thereby building a dependency model of the distributed application at run time. The dynamically constructed dependency model can be used for runtime management of distributed applications, such as fault localization and recovery during execution, and provides a practical approach toward realizing autonomic computing environments for distributed applications.

8.
A Formal Analysis Approach for Dynamic Behavior Models of Real-Time Systems
戎玫 (Rong Mei), 《计算机应用研究》 (Application Research of Computers), 2009, 26(9): 3365-3368
This paper proposes a formal analysis approach for dynamic behavior models of real-time systems based on UML 2.0. First, a formal description of UML sequence diagrams is given and the relations between events in a sequence diagram are analyzed. On this basis, an object automaton is introduced to describe the sequence of events that each object participates in within the scenario described by a UML sequence diagram, and the method is extended to UML 2.0 sequence diagrams with combined fragments. Finally, by analyzing the time modeling mechanisms of UML 2.0 sequence diagrams, an algorithm is given for extracting timing constraints from UML 2.0 sequence diagrams in order to obtain timed automata.
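
As a hedged sketch of the object-automaton idea described above (not the paper's construction), the snippet below builds, for a single object, a linear automaton whose transitions are labeled with the ordered events that object participates in within one sequence-diagram scenario; the event labels are invented.

```python
def object_automaton(events):
    """Return (states, transitions) for one object's ordered event sequence."""
    states = [f"q{i}" for i in range(len(events) + 1)]
    transitions = [(f"q{i}", e, f"q{i + 1}") for i, e in enumerate(events)]
    return states, transitions

# Hypothetical send (!) and receive (?) occurrences of an object 'Controller'.
states, transitions = object_automaton(["?request", "!query", "?result", "!reply"])
print(states)
print(transitions)
```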

9.

Context

Modern software engineering demands that professionals and researchers proactively and collectively explore and experiment with viable and valuable mechanisms for exposing degenerative bugs, security holes, and possible deviations at the earliest stage. Motivated by this need, we introduce a novel methodology for estimating the defect proneness of class structures in object-oriented (OO) software systems at the design stage.

Objective

The objective of this work is to develop an estimation model that provides a meaningful assessment of the defect proneness of object-oriented software packages at the design phase of the SDLC. The framework enhances the efficiency of the SDLC through design quality improvement.

Method

This involves a data-driven methodology based on an empirical study of the relationship between design parameters and defect proneness. In the first phase, the relationship between the design metrics and the typical occurrence pattern of defects is mapped. This mapping is represented as a set of nonlinear multifunctional regression equations, each reflecting the influence of an individual design metric on defect proneness. The defect-proneness estimation model is then generated as a weighted linear combination of these regression equations, with the weighting coefficients evaluated through the GQM (Goal Question Metric) paradigm (a sketch follows this abstract).

Results

The model is evaluated and validated on a selected set of cases, with promising results. The current study covers three projects and opens up the opportunity to extend the approach to a wide range of projects across industries.

Conclusion

Estimating defect proneness at the design stage provides effective feedback to the design architect, enabling defects in the affected modules to be identified and reduced early. This results in a considerable improvement in software design, leading to cost-effective products.
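
The sketch below illustrates the Method section above under stated assumptions only: per-metric nonlinear (here, quadratic) regressions of defect counts fitted with numpy, combined as a weighted linear sum; the metrics, training data, and GQM-derived weights are all invented rather than taken from the study.

```python
import numpy as np

rng = np.random.default_rng(1)
metrics = {"WMC": rng.uniform(1, 50, 200), "CBO": rng.uniform(0, 20, 200)}
defects = 0.02 * metrics["WMC"] ** 1.5 + 0.3 * metrics["CBO"] + rng.normal(0, 1, 200)

# One quadratic regression per design metric (degree-2 polynomial fit).
per_metric_models = {name: np.polynomial.Polynomial.fit(x, defects, deg=2)
                     for name, x in metrics.items()}

gqm_weights = {"WMC": 0.6, "CBO": 0.4}   # hypothetical GQM-derived weights

def defect_proneness(sample):
    """Weighted linear combination of the per-metric regression estimates."""
    return sum(gqm_weights[m] * per_metric_models[m](v) for m, v in sample.items())

print(round(defect_proneness({"WMC": 35, "CBO": 12}), 2))
```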

10.
Systematic design testing, in which executable models of behaviours are tested using inputs that exercise scenarios, can help reveal flaws in designs before they are implemented in code. In this paper a technique for testing executable forms of UML (Unified Modelling Language) models is described and test adequacy criteria based on UML model elements are proposed. The criteria can be used to define test objectives for UML designs. The UML design test criteria are based on the same premise underlying code test criteria: coverage of relevant building blocks of models is highly likely to uncover faults. The test adequacy criteria proposed in this paper are based on building blocks for UML class and interaction diagrams. Class diagram criteria are used to determine the object configurations on which tests are run, while interaction diagram criteria are used to determine the sequences of messages that should be tested. Copyright © 2003 John Wiley & Sons, Ltd.

11.
Independent component analysis (ICA) has been widely used to tackle the microarray dataset classification problem, but there still exists an unsolved problem that the independent component (IC) sets may not be reproducible after different ICA transformations. Inspired by the idea of ensemble feature selection, we design an ICA-based ensemble learning system to fully utilize the difference among different IC sets. In this system, some IC sets are first generated by different ICA transformations. A multi-objective genetic algorithm (MOGA) is designed to select different biologically significant IC subsets from these IC sets, which are then applied to build base classifiers. Three schemes are used to fuse these base classifiers. The first fusion scheme is to combine all individuals in the final generation of the MOGA. In addition, during the evolution, we design a global-recording technique to record the best IC subsets of each IC set in a global-recording list. The IC subsets in the list are then deployed to build base classifiers so as to implement the second fusion scheme. Furthermore, by pruning about half of the less accurate base classifiers obtained by the second scheme, a compact and more accurate ensemble system is built, which is regarded as the third fusion scheme. Three microarray datasets are used to test the ensemble systems, and the corresponding results demonstrate that these ensemble schemes can further improve the performance of the ICA-based classification model, and that the third fusion scheme leads to the most accurate ensemble system with the smallest ensemble size.
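
The sketch below shows only the ensemble idea in simplified form (several IC sets from different FastICA initializations, one base classifier per IC set, fused by majority vote), assuming scikit-learn; it omits the MOGA subset selection and the global-recording list, and uses synthetic data instead of a real microarray set.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import FastICA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=200, n_features=60, n_informative=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

predictions = []
for seed in range(5):                               # five different ICA transformations
    ica = FastICA(n_components=10, random_state=seed, max_iter=1000)
    clf = LogisticRegression(max_iter=1000).fit(ica.fit_transform(X_tr), y_tr)
    predictions.append(clf.predict(ica.transform(X_te)))

majority = (np.mean(predictions, axis=0) >= 0.5).astype(int)   # fuse by majority vote
print("ensemble accuracy:", (majority == y_te).mean())
```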

12.
The usefulness of measures for the analysis and design of object oriented (OO) software is increasingly being recognized in the field of software engineering research. In particular, recognition of the need for early indicators of external quality attributes is increasing. We investigate through experimentation whether a collection of UML class diagram measures could be good predictors of two main subcharacteristics of the maintainability of class diagrams: understandability and modifiability. Results obtained from a controlled experiment and a replica support the idea that useful prediction models for class diagrams understandability and modifiability can be built on the basis of early measures, in particular, measures that capture structural complexity through associations and generalizations. Moreover, these measures seem to be correlated with the subjective perception of the subjects about the complexity of the diagrams. This fact shows, to some extent, that the objective measures capture the same aspects as the subjective ones. However, despite our encouraging findings, further empirical studies, especially using data taken from real projects performed in industrial settings, are needed. Such further study will yield a comprehensive body of knowledge and experience about building prediction models for understandability and modifiability.

Marcela Genero is an Associate Professor in the Department of Information Systems and Technologies at the University of Castilla-La Mancha, Ciudad Real, Spain. She received her MSc degree in Computer Science from the University of South, Argentina, in 1989, and her PhD at the University of Castilla-La Mancha, Ciudad Real, Spain, in 2002. Her research interests include empirical software engineering, software metrics, conceptual data model quality, database quality, quality in product lines, quality in MDD, etc. She has published in prestigious journals (Journal of Software Maintenance and Evolution: Research and Practice, L'Objet, Data and Knowledge Engineering, Journal of Object Technology, Journal of Research and Practice in Information Technology) and conferences (CAISE, E/R, MODELS/UML, ISESE, OOIS, SEKE, etc.). She edited the books of Mario Piattini and Coral Calero titled "Data and Information Quality" (Kluwer, 2001) and "Metrics for Software Conceptual Models" (Imperial College, 2005). She is a member of ISERN.

M. Esperanza Manso is an Associate Professor in the Department of Computer Language and Systems at the University of Valladolid, Valladolid, Spain. She received her MSc degree in Mathematics from the University of Valladolid. Currently, she is working towards her PhD. Her main research interests are software maintenance, reengineering and reuse experimentation. She is an author of several papers in conferences (OOIS, CAISE, METRICS, ISESE, etc.) and book chapters.

Corrado Aaron Visaggio is an Assistant Professor of Database and Software Testing at the University of Sannio, Italy. He obtained his PhD in Software Engineering at the University of Sannio. He works as a researcher at the Research Centre on Software Technology, Benevento, Italy. His research interests include empirical software engineering, software security, and software process models. He serves on the Editorial Board of the e-Informatica Journal.

Gerardo Canfora is a Full Professor of Computer Science at the Faculty of Engineering and the Director of the Research Centre on Software Technology (RCOST) at the University of Sannio in Benevento, Italy. He serves on the program committees of a number of international conferences. He was a program co-chair of the 1997 International Workshop on Program Comprehension, the 2001 International Conference on Software Maintenance, the 2003 European Conference on Software Maintenance and Reengineering, and the 2005 International Workshop on Principles of Software Evolution. He was the General Chair of the 2003 European Conference on Software Maintenance and Reengineering and the 2006 Working Conference on Reverse Engineering. Currently, he is a program co-chair of the 2007 International Conference on Software Maintenance. His research interests include software maintenance and reverse engineering, service-oriented software engineering, and experimental software engineering. He was an associate editor of IEEE Transactions on Software Engineering and he currently serves on the Editorial Board of the Journal of Software Maintenance and Evolution. He is a member of the IEEE Computer Society.

Mario Piattini holds an MSc and PhD in Computer Science from the Technical University of Madrid and is a Certified Information System Auditor (ISACA, Information System Audit and Control Association). He is a Full Professor in the Department of Information Systems and Technologies at the University of Castilla-La Mancha, Ciudad Real, Spain. He is the author of several books and papers on databases, software engineering and information systems, and leads the ALARCOS research group at the University of Castilla-La Mancha.

13.
Analyzing object-oriented systems in order to evaluate their quality gains importance as the paradigm continues to increase in popularity. Consequently, several object-oriented metrics have been proposed to evaluate different aspects of these systems, such as class coupling. In object-oriented design, three types of coupling may exist between classes: inheritance coupling, interaction coupling, and component coupling. This paper presents a tool for measuring inheritance coupling in object-oriented systems.
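
The snippet below is only a crude illustration of the kind of measurement involved, not the paper's tool or coupling model: it counts the ancestors of a Python class (excluding the class itself and object) as a rough inheritance-coupling value; the example classes are invented.

```python
import inspect

def inheritance_coupling(cls):
    """Number of ancestor classes, excluding the class itself and `object`."""
    return len(inspect.getmro(cls)) - 2

class Sensor: ...
class CalibratedSensor(Sensor): ...
class TemperatureSensor(CalibratedSensor): ...

print(inheritance_coupling(TemperatureSensor))   # -> 2
```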

14.
Designers rely on performance predictions to direct the design toward appropriate requirements. Machine learning (ML) models exhibit the potential for rapid and accurate predictions. Developing conventional ML models that generalize well to unseen design cases requires effective feature engineering and selection. Identifying generalizable features calls for good domain knowledge on the part of the ML model developer. Therefore, developing ML models for all design performance parameters with conventional ML would be a time-consuming and expensive process. Automation of feature engineering and selection will accelerate the use of ML models in design. Deep learning models extract features from data, which aids model generalization. In this study, we (1) evaluate the deep learning model's capability to predict heating and cooling demand on unseen design cases and (2) obtain an understanding of the extracted features. Results indicate that deep learning model generalization is similar to or better than that of a simple neural network with appropriate features. The reason for the satisfactory generalization of the deep learning model is its ability to identify similar design options within the data distribution. The results also indicate that deep learning models can filter out irrelevant features, reducing the need for feature selection.
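
As a hedged stand-in for the kind of model discussed above (not the deep model evaluated in the study), the sketch below trains a multi-output neural network that predicts heating and cooling demand from a few hypothetical design parameters, assuming scikit-learn; all data and the feature meanings are invented.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 4))          # e.g. glazing ratio, wall U-value, area, orientation
heating = 30 - 20 * X[:, 1] + 5 * X[:, 0] + rng.normal(0, 1, 500)
cooling = 10 + 15 * X[:, 0] - 4 * X[:, 3] + rng.normal(0, 1, 500)
y = np.column_stack([heating, cooling])  # two targets: heating and cooling demand

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000,
                     random_state=0).fit(X_tr, y_tr)
print("R^2 on unseen designs:", round(model.score(X_te, y_te), 3))
```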

15.
Petri net (PN) supervisory control is often performed through a sequential procedure that introduces additional constraint layers over an initial unconstrained PN model, using generalized mutual exclusion constraints (GMECs) implemented as monitor places. This is typical, e.g., in the context of flexible manufacturing systems, where the initial model represents the production sequences and the constraints are used to express static specifications, such as job limitations or the usage of resources, and behavioral ones, such as liveness, controllability, etc. This sequential procedure may yield a redundant model that is not easily reduced a posteriori. Also, it is difficult to ensure maximal permissivity with respect to multiple behavioral specifications. This paper, building on recent results regarding optimal supervisor design with branch & bound methods, proposes an integrated modeling approach that can be used to derive a minimal supervisor guaranteeing the attainment of an arbitrary set of static and behavioral specifications in a maximally permissive way. Among behavioral specifications, deadlock-freeness, liveness, reversibility and behavioral controllability are considered in the paper. The supervisor comes in the form of a simple set of GMECs or of a disjunction of sets of GMECs. Some examples emphasize the potential model size reductions that can be achieved.
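
The sketch below shows only the standard GMEC-to-monitor-place construction referenced above (for a constraint L m <= b on a net with incidence matrix C and initial marking m0, the monitor rows are Cc = -L C with initial marking mc0 = b - L m0); it is not the paper's branch & bound design procedure, and the small net is invented for illustration.

```python
import numpy as np

def gmec_monitor(C, m0, L, b):
    """Return the monitor-place incidence rows and initial markings enforcing L m <= b."""
    Cc = -L @ C
    mc0 = b - L @ m0
    return Cc, mc0

# 3 places, 2 transitions; constraint: m(p2) + m(p3) <= 1 (e.g. a shared resource).
C = np.array([[-1,  1],
              [ 1, -1],
              [ 1, -1]])
m0 = np.array([1, 0, 0])
L = np.array([[0, 1, 1]])
b = np.array([1])

Cc, mc0 = gmec_monitor(C, m0, L, b)
print("monitor incidence:", Cc, "initial marking:", mc0)
```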

16.
An identification problem with no a priori separation of the variables into inputs and outputs and a representation-invariant approximation criterion is considered. The model class consists of linear time-invariant systems of bounded complexity, and the approximation criterion is the minimum of a weighted 2-norm distance between the given time series and a time series that is consistent with the model. The problem is equivalent to, and is solved as, a mosaic-Hankel structured low-rank approximation problem. Software implementing the approach is developed and tested on benchmark problems. Additional nonstandard features of the software are the specification of exact and missing variables and identification from multiple experiments.
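
The sketch below is a deliberately simplified stand-in: an unstructured truncated-SVD low-rank approximation of a Hankel matrix built from a scalar time series. The paper's method additionally preserves the (mosaic-)Hankel structure and handles weights, exact and missing data; the series, window length, and rank here are invented.

```python
import numpy as np
from scipy.linalg import hankel

t = np.arange(50)
w = np.sin(0.3 * t) + 0.05 * np.random.default_rng(0).normal(size=50)  # noisy series

L = 10                                   # window length (related to the complexity bound)
H = hankel(w[:L], w[L - 1:])             # L x (N - L + 1) Hankel matrix

U, s, Vt = np.linalg.svd(H, full_matrices=False)
r = 2                                    # target rank (a single sinusoid gives rank 2)
H_low = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

print("relative approximation error:", np.linalg.norm(H - H_low) / np.linalg.norm(H))
```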

17.
Research into software design models in general, and into the UML in particular, focuses on answering the question of how design models are used, while largely ignoring the question of whether they are used at all. There is an assumption in the literature that the UML is the de facto standard, and that the use of design models has had a profound and substantial effect on how software is designed, by virtue of models giving the ability to do model checking, code generation, or automated test generation. However, for this assumption to be true, there has to be significant use of design models in practice by developers. This paper presents the results of a survey summarizing the answers of 3785 developers to the simple question of the extent to which design models are used before coding. We relate their use of models to (i) total years of programming experience, (ii) open or closed development, (iii) educational level, (iv) programming language used, and (v) development type. The answer to our question was that design models are not used very extensively in industry, and where they are used, the use is informal and without tool support, and the notation is often not UML. The use of models decreased with an increase in experience and increased with a higher level of qualification. Overall we found that models are used primarily as a communication and collaboration mechanism where there is a need to solve problems and/or reach a joint understanding of the overall design in a group. We also conclude that models are seldom updated after initially being created and are usually drawn on a whiteboard or on paper.

18.
When TCP is combined with epidemic routing in opportunistic networks, the flooding nature of epidemic routing and the intermittent connectivity of opportunistic networks degrade TCP performance. To address this problem, a cross-layer improvement algorithm based on the epidemic routing protocol and TCP/Reno, called ACK-EPI, is proposed. The algorithm modifies the slow-start threshold at the beginning of a connection, so that the congestion window does not grow too slowly after the connection mistakenly enters the congestion-avoidance phase because of network errors. In addition, to prevent copies of packets that have already been successfully delivered from remaining stored at other nodes and continuing to spread through the network, wasting network resources, the algorithm uses ACKs as delivery notifications to delete these redundant packets. Extensive simulation analysis and comparison show that ACK-EPI clearly improves TCP performance.

19.
Cloud computing allows dynamic resource scaling for enterprise online transaction systems, one of the key characteristics that differentiates the cloud from the traditional computing paradigm. However, initializing a new virtual instance in a cloud is not instantaneous; cloud hosting platforms introduce a delay of several minutes in hardware resource allocation. In this paper, we develop prediction-based resource measurement and provisioning strategies using neural networks and linear regression to satisfy upcoming resource demands. Experimental results demonstrate that the proposed technique offers more adaptive resource management for applications hosted in the cloud environment, an important mechanism for achieving on-demand resource allocation in the cloud.
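
The sketch below illustrates the linear-regression half of the idea only, under invented assumptions: a regression over a sliding window of recent load samples predicts the next interval's demand, and instances are requested ahead of the allocation delay; the load history, window size, per-instance capacity, and scaling rule are all hypothetical, not the paper's strategy.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

load = np.array([120, 135, 150, 170, 190, 205, 230, 260, 290, 315])  # requests/s history
window = 3
X = np.array([load[i:i + window] for i in range(len(load) - window)])  # lagged samples
y = load[window:]                                                      # next-step demand

model = LinearRegression().fit(X, y)
next_load = model.predict(load[-window:].reshape(1, -1))[0]

CAPACITY_PER_VM = 100  # hypothetical requests/s that one instance can serve
print(f"predicted load: {next_load:.0f} req/s -> "
      f"provision {int(np.ceil(next_load / CAPACITY_PER_VM))} instances")
```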

20.
The Internet of Things (IoT) holds the promise to blend real-world and online behaviors in principled ways, yet we are only beginning to understand how to effectively exploit insights from the online realm in effective applications in smart environments. Such smart environments aim to provide an improved, personalized experience based on the trail of user interactions with smart devices, but how does recommendation in smart environments differ from the usual online recommender systems? And can we exploit similarities to truly blend behavior in both realms to address the fundamental cold-start problem? In this article, we experiment with behavioral user models based on interactions with smart devices in a museum, and investigate the personalized recommendation of what to see after visiting an initial set of Points of Interest (POIs), a key problem in personalizing museum visits or tour guides, focusing on a critical one-shot POI recommendation task: where to go next? We have logged users' onsite physical information interactions during visits to an IoT-augmented museum exhibition at scale. Furthermore, we have collected an even larger set of search logs of the online museum collection. Users in both sets are unconnected; for privacy reasons we do not have shared IDs. We study the similarities between users' online digital and onsite physical information interaction behaviors, and build new behavioral user models based on the information interaction behaviors in (i) the physical exhibition space, (ii) the online collection, or (iii) both. Specifically, we propose a deep neural multilayer perceptron (MLP) based on explicitly given contextual user information and set-based features extracted from users' physical information interaction behaviors and similar users' digital information interaction behaviors. Our experimental results indicate that the proposed behavioral user modeling approach, using both physical and online user information interaction behaviors, improves the onsite POI recommendation baselines' performance on all evaluation metrics. Our proposed MLP approach achieves 83% precision at rank 1 on the critical one-shot POI recommendation problem, realizing the high accuracy needed for fruitful deployment in practical situations. Furthermore, the MLP model is less sensitive to the amount of real-world interaction (the size of the seen-POI set) by backing off to the online data, and hence helps address the cold-start problem in recommendation. Our general conclusion is that it is possible to fruitfully combine information interactions in the online and physical world for effective recommendation in smart environments.
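
The sketch below is a minimal stand-in for the MLP described above, not the authors' model: it concatenates an invented onsite feature vector (POIs already seen) with an invented online feature vector (per-topic interest) and trains a multi-class classifier to predict the next POI, assuming scikit-learn; the encoding, data, and labels are synthetic.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n_visitors, n_pois = 400, 12
seen = (rng.random((n_visitors, n_pois)) < 0.3).astype(float)   # onsite: POIs already visited
online = rng.random((n_visitors, n_pois))                       # online: interest per POI topic
X = np.hstack([seen, online])                                   # blended behavioral features

# Synthetic ground truth: the next POI is the most "interesting" one not yet seen.
y = np.argmax(online * (1 - seen), axis=1)

clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=1000, random_state=0).fit(X, y)
print("top-1 accuracy on training data:", round(clf.score(X, y), 2))
```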
