Similar documents
Found 20 similar documents (search time: 873 ms)
1.
刘亚珺, 李兵, 李增扬, 梁鹏, 吴闽泉. Computer Science (《计算机科学》), 2017, 44(11): 15-21, 40
Software technical debt borrows the economic concept of "debt" to describe technical compromises made in software development for short-term project benefit. In the long run, however, technical debt affects software quality, cost, and development efficiency, so it needs to be managed effectively. Existing technical debt management tools are few in number and have various limitations, making effective management difficult. Mainstream integrated development environments (IDEs) are powerful and widely used, and can be put to service for technical debt management. Taking the representative IDE Visual Studio 2015 Enterprise as the research subject, this study uses C# examples to explore its ability to manage four types of technical debt directly related to code, and compares it with four dedicated technical debt management tools, so as to support technical debt management in development teams' daily practice. The results show that Visual Studio provides better technical debt management functionality and can apply multiple methods to manage, to varying degrees, the various kinds of technical debt present in a project.

2.
Technical debt is a metaphor for delayed software maintenance tasks. Incurring technical debt may bring short-term benefits to a project, but such benefits are often achieved at the cost of extra work in future, analogous to paying interest on the debt. Currently, technical debt is managed implicitly, if at all. However, on large systems, it is too easy to lose track of delayed tasks or to misunderstand their impact. Therefore, we have proposed a new approach to managing technical debt, which we believe to be helpful for software managers to make informed decisions. In this study we explored the costs of the new approach by tracking the technical debt management activities in an on-going software project. The results from the study provided insights into the impact of technical debt management on software projects. In particular, we found that there is a significant start-up cost when beginning to track and monitor technical debt, but the cost of ongoing management soon declines to very reasonable levels.

3.
Background: Software evolution is an important topic in software engineering. It generally deals with large amounts of data, as one must look at whole project histories rather than their current snapshot. Software visualization is the field of software engineering that aims to help people understand software through the use of visual resources. It can be used effectively to analyze and understand the large amount of data produced during software evolution. Objective: This study investigates Software Evolution Visualization (SEV) approaches, collecting evidence about how SEV research is structured, synthesizing current evidence on the goals of the proposed approaches, and identifying key challenges for its use in practice. Methods: A mapping study was conducted to analyze how the SEV area is structured. Selected primary studies were classified and analyzed with respect to nine research questions. Results: SEV has been used for many different purposes, especially change comprehension, change prediction, and contribution analysis. The analysis identified gaps in the studies with respect to their goals, strategies, and approaches. It also pointed to a widespread lack of empirical studies in the area. Conclusion: Researchers have proposed many SEV approaches over the past years, but some have failed to clearly state their goals, tie them back to concrete problems, or formally validate their usefulness. The identified gaps indicate that there are still many opportunities to be explored in the area.

4.
Abstract

Expert systems hold great promise for technical application areas such as medical diagnosis or engineering design. They are, we argue, less promising for management applications. The reason is that managers are not experts in the sense of possessing a formal body of knowledge which they apply. The limitations of artificial intelligence approaches in managerial domains are explained in terms of semantic change, motivating attention toward management (decision) support systems.

5.
Context: Software documents are core artifacts produced and consumed in the documentation activity of the software lifecycle. Knowledge-based approaches have been used extensively in software development for decades; however, the software engineering community lacks a comprehensive understanding of how knowledge-based approaches are used in software documentation, especially the documentation of software architecture design. Objective: The objective of this work is to explore how knowledge-based approaches are employed in software documentation, their influence on the quality of software documentation, and the costs and benefits of using these approaches. Method: We use a systematic literature review to identify the primary studies on knowledge-based approaches in software documentation, following a pre-defined review protocol. Results: Sixty studies were finally selected, in which twelve quality attributes of software documents, four cost categories, and nine benefit categories of using knowledge-based approaches in software documentation were identified. Architecture understanding is the top benefit of using knowledge-based approaches in software documentation. The cost of retrieving information from documents is the major concern when using such approaches. Conclusions: The findings of this review suggest several future research directions that are critical and promising but underexplored in current research and practice: (1) using knowledge-based approaches to improve the quality attributes of software documents that receive less attention, especially credibility, conciseness, and unambiguity; (2) applying knowledge-based approaches to the knowledge content in software documents, which gets less attention in current applications, to further improve the practice of software documentation; (3) putting more focus on the application of software documents using knowledge-based approaches (knowledge reuse, retrieval, reasoning, and sharing) in order to make the most use of software documents; and (4) evaluating the costs and benefits of using knowledge-based approaches in software documentation both qualitatively and quantitatively.

6.
Context: Formal methods, and particularly formal verification, are becoming more feasible to use in the engineering of large, highly dependable software-based systems, but have so far had little rigorous empirical study. Their artefacts and activities are different to those of conventional software engineering, and the nature and drivers of productivity for formal methods are not yet understood. Objective: To develop a research agenda for the empirical study of productivity in software projects using formal methods, and in particular formal verification. To this end we aim to identify research questions about productivity in formal methods, survey existing literature on these questions to establish their face validity, and identify metrics and data sources relevant to them. Method: We define a space of GQM goals as an investigative framework, focusing on productivity from the perspective of managers of projects using formal methods. We then derive questions for these goals using Easterbrook et al.'s (2008) taxonomy of research questions. To establish face validity, we document the literature to date that reflects on these questions and then explore possible metrics related to them. Extensive use is made of literature concerning the L4.verified project completed within NICTA, as it is one of the few projects to achieve code-level formal verification for a large-scale, industrially deployed software system. Results: We identify more than thirty research questions on the topic in need of investigation. These questions arise not just from the new type of project context, but also from the different artefacts and activities in formal methods projects. Prior literature supports the need for research on the questions in our catalogue, but as yet provides little evidence about them. Metrics that would be needed to investigate the questions are identified. Thus, although at the highest level concepts such as size, effort, and rework are common to all software projects, in the case of formal methods, measurement of these concepts at the micro level will exhibit significant differences. Conclusions: Empirical software engineering for formal methods is a large open research field. For the empirical software engineering community, our paper provides a view into the entities and research questions in this domain. For the formal methods community, we identify some of the benefits that empirical studies could bring to the effective management of large formal methods projects, and list some basic metrics and data sources that could support such studies. Understanding productivity is important in its own right for efficient software engineering practice, but can also support future research on the cost-effectiveness of formal methods and on the emerging field of Proof Engineering.

7.
Target setting in software quality function deployment (SQFD) is very important, since it is directly related to the development of high-quality products with high customer satisfaction. In practice, however, target setting is usually done subjectively, which lacks scientific rigor. Two quantitative approaches for setting target values, benchmarking and primitive linear regression, have been developed and applied in the past to overcome this problem (Akao and Yoji, 1990). But these approaches cannot be used to assess the impact of unachieved targets on the satisfaction of customer requirements. In addition, both are based on linear regression and are not very practical in many applications. In this paper, we present an innovative quantitative method of setting technical targets in SQFD that enables analysis of the impact of unachieved target values on customer satisfaction. It is based on an assessment of the impact of technical attributes on the satisfaction of customer requirements. Both linear and nonlinear regression techniques are utilized in our method, which improves on the existing quantitative methods based only on linear regression. Frank Liu is currently an associate professor and a director of the McDonnell Douglas Foundation software engineering laboratory at the University of Missouri-Rolla. He has been working on requirements engineering, software quality management, and knowledge-based software engineering since 1992. He has published about 50 papers in peer-reviewed journals and conferences in the above areas and several other software engineering application areas. He has participated, as PI or Co-PI, in research projects with total funding of more than four million dollars, sponsored by the National Science Foundation, Sandia National Laboratory, the U.S. Air Force, the University of Missouri Research Board, and Toshiba Corporation. He has served as a program committee member for many conferences. He was a program committee vice chair for the 2000 International Conference on Software Engineering and Knowledge Engineering. Kunio Noguchi is a senior quality expert in the software engineering center at Toshiba Corporation. He has published several papers in the area of quality management systems. Anuj Dhungana was an M.S. graduate student in the computer science department at Texas Tech University when he performed this research. V.V.N.S.N. Srirangam A. was an M.S. graduate student in the computer science department at Texas Tech University when he performed this research. Praveen Inuganti was an M.S. graduate student in the computer science department at the University of Missouri-Rolla when he performed this research.
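The regression idea underlying this target-setting method can be illustrated with a minimal sketch (the survey data, the attribute, and the linear form below are hypothetical assumptions, not taken from the paper): fit customer satisfaction as a function of a technical attribute, then estimate the satisfaction loss caused by an unachieved target.

```python
# Hypothetical sketch: relate one technical attribute level to customer
# satisfaction, then estimate the impact of missing a target level.
# Data and functional form are illustrative, not from the paper.

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Illustrative survey data: attribute level vs. satisfaction score (1-5).
levels       = [1.0, 2.0, 3.0, 4.0, 5.0]
satisfaction = [1.2, 2.1, 2.9, 4.2, 4.8]

a, b = fit_line(levels, satisfaction)

target, achieved = 4.5, 3.5          # planned vs. actually reached level
impact = (a + b * target) - (a + b * achieved)
print(f"estimated satisfaction loss: {impact:.2f}")
```

The paper additionally uses nonlinear regression; the same idea applies with a nonlinear satisfaction function in place of the line.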

8.
Context: Software testing is a key aspect of software reliability and quality assurance, in a context where software development constantly has to overcome mammoth challenges in a continuously changing environment. One of the characteristics of software testing is that it has a large intellectual capital component and can thus benefit from the experience gained in past projects. Software testing can therefore potentially benefit from solutions provided by the knowledge management discipline. There are in fact a number of proposals concerning effective knowledge management for several software engineering processes. Objective: We defend the use of a lessons learned system for software testing. Such a system is an effective knowledge management resource that enables testers and managers to take advantage of the experience locked away in the brains of testers. To do this, the experience has to be gathered, disseminated, and reused. Method: After analyzing the proposals for managing software testing experience, significant weaknesses were detected in current systems of this type. The architectural model proposed here for lessons learned systems is designed to avoid these weaknesses. This model (i) defines the structure of software testing lessons learned; (ii) sets up procedures for lessons learned management; and (iii) supports the design of software tools to manage the lessons learned. Results: A different approach, based on the management of the lessons that software testing engineers learn from everyday experience, with two basic goals: usefulness and applicability. Conclusion: The architectural model proposed here lays the groundwork for overcoming the obstacles to sharing and reusing experience gained in software testing and test management. As such, it provides guidance for developing software testing lessons learned systems.

9.
Technical debt is considered detrimental to the long-term success of software development, but despite the numerous studies in the literature, many aspects still need to be investigated for a better understanding of it. In particular, the main problems that hinder its complete understanding are the absence of a clear definition and of a model for its identification, management, and forecasting. Regarding forecasting, there is a growing recognition that anticipating technical debt build-up makes it possible to identify and address the riskiest debt items before they can permanently compromise a project. Despite this high relevance, however, the forecasting of technical debt is still little explored. To this end, this study evaluates whether the quality metrics of a software system can be useful for the correct prediction of technical debt. The data related to the quality metrics of 8 different open-source software systems were analyzed and supplied as input to multiple machine learning algorithms to predict technical debt. In addition, several partitions of the initial dataset were evaluated to assess whether prediction performance could be improved by performing a data selection. The results show good forecasting performance, and the proposed approach provides a useful way to understand the overall phenomenon of technical debt for practical purposes.
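The study's core idea, predicting a debt value from quality metrics, can be sketched with a toy model (the metric names, values, and the nearest-neighbour predictor below are illustrative assumptions; the study evaluates several machine learning algorithms on real data from 8 systems):

```python
# Toy sketch: predict a technical-debt figure (e.g. remediation minutes)
# from code-quality metrics. Metrics, values, and the 1-NN model are
# illustrative only, not the study's actual data or algorithms.

def euclidean(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def knn_predict(train, query):
    """Return the debt value of the nearest training point (1-NN)."""
    return min(train, key=lambda row: euclidean(row[0], query))[1]

# (complexity, duplication %, comment density) -> measured debt (minutes)
train = [
    ((2.0, 1.0, 0.30),  120),
    ((8.0, 5.0, 0.05), 2400),
    ((4.0, 2.0, 0.20),  600),
]

# A new system whose metrics resemble the high-debt training system:
print(knn_predict(train, (7.5, 4.0, 0.10)))
```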

10.
Technical debt is a metaphor for seeking short-term gains at the expense of long-term code quality. Previous studies have shown that self-admitted technical debt, which is introduced intentionally, has strong negative impacts on software development and incurs high maintenance overheads. To help developers identify self-admitted technical debt, researchers have proposed many state-of-the-art methods. However, there is still room to improve the effectiveness of current methods, as self-admitted technical debt comments are characterized by variable length, low proportion, and diverse style. Therefore, in this paper, we propose a novel approach based on bidirectional long short-term memory (BiLSTM) networks with an attention mechanism to automatically detect self-admitted technical debt by leveraging source code comments. In the BiLSTM, we utilize a balanced cross-entropy loss function to overcome the class imbalance problem. We experimentally investigate the performance of our approach on a public dataset of 62,566 code comments from ten open source projects. Experimental results show that our approach achieves, on average, 81.75% precision, 72.24% recall, and 75.86% F1-score, outperforming the state-of-the-art text mining-based method by 8.14%, 5.49%, and 6.64%, respectively.
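The balanced cross-entropy idea mentioned above can be sketched in plain Python (the weight value and the example data are illustrative assumptions; the paper's exact formulation and the BiLSTM model itself are not reproduced here): the rare debt class gets a larger weight so its errors dominate the loss.

```python
import math

def balanced_cross_entropy(y_true, y_prob, beta=0.9):
    """Weighted binary cross-entropy: the minority (debt) class gets
    weight beta, the majority class 1 - beta, to counter imbalance."""
    eps = 1e-12
    total = 0.0
    for y, p in zip(y_true, y_prob):
        p = min(max(p, eps), 1 - eps)          # avoid log(0)
        total += -(beta * y * math.log(p)
                   + (1 - beta) * (1 - y) * math.log(1 - p))
    return total / len(y_true)

# Mostly non-debt comments (label 0) with a single debt comment (label 1):
labels = [0, 0, 0, 0, 1]
probs  = [0.1, 0.2, 0.1, 0.3, 0.6]  # predicted probability of "debt"
print(round(balanced_cross_entropy(labels, probs), 4))
```

In an actual training loop this loss would be computed on the network's sigmoid outputs and backpropagated; the weighting is what keeps the frequent non-debt class from swamping the gradient.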

11.
Software product management covers both technical and business activities in the management of products, such as roadmapping and strategic, tactical, and release planning. In practice, one product manager is seldom responsible for all these activities; rather, several persons share the responsibilities. It is therefore important to understand the boundaries of product managers' work in managing software products, as well as the impact a product manager has on the company's business. The purpose of this study is to clarify what roles of software product managers exist and to understand how these roles are interrelated with each other and with the structure and business of the organization as a whole. The study is designed as an interpretative qualitative study using grounded theory as the research method. Based on the gathered data, we developed a framework that reveals the role of a product manager in the organization and shows how this role can evolve by extending the level of responsibilities. Using the framework, we identified four stereotypical roles of product managers in the studied organizations: experts, strategists, leaders, and problem solvers. The framework shows that product managers' roles are not limited to the conception of the "mini-CEO." The results allow product managers and top management to collaborate effectively by assigning responsibilities and managing expectations, using a common tool for understanding the role of product managers in the organization.

12.

Agile is often associated with a lack of architectural thinking causing technical debt, but has the advantage of user centricity and a strong focus on value. Model-driven software engineering (MDSE) performs strongly in building a quality architecture and code, but lacks focus on user requirements and tends to treat development as a monolithic whole. The combination of Agile and MDSE has been explored, but a convincing integrated method has not yet been proposed. This paper addresses this gap by exploring the specific combination of MERODE, as an example of a proven MDSE method, with Scrum, a reference agile method offering concrete (sprint-based) life cycle management on the basis of user stories. The method resulting from this integration is called Agile MERODE; it is driven by user stories, themselves associated with behavior-driven development scenarios. It allows for domain-driven design and permits fast development from domain models by means of code generation. An illustrative example clarifies the practical application of Agile MERODE, while a case study shows the planning game applied in the case's context. While the approach, in its entirety, reduces technical debt by building the architecture in a logical, consistent, and complete manner, introducing MDSE involves a trade-off with pure value-driven development. Agile MERODE contributes to the state of the art by showing how to increase user centricity in MDSE, how to align model-driven engineering with the Scrum cycle, and how to reduce the technical debt of agile developments while remaining value-focused.


13.
Context: Global Software Engineering (GSE) continues to experience substantial growth and is fundamentally different from collocated development. As a result, software managers have a pressing need for support in how to successfully manage teams in a global environment. Unfortunately, de facto process frameworks such as the Capability Maturity Model Integration (CMMI®) do not explicitly cater for the complex and changing needs of global software management. Objective: To develop a Global Teaming (GT) process area that addresses specific problems relating to temporal, cultural, geographic, and linguistic distance, and that meets the complex and changing needs of global software management. Method: We carried out three in-depth industrial case studies of GSE from 1999 to 2007. To supplement these studies we conducted three literature reviews. This allowed us to identify factors that are important to GSE. Based on a gap analysis between these GSE factors and the CMMI®, we developed the GT process area. Finally, the literature and our empirical data were used to identify threats to software projects if these processes are not implemented. Results: Our new GT process area brings together practices drawn from the GSE literature and our previous empirical work, including many socio-technical factors important to global software development. The GT process area presented in this paper encompasses recommended practices that can be used independently or with existing models. We found that managers who are not proactive in implementing new GT practices are putting their projects under threat of failure. We therefore include a list of threats that, if ignored, could have an adverse effect on an organization's competitive advantage, employee satisfaction, timescales, and software quality. Conclusion: The GT process area and associated threats presented in this paper provide both a guide and motivation for software managers to better understand how to manage technical talent across the globe.

14.
Context: Knowledge management technologies have been employed across software engineering activities for more than two decades. Knowledge-based approaches can be used to facilitate software architecting activities (e.g., architectural evaluation). However, there is no comprehensive understanding of how various knowledge-based approaches (e.g., knowledge reuse) are employed in software architecture. Objective: This work aims to collect studies on the application of knowledge-based approaches in software architecture and to classify and thematically analyze these studies, in order to identify the gaps in the existing application of knowledge-based approaches to various architecting activities, as well as promising research directions. Method: A systematic mapping study was conducted to identify and analyze the application of knowledge-based approaches in software architecture, covering papers from major databases, journals, conferences, and workshops published between January 2000 and March 2011. Results: Fifty-five studies were selected and classified according to the architecting activities they contribute to and the knowledge-based approaches employed. Knowledge capture and representation (e.g., using an ontology to describe architectural elements and their relationships) is the most popular approach employed in architecting activities. Knowledge recovery (e.g., documenting past architectural design decisions) is a neglected approach that is seldom used in software architecture. Knowledge-based approaches are mostly used in architectural evaluation, while receiving the least attention in architecture impact analysis and architectural implementation. Conclusions: The study results show an increased interest in the application of knowledge-based approaches in software architecture in recent years. A number of knowledge-based approaches, including knowledge capture and representation, reuse, sharing, recovery, and reasoning, have been employed in a spectrum of architecting activities. Knowledge-based approaches have been applied to a wide range of application domains, among which "Embedded software" has received the most attention.

15.
Context: Variability management is a key activity in software product line engineering. This paper focuses on managing rationale information during the decision-making activities that arise during variability management. By decision-making we refer to systematic problem solving by considering and evaluating various alternatives. Rationale management is a branch of science that enables decision-making based on the argumentation of stakeholders, while capturing the reasons and justifications behind these decisions. Objective: Decision-making should be supported to identify variability in domain engineering and to resolve variation points in application engineering. We capture the rationale behind variability management decisions. The captured rationale information is useful for evaluating future changes of variability models as well as for handling future instantiations of variation points. We claim that maintaining rationale will enhance the longevity of variability models. Furthermore, decisions should be made through formal communication between domain engineering and application engineering. Method: We initiate the novel area of issue-based variability management (IVM) by extending variability management with rationale management. The key contributions of this paper are: (i) an issue-based variability management methodology (IVMM), which combines questions, options and criteria (QOC) with a specific variability approach; (ii) a meta-model for IVMM and a process for variability management; and (iii) a tool for the methodology, developed by extending an open source rationale management tool. Results: Rationale approaches (e.g. questions, options and criteria) guide distributed stakeholders when selecting choices for instantiating variation points. Similarly, rationale approaches also aid the elicitation of variability and the evaluation of changes. The rationale captured within the decision-making process can be reused to make future decisions on variability. Conclusion: IVMM was evaluated comparatively through an experimental survey, which provided evidence that IVMM is more effective than a variability modeling approach that does not use issues.
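The questions-options-criteria (QOC) structure described above can be sketched as a simple data model (all names, fields, and the scoring rule here are hypothetical illustrations, not the paper's actual meta-model): each variation point raises an issue whose options are argued against explicit criteria, and the whole record is kept as rationale.

```python
from dataclasses import dataclass, field

@dataclass
class Option:
    name: str
    # criterion -> assessment score given by stakeholders (illustrative)
    assessments: dict = field(default_factory=dict)

@dataclass
class Issue:
    """A QOC-style issue raised while resolving a variation point."""
    question: str
    options: list = field(default_factory=list)
    criteria: list = field(default_factory=list)

    def best_option(self):
        # Pick the option with the highest total score over all criteria,
        # a crude stand-in for the stakeholder argumentation process.
        return max(self.options,
                   key=lambda o: sum(o.assessments.get(c, 0)
                                     for c in self.criteria))

issue = Issue(
    question="Which payment variant should this product instantiate?",
    criteria=["cost", "time-to-market"],
    options=[Option("in-house", {"cost": 1, "time-to-market": 1}),
             Option("third-party", {"cost": 2, "time-to-market": 3})],
)
print(issue.best_option().name)
```

Because the issue object retains the question, the rejected options, and the criteria, the rationale stays available when the variation point is revisited later.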

16.
Abstract

Different languages, tools, and techniques are used for the development of software systems, including database and knowledge-based systems. Although the underlying languages employ structuring concepts such as classification, modularization, generalization, and perspectives, these common concepts remain overshadowed by differing terminologies and notations, due to the separate histories of software engineering, databases, and knowledge representation. The ever more complex and ambitious requirements on software systems now call for integrated software engineering environments. As a starting point toward integration, in this paper we aim to derive a common structural level for software systems. To approach this goal we analyze the human thought process on the one hand and successfully applied structuring techniques on the other, deriving a catalogue of 10 structuring concepts. Building on that, a self-contained language called SFW (structuring framework) is introduced to provide means for a general and uniform specification of the structure of software systems. SFW is intended both as a catalogue of reference for structuring concepts in today's languages and as a suggestion for establishing a uniform structural level in future approaches.

17.
ABSTRACT

Care managers play a key role in coordinating care, especially for patients with chronic conditions. They use multiple health information technology (IT) applications in order to access, process, and communicate patient-related information. Using the work system model and its extension, the Systems Engineering Initiative for Patient Safety (SEIPS) model, we describe obstacles experienced by care managers in managing patient-related information. A web-based questionnaire was used to collect data from 80 care managers (61% response rate) located in clinics, hospitals, and a call center. Care managers were more likely to consider “inefficiencies in access to patient-related information” and “having to use multiple information systems” as major obstacles than “lack of computer training and support” and “inefficient use of case management software.” Care managers who reported “inefficient use of case management software” as an obstacle were more likely to report high workload. Future research should explore strategies used by care managers to address obstacles, and efforts should be targeted at improving the health information technologies used by care managers.

18.
Context: Service-Orientation (SO) is a rapidly emerging paradigm for the design and development of adaptive and dynamic software systems. Software Product Line Engineering (SPLE) has also gained attention as a promising and successful software reuse development paradigm over the last decade, and has proven to provide effective solutions for managing the growing complexity of software systems. Objective: This study aims at characterizing and identifying the existing research on employing and leveraging SO and SPLE. Method: We conducted a systematic mapping study to identify and analyze the related literature. We identified 81 primary studies, dated from 2000 to 2011, and classified them with respect to research focus, type of research, and contribution. Results: The mapping synthesizes the available evidence about the synergy points and integration of SO and SPLE. The analysis shows that the majority of studies focus on service variability modeling and adaptive systems employing SPLE principles and approaches. In particular, SPLE approaches, especially feature-oriented approaches for variability modeling, have been applied to the design and development of service-oriented systems, while SO is employed in software product line contexts for the realization of product lines, reconciling flexibility, scalability, and dynamism in product derivation and thereby creating dynamic software product lines. Conclusion: Our study summarizes and characterizes the SO and SPLE topics researchers have investigated over the past decade and identifies promising research directions arising from the synergy generated by integrating methods and techniques from these two areas.

19.
Context: The field of IT processes lacks a scientifically based tool that constructs organisation-specific IT processes according to an organisation's socio-technical characteristics. Objective: In this paper we propose a solution to this problem in the form of IT process engineering (ITPE). ITPE is based on established method engineering principles which we have adapted to IT process construction. Method: The tool was demonstrated by having three organisations use ITPE to each construct two IT processes. Results: ITPE provided useful guidance in all three cases. Conclusions: The study demonstrates that method engineering principles can be applied in research fields other than information system development.
