Similar Documents (20 results)
1.

Context

Although agile software development methods such as SCRUM and DSDM are gaining popularity, the consequences of applying agile principles to software product management have received little attention until now.

Objective

This paper fills that gap by introducing a method for applying SCRUM principles to software product management.

Method

A case study research approach is employed to describe and evaluate this method.

Results

This has resulted in the ‘agile requirements refinery’, an extension to the SCRUM process that enables product managers to cope with complex requirements in an agile development environment. A case study is presented to illustrate how agile methods can be applied to software product management.

Conclusions

The experiences of the case study company are provided as a set of lessons learned that will help others to apply agile principles to their software product management process.

2.

Context

Systems development normally takes place in a specific organizational context, including organizational culture. Previous research has identified organizational culture as a factor that potentially affects the deployment of systems development methods.

Objective

The purpose is to analyze the relationship between organizational culture and the post-adoption deployment of agile methods.

Method

This study is a theory development exercise. Based on the Competing Values Model of organizational culture, the paper proposes a number of hypotheses about the relationship between organizational culture and the deployment of agile methods.

Results

Inspired by agile methods, thirteen new hypotheses are introduced and discussed. They have interesting implications when contrasted with ad hoc development and with traditional systems development methods.

Conclusion

Because of the conceptual richness of organizational culture and the ambiguity of the concept of agility, the relationship between organizational culture and the deployment of agile systems development methods forms a rich and interesting research topic. Recognizing that the Competing Values Model represents just one view of organizational culture, the paper introduces a number of alternative conceptions and identifies several interesting paths for future research into the relationship between organizational culture and agile methods deployment.

3.

Context

Staff turnover in organizations is an important issue that should be taken into account, mainly for two reasons:
1. Employees carry an organization’s knowledge in their heads and take it with them wherever they go.
2. Knowledge accessibility is limited to the amount of knowledge employees want to share.

Objective

The aim of this work is to provide a set of guidelines to develop knowledge-based Process Asset Libraries (PAL) to store software engineering best practices, implemented as a wiki.

Method

Fieldwork was carried out during a 2-year training course in agile development. The approach was validated in two phases (with and without PAL), each subdivided into two stages: Training and Project.

Results

The study demonstrates that, on the one hand, the learning process can be facilitated by using PAL to transfer software process knowledge and, on the other hand, that products were developed by junior software engineers with a greater degree of independence.

Conclusion

PAL, as a knowledge repository, helps software engineers to learn about development processes and improves the use of agile processes.

4.

Context

A particular strength of agile systems development approaches is that they encourage a move away from ‘introverted’ development, involving the customer in all areas of development and leading to more innovative and hence more valuable information systems. However, a move toward open innovation requires a focus that goes beyond a single customer representative, involving a broader range of stakeholders, both inside and outside the organisation, in a continuous, systematic way.

Objective

This paper provides an in-depth discussion of the applicability and implications of open innovation in an agile environment.

Method

We draw on two illustrative cases from industry.

Results

We highlight some distinct problems that arose when two project teams tried to combine agile and open innovation principles. For example, openness is often compromised by a perceived competitive element and a lack of transparency between business units. In addition, minimal documentation often reduces effective knowledge transfer, while the use of short iterations, stand-up meetings, and the presence of an on-site customer reduce the amount of time available for sharing ideas outside the team.

Conclusion

A clear understanding of the inter- and intra-organisational applicability and implications of open innovation in agile systems development is required to address key challenges for research and practice.

5.

Context

Extreme Programming (XP) is one of the most popular agile software development methodologies. XP is defined as a consistent set of values and practices designed to work well together, but lacks practices for project management and especially for supporting the customer role. The customer representative is constantly under pressure and may experience difficulties in foreseeing the adequacy of a release plan.

Objective

To assist release planning in XP by structuring the planning problem and providing an optimization model that suggests a suitable release plan.

Method

We develop an optimization model that generates a release plan taking into account story size, business value, possible precedence relations, themes, and uncertainty in velocity prediction. The running-time feasibility is established through computational tests. In addition, we provide a practical heuristic approach to velocity estimation.
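To make the model's inputs concrete, here is a minimal sketch, assuming illustrative story names, sizes, values and precedence relations, of release planning as a capacity-constrained selection problem. The paper solves this with an exact optimization model; the greedy value-density heuristic below is only a simplified stand-in that shows the shape of the problem.

```python
# A minimal, hypothetical sketch of the release-planning inputs named
# above: story size, business value and precedence relations, selected
# against a velocity budget. The paper uses an exact optimization model;
# this greedy value-density heuristic is only an illustrative stand-in,
# and all story names and numbers are invented.

def plan_release(stories, capacity):
    """stories: dict name -> (size, value, prerequisites)."""
    chosen, remaining = [], capacity
    # Consider stories by value per unit size, honoring prerequisites.
    ranked = sorted(stories.items(),
                    key=lambda kv: kv[1][1] / kv[1][0], reverse=True)
    for name, (size, value, prereqs) in ranked:
        if size <= remaining and all(p in chosen for p in prereqs):
            chosen.append(name)
            remaining -= size
    return chosen, capacity - remaining

stories = {
    "login":   (3, 8, []),            # (size, business value, prereqs)
    "profile": (5, 6, ["login"]),
    "search":  (8, 9, []),
    "export":  (2, 3, ["search"]),
}
print(plan_release(stories, capacity=13))  # -> (['login', 'profile'], 8)
```

Note how the greedy pass skips the high-value 'search' story once capacity runs short; avoiding exactly this kind of myopic choice is what an exact optimization model buys.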

Results

Computational tests show that problems with up to six themes and 50 stories can be solved exactly. An example provides insight into uncertainties affecting velocity, and indicates that the model can be applied in practice.

Conclusion

An optimization model can be used in practice to enable the customer representative to make more informed decisions faster. This can help in adopting XP in projects where plan-driven approaches have traditionally been used.

6.

Context

Many organizations have started to deploy agile methods, but so far only a few studies exist on organization-wide transformations. Are agile methods here to stay? Some claim that agile software development methods are in the mainstream adoption phase in the software industry, while others hope that they are a passing fad. The assumption here is that if agile did not provide real improvement, adopters would be eager at first but turn pessimistic after putting it into practice.

Objective

Despite the growing amount of anecdotal evidence on the success of agile methods across a wide range of different real-life development settings, scientific studies remain scarce. Even less is known about the perception of the impacts of agile transformation when it is deployed in a very large software development environment, and whether agile methods are here to stay. This study aims to fill that gap by providing evidence from a large-scale agile transformation within Nokia. While we have yet to confirm these findings with solid quantitative data, we believe that the perception of the impacts already pinpoints the direction of the impacts of large-scale agile transformation.

Method

The data were collected using a questionnaire. The population of the study contains more than 1000 respondents in seven different countries in Europe, North America, and Asia.

Results

The results reveal that most respondents agree on all accounts with the generally claimed benefits of agile methods. These benefits include higher satisfaction, a feeling of effectiveness, increased quality and transparency, increased autonomy and happiness, and earlier detection of defects. Finally, 60% of respondents would not like to return to the old way of working.

Conclusion

While the perception of the impact of agile methods is predominantly positive, several challenge areas were discovered. However, based on this study, agile methods are here to stay.

7.

Context

Agile software development with its emphasis on producing working code through frequent releases, extensive client interactions and iterative development has emerged as an alternative to traditional plan-based software development methods. While a number of case studies have provided insights into the use and consequences of agile, few empirical studies have examined the factors that drive the adoption and use of agile.

Objective

We draw on intention-based theories and a dialectic perspective to identify factors driving the use of agile practices among adopters of this software development methodology.

Method

Data for the study was gathered through an anonymous online survey of software development professionals. We requested participation from members of a selected list of online discussion groups, and received 98 responses.

Results

Our analyses reveal that subjective norm and training play a significant role in influencing software developers’ use of agile processes and methods, while perceived benefits and perceived limitations are not primary drivers of agile use among adopters. Interestingly, perceived benefit emerges as a significant predictor of agile use only if adopters face hindrances to their agile practices.

Conclusion

We conclude that research in the adoption of software development innovations should examine the effects of both enabling and detracting factors and the interactions between them. Since training, subjective norm, and the interplay between perceived benefits and perceived hindrances appear to be key factors influencing the adoption of agile methods, researchers can focus on how to (a) perform training on agile methods more effectively, (b) facilitate the dialog between developers and managers about perceived benefits and hindrances, and (c) capitalize on subjective norm to publicize the benefits of agile methods within an organization. Further, when managing the transition to new software development methods, we recommend that practitioners adapt their strategies and tactics contingent on the extent of perceived hindrances to the change.

8.

Context

Agile information systems development (ISD) has received much attention from both the practitioner and researcher community over the last 10-15 years. However, it is still unclear what precisely constitutes agile ISD.

Objective

Based on four empirical studies conducted over a 10-year period from 1999 to 2008, the objective of this paper is to show how the meaning and practice of agile ISD have evolved over time and, on this basis, to speculate about what comes next.

Method

Four phases of research have been conducted using a grounded theory approach. For each research phase, qualitative interviews were held in American and/or Danish companies, and a grounded theory was inductively discovered through careful data analysis. Subsequently, the four resulting theories were analyzed for common themes, and a global theory was identified across the empirical data.

Results

In 1999, companies were developing software at high speed in a desperate rush to be first to market. In 2001, a new high-speed/quick-results development process had become established practice. In 2003, changes in the market created the need for a more balanced view on speed and quality, and in 2008 companies were successfully combining agile and plan-driven approaches to achieve the benefits of both. The studies reveal a two-stage pattern in which dramatic changes in the market cause disruption of established practices and process adaptations, followed by consolidation of the lessons learnt into a once again stable software development process.

Conclusion

The cyclical history of punctuated process evolution makes it possible to distinguish pre-agility from current practices (agility), and on this basis, to speculate about post-agility: a possible next cycle of software process evolution concerned with proactively pursuing the dual goal of agility and alignment through a diversity of means.

9.

Context

One of the difficulties faced by software development Project Managers is estimating the cost and schedule for new projects. Previous industry surveys have concluded that software size and cost estimation is a significant technical area of concern. In order to estimate cost and schedule it is important to have a good understanding of the size of the software product to be developed. There are a number of techniques used to derive software size, with function points being amongst the most documented.

Objective

In this paper we explore the utility of function point software sizing techniques when applied to two levels of software requirements documentation in a commercial software development organisation. The goal of the research is to appraise the value (cost/benefit) which functional sizing techniques can bring to the project planning and management of software projects within a small-to-medium sized software development enterprise (SME).

Method

Functional counts were made at the bid and detailed functional specification stages for each of the five commercial projects used in the research. Three variants of the NESMA method were used to determine these function counts. Through a structured interview session, feedback on the sizing results was obtained to evaluate their feasibility and potential future contribution to the company.
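One of the variants referred to here, the 'Estimated NESMA', can be sketched compactly: every data function is rated Low and every transactional function Average, so a count needs only the number of functions of each type. The sketch below assumes the standard NESMA/IFPUG weight tables; the example counts are invented.

```python
# A compact sketch of the 'Estimated NESMA' counting rule: every data
# function is rated Low and every transactional function Average, so
# only per-type function counts are needed. Weights follow the standard
# NESMA/IFPUG tables; the example counts are invented.

ESTIMATED_WEIGHTS = {
    "ILF": 7,  # internal logical files, rated Low
    "EIF": 5,  # external interface files, rated Low
    "EI":  4,  # external inputs, rated Average
    "EO":  5,  # external outputs, rated Average
    "EQ":  4,  # external inquiries, rated Average
}

def estimated_nesma(counts):
    """counts: dict mapping function type to number of occurrences."""
    return sum(ESTIMATED_WEIGHTS[t] * n for t, n in counts.items())

# Hypothetical bid-stage count for a small commercial project.
print(estimated_nesma({"ILF": 6, "EIF": 2, "EI": 14, "EO": 9, "EQ": 5}))
# 7*6 + 5*2 + 4*14 + 5*9 + 4*5 = 173 unadjusted function points
```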

Results

The results of our research suggest there is value in performing size estimates at two appropriate stages in the software development lifecycle, with simplified methods providing the optimal return on effort expended.

Conclusion

The ‘Estimated NESMA’ is the most appropriate tool for use in size estimation for the company studied. The use of software sizing provides a valuable contribution which would augment, but not replace, the company’s existing cost estimation approach.

10.

Context

Data warehouse conceptual design is based on the metaphor of the cube, which can be derived through either requirement-driven or data-driven methodologies. Each methodology has its own advantages. The first allows designers to obtain a conceptual schema very close to user needs, but that schema may not be supported by the data actually available. The second, on the contrary, ensures perfect traceability and consistency with the data sources (in fact, it guarantees the presence of data to be used in analytical processing) but does not guard against missing business user needs. To address this issue, the need to define hybrid methodologies for conceptual design has emerged in recent years.

Objective

The objective of the paper is to use a hybrid methodology based on different multidimensional models in order to combine the advantages of each of them.

Method

The proposed methodology integrates the requirement-driven strategy with the data-driven one, in that order, possibly performing alterations of functional dependencies on UML multidimensional schemas reconciled with data sources.

Results

As a case study, we illustrate how our methodology can be applied in a university environment. Furthermore, we quantitatively evaluate the benefits of this methodology by comparing it with some popular conventional methodologies.

Conclusion

In conclusion, we highlight how the hybrid methodology improves the quality of the conceptual schema. Finally, we outline our ongoing work on introducing automatic design techniques into the methodology on the basis of logic programming.

11.
12.

Context

Assessing software quality at the early stages of the design and development process is very difficult since most of the software quality characteristics are not directly measurable. Nonetheless, they can be derived from other measurable attributes. For this purpose, software quality prediction models have been extensively used. However, building accurate prediction models is hard due to the lack of data in the domain of software engineering. As a result, the prediction models built on one data set show a significant deterioration of their accuracy when they are used to classify new, unseen data.

Objective

The objective of this paper is to present an approach that optimizes the accuracy of software quality predictive models when used to classify new data.

Method

This paper presents an adaptive approach that takes already built predictive models and adapts them (one at a time) to new data. We use an ant colony optimization algorithm in the adaptation process. The approach is validated on stability of classes in object-oriented software systems and can easily be used for any other software quality characteristic. It can also be easily extended to work with software quality predictive problems involving more than two classification labels.
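As a rough illustration of the adaptation step, the sketch below runs a generic ant colony optimization loop that tunes a single decision threshold of an already built classifier against new data. The paper adapts full predictive models and its encoding is richer than this; the candidate grid, the accuracy-based pheromone reward and all parameters here are assumptions made for the example.

```python
import random

# A generic ant colony optimization loop, sketched to show how one
# parameter of an already built classifier (here a single decision
# threshold) might be adapted to new data. The paper adapts full
# predictive models; the candidate grid, the accuracy-based pheromone
# reward and all parameters below are invented for illustration.

def aco_tune_threshold(xs, ys, candidates, ants=10, rounds=30,
                       evaporation=0.5, seed=0):
    rng = random.Random(seed)
    pheromone = [1.0] * len(candidates)
    best_thr, best_acc = None, -1.0
    for _ in range(rounds):
        deposits = [0.0] * len(candidates)
        for _ in range(ants):
            # Each ant picks a candidate with probability proportional
            # to its pheromone level.
            i = rng.choices(range(len(candidates)), weights=pheromone)[0]
            acc = sum((x > candidates[i]) == y
                      for x, y in zip(xs, ys)) / len(xs)
            deposits[i] += acc              # reward accuracy on new data
            if acc > best_acc:
                best_thr, best_acc = candidates[i], acc
        pheromone = [(1 - evaporation) * p + d
                     for p, d in zip(pheromone, deposits)]
    return best_thr, best_acc

# Toy 'new, unseen' data: the positive class sits above roughly 0.6.
xs = [0.1, 0.3, 0.45, 0.62, 0.7, 0.9]
ys = [False, False, False, True, True, True]
print(aco_tune_threshold(xs, ys, candidates=[0.2, 0.4, 0.5, 0.6, 0.8]))
```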

Results

Results show that our approach outperforms the machine learning algorithm C4.5 as well as random guessing. It also preserves the expressiveness of the models, which provide not only the classification label but also guidelines on how to attain it.

Conclusion

Our approach is an adaptive one that can be seen as taking predictive models that have already been built from common domain data and adapting them to context-specific data. This is suitable for the domain of software quality, since data is very scarce and hence predictive models built from one data set are hard to generalize and reuse on new data.

13.

Context

Writing software for the current generation of parallel systems requires significant programmer effort, and the community is seeking alternatives that reduce effort while still achieving good performance.

Objective

Measure the effect of parallel programming models (message-passing vs. PRAM-like) on programmer effort.

Design, setting, and subjects

One group of subjects implemented sparse-matrix dense-vector multiplication using message-passing (MPI), and a second group solved the same problem using a PRAM-like model (XMTC). The subjects were students in two graduate-level classes: one class was taught MPI and the other was taught XMTC.
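For reference, this is the kernel both groups implemented, written here as a plain sequential Python version over the compressed sparse row (CSR) layout; the subjects' parallel MPI and XMTC sources are not reproduced in this summary, so the sketch only fixes the problem being solved.

```python
# Sequential reference for sparse-matrix dense-vector multiplication in
# compressed sparse row (CSR) form: the problem the study's subjects
# parallelized in MPI and XMTC.

def spmv_csr(values, col_idx, row_ptr, x):
    """y = A @ x for a sparse matrix A stored in CSR form."""
    y = [0.0] * (len(row_ptr) - 1)
    for row in range(len(y)):
        # Nonzeros of this row occupy values[row_ptr[row]:row_ptr[row+1]].
        for k in range(row_ptr[row], row_ptr[row + 1]):
            y[row] += values[k] * x[col_idx[k]]
    return y

# 3x3 example: [[2, 0, 1], [0, 3, 0], [4, 0, 5]] times [1, 1, 1].
print(spmv_csr(values=[2.0, 1.0, 3.0, 4.0, 5.0],
               col_idx=[0, 2, 1, 0, 2],
               row_ptr=[0, 2, 3, 5],
               x=[1.0, 1.0, 1.0]))  # -> [3.0, 3.0, 9.0]
```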

Main outcome measures

Development time, program correctness.

Results

Mean XMTC development time was 4.8 h less than mean MPI development time (95% confidence interval, 2.0-7.7), a 46% reduction. XMTC programs were more likely to be correct, but the difference in correctness rates was not statistically significant (p = .16).

Conclusions

XMTC solutions for this particular problem required less effort than MPI equivalents, but further studies examining different types of problems and different levels of programmer experience are necessary.

14.

Purpose

The purpose of this paper is to characterize reconciliation among the plan-driven, agile, and free/open source software models of software development.

Design/methodology/approach

An automated quasi-systematic review identified 42 papers, which were then analyzed.

Findings

The main findings are: there exist distinct levels of reconciliation (organization, group, and process); few studies deal with reconciliation among all three models of development; a significant amount of work addresses reconciliation between plan-driven and agile development; several large organizations (such as Microsoft, Motorola, and Philips) are interested in trying to combine these models; and reconciliation among software development models is still an open issue, since it is an emerging area and research on most proposals is at an early stage.

Research limitations

Automated searches may not capture relevant papers in publications that are not indexed. Other data sources not amenable to execution of the protocol were not used. Data extraction was performed by only one researcher, which may increase the risk of threats to internal validity.

Implications

This characterization is important for practitioners wanting to stay current with the state of research. This review will also assist the scientific community working with software development processes to build a common understanding of the challenges that must be faced, and to identify areas where research is lacking. Finally, the results will be useful to the software industry, which is calling for solutions in this area.

Originality/value

There is no other systematic review on this subject, and reconciliation among software development models is an emerging area. This study helps to identify and consolidate the work done so far and to guide future research. The conclusions are an important step towards expanding the body of knowledge in the field.

15.

Context

During development, managers, analysts and designers often need to know whether enough requirements analysis work has been done and whether or not it is safe to proceed to the design stage.

Objective

This paper describes a new, simple and practical method for assessing our confidence in a set of requirements.

Method

We identified four confidence factors and used a goal-oriented framework with a simple ordinal scale to develop a method for assessing confidence. We illustrate the method and show how it has been applied to a real systems development project.
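Since the paper's four factors and scale are not spelled out here, the following sketch is hypothetical throughout: invented factor names, an assumed 0-3 ordinal scale, and a weakest-link aggregation rule, shown only to make the idea of a meta-level ordinal assessment concrete.

```python
# A hypothetical sketch of a goal-oriented, ordinal confidence
# assessment in the spirit of the method above. The paper's actual four
# factors and scale are not reproduced here: the factor names, the 0-3
# scale and the weakest-link aggregation rule are invented.

SCALE = {0: "none", 1: "low", 2: "medium", 3: "high"}

def requirements_confidence(factor_scores):
    """factor_scores: dict factor name -> ordinal rating in 0..3.
    Aggregate pessimistically: overall confidence is only as strong
    as the weakest factor."""
    return min(factor_scores.values())

scores = {
    "stakeholders consulted": 3,   # invented factor names
    "requirements stability": 2,
    "testability": 1,
    "traceability": 2,
}
print(SCALE[requirements_confidence(scores)])  # -> low
```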

Results

We show how assessing confidence in the requirements could have revealed problems in this project earlier and so saved both time and money.

Conclusion

Our meta-level assessment of requirements provides a practical and pragmatic method that can prove useful to managers, analysts and designers who need to know when sufficient requirements analysis has been performed.

16.

Context

In order to ensure high quality of a process model repository, refactoring operations can be applied to correct anti-patterns, such as overlap of process models, inconsistent labeling of activities and overly complex models. However, if a process model collection is created and maintained by different people over a longer period of time, manual detection of such refactoring opportunities becomes difficult, simply due to the number of processes in the repository. Consequently, there is a need for techniques to detect refactoring opportunities automatically.

Objective

This paper proposes a technique for automatically detecting refactoring opportunities.

Method

We developed the technique based on metrics that can be used to measure the consistency of activity labels as well as the extent to which processes overlap and the type of overlap that they have. We evaluated it by applying it to two large process model repositories.
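One plausible instantiation of a label-consistency check is sketched below, to show how refactoring candidates might be flagged automatically; the paper defines its own metrics, and the word-level Jaccard similarity and 0.5 threshold used here are illustrative assumptions.

```python
# One plausible instantiation of a label-consistency metric, sketched to
# show how refactoring opportunities could be pinpointed automatically.
# The paper defines its own metrics; the word-level Jaccard similarity
# and the 0.5 threshold below are illustrative choices.

def jaccard(label_a, label_b):
    """Word-set overlap between two activity labels, in [0, 1]."""
    a, b = set(label_a.lower().split()), set(label_b.lower().split())
    return len(a & b) / len(a | b)

def near_duplicate_labels(labels, threshold=0.5):
    """Flag similar but non-identical label pairs: candidates for
    harmonization by refactoring."""
    hits = []
    for i, x in enumerate(labels):
        for y in labels[i + 1:]:
            if x != y and jaccard(x, y) >= threshold:
                hits.append((x, y))
    return hits

repo_labels = ["Check invoice", "Invoice check", "Approve order",
               "Order approval", "Ship goods"]
print(near_duplicate_labels(repo_labels))
# -> [('Check invoice', 'Invoice check')]
```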

Results

The evaluation shows that the technique can be used to pinpoint the approximate location of three types of refactoring opportunities with high precision and recall and of one type of refactoring opportunity with high recall, but low precision.

Conclusion

We conclude that the technique presented in this paper can be used in practice to automatically detect a number of anti-patterns that can be corrected by refactoring.

17.
We consider scheduling of unit-length jobs with release times and deadlines, where the objective is to minimize the number of gaps in the schedule. Polynomial-time algorithms for this problem are known, yet they are rather inefficient, with the best algorithm running in time \(O(n^4)\) and requiring \(O(n^3)\) memory. We present a greedy algorithm that approximates the optimum solution within a factor of 2 and show that our analysis is tight. Our algorithm runs in time \(O(n^2 \log n)\) and needs only O(n) memory. In fact, the running time is \(O(n (g^*+1)\log n)\), where \(g^*\) is the minimum number of gaps.
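To fix the problem setting, here is a baseline sketch: an earliest-deadline-first scheduler over integer slots that produces a feasible schedule for unit jobs and counts its gaps. It is deliberately not the paper's 2-approximation greedy, only an illustration of the instance format and the gap objective.

```python
import heapq

# Baseline sketch for the gap-minimization problem: unit-length jobs on
# integer time slots, each with a release time and a deadline. This
# earliest-deadline-first pass yields a feasible schedule and counts its
# gaps; it is not the paper's 2-approximation greedy, which chooses
# slots so as to minimize the number of gaps.

def edf_schedule(jobs):
    """jobs: list of (release, deadline); a job must run in one slot t
    with release <= t < deadline. Returns the sorted busy slots."""
    jobs = sorted(jobs)                        # by release time
    slots, pending, i = [], [], 0
    t = jobs[0][0]
    while i < len(jobs) or pending:
        while i < len(jobs) and jobs[i][0] <= t:
            heapq.heappush(pending, jobs[i][1])    # keyed by deadline
            i += 1
        if pending:
            d = heapq.heappop(pending)
            if d <= t:
                raise ValueError("infeasible instance")
            slots.append(t)
            t += 1
        else:
            t = jobs[i][0]                     # idle: jump to next release
    return slots

def count_gaps(slots):
    return sum(1 for a, b in zip(slots, slots[1:]) if b > a + 1)

jobs = [(0, 3), (0, 2), (5, 7), (6, 8)]
busy = edf_schedule(jobs)
print(busy, count_gaps(busy))  # -> [0, 1, 5, 6] 1
```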

18.

Context

Software product lines (SPL) are used in industry to achieve more efficient software development. However, the testing side of SPL is underdeveloped.

Objective

This study aims at surveying existing research on SPL testing in order to identify useful approaches and needs for future research.

Method

A systematic mapping study was conducted to find as much literature as possible, and the 64 papers found were classified with respect to focus, research type and contribution type.

Results

A majority of the papers are of the proposal research type (64%). System testing is the largest group with respect to research focus (40%), followed by management (23%). Method contributions are in the majority.

Conclusions

More validation and evaluation research is needed to provide a better foundation for SPL testing.

19.

Context

Offshore software development outsourcing is a modern business strategy for developing high quality software at low cost.

Objective

The objective of this research paper is to identify and analyse factors that are important in terms of the competitiveness of vendor organisations in attracting outsourcing projects.

Method

We performed a systematic literature review (SLR) by applying customised search strings derived from our research questions. We performed all the SLR steps, such as protocol development, initial selection, final selection, quality assessment, data extraction and data synthesis.

Results

We have identified factors such as cost-saving, skilled human resource, appropriate infrastructure, quality of products and services, efficient outsourcing relationship management, and an organisation’s track record of successful projects as those generally considered important by outsourcing clients. Our results indicate that appropriate infrastructure, cost-saving, and skilled human resource are common across three continents, namely Asia, North America and Europe. We identified appropriate infrastructure, cost-saving, and quality of products and services as common across the three sizes of organisation (small, medium and large). We also identified four factors (appropriate infrastructure, cost-saving, quality of products and services, and skilled human resource) as common across the two decades studied (1990-1999 and 2000-mid 2008).

Conclusions

Cost-saving should not be considered the driving factor in the selection of software development outsourcing vendors. Vendors should instead address other factors in order to compete in the OSDO business, such as skilled human resource, appropriate infrastructure and quality of products and services.

20.

Context

An operational test is a system test that examines whether all software and hardware components comply with the requirements given to a system deployed in an operational environment.

Objective

The objective is a lightweight profiling method that makes operational testing feasible for embedded systems with severe resource restrictions.

Method

We focus on the Process Control Block as the optimal location for monitoring the execution of all processes. We propose a profiling method that collects the runtime execution information of processes, without disturbing the system’s operational environment, by tapping into the Process Control Block information. Applying the proposed method to detect runtime memory faults, we developed the operational testing tool AMOS v1.0, which is currently being used in the automobile industry.
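AMOS itself taps kernel-level Process Control Block data. As a loose user-space analogy on Linux, the sketch below polls a target process's memory counters from /proc without instrumenting or pausing it, which is the non-intrusiveness property operational testing requires; the sampled fields and the leak heuristic are illustrative.

```python
import time

# Loose user-space analogy to PCB-based profiling on Linux: poll a
# target process's memory counters from /proc without instrumenting or
# pausing it. This is not the AMOS tool; the sampled fields and the
# growth heuristic are illustrative only.

def sample_memory(pid):
    """Return (total, resident) page counts from /proc/<pid>/statm."""
    with open(f"/proc/{pid}/statm") as f:
        size, resident = f.read().split()[:2]
    return int(size), int(resident)

def watch_for_growth(pid, samples=10, interval=1.0):
    """Sample the resident set periodically and flag monotonic growth,
    a crude hint of a possible memory leak."""
    history = []
    for _ in range(samples):
        history.append(sample_memory(pid)[1])
        time.sleep(interval)
    grew = all(b >= a for a, b in zip(history, history[1:]))
    if grew and history[-1] > history[0]:
        print(f"pid {pid}: resident set grew "
              f"{history[0]} -> {history[-1]} pages")
    return history

# Example (needs a live pid): watch_for_growth(1234, samples=5)
```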

Results

An industrial field study on 23 models of car-infotainment systems revealed a total of 519 memory faults, while slowing the system down by only 0.084-0.132×. We conducted a comparative analysis against representative runtime memory fault detection tools. The analysis shows that our proposed method has relatively low overhead and meets the requirements for operational testing, while the other methods failed to satisfy the operational test conditions.

Conclusion

We conclude that a lightweight profiling method for embedded system operational testing can be built around the Process Control Block.
