Similar Documents
20 similar documents found (search time: 0 ms)
1.
Systematic software reuse is proposed to increase productivity and software quality and lead to economic benefits. Reports of successful software reuse programs in industry have been published. However, there has been little effort to organize the evidence systematically and appraise it. This review aims to assess the effects of software reuse in industrial contexts. Journals and major conferences between 1994 and 2005 were searched to find observational studies and experiments conducted in industry, returning eleven papers of observational type. Systematic software reuse is significantly related to lower problem (defect, fault or error) density in five studies and to decreased effort spent on correcting problems in three studies. The review found evidence for significant gains in apparent productivity in three studies. Other significant benefits of software reuse were reported in single studies or the results were inconsistent. Evidence from industry is sparse and combining results was done by vote-counting. Researchers should pay more attention to using comparable metrics, performing longitudinal studies, and explaining the results and impact on industry. For industry, evaluating reuse of COTS or OSS components, integrating reuse activities in software processes, better data collection and evaluating return on investment are major challenges.
Reidar Conradi

Parastoo Mohagheghi is a researcher at SINTEF, Department of Information and Communication Technology (ICT). She received her Ph.D. from the Norwegian University of Science and Technology in 2004 and worked there before joining SINTEF. She also has industry experience from Ericsson in Norway. Her research interests include software quality, model-driven development, software reuse, measurement and empirical software engineering. She is a member of IEEE and ACM.

Reidar Conradi received his Ph.D. in Computer Science from the Norwegian University of Science and Technology (NTNU) in 1976. From 1972 to 1975 he worked at SINTEF as a researcher. He has been an assistant professor at NTNU since 1975 and a full professor since 1985. He has participated in many national and EU projects, chaired workshops and conferences, and edited several books. His research interests are in software engineering, object-oriented methods and software reuse, distributed systems, software evolution and configuration management, software quality and software process improvement. He is a member of the IEEE Computer Society and ACM.
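The vote-counting synthesis used in the review above can be illustrated with a small sketch: each study contributes one "vote" per outcome it reports as significant, and votes are tallied per metric and direction. The study labels and data below are illustrative stand-ins, not the review's actual raw data.

```python
from collections import Counter

# Hypothetical study-level findings: (study id, outcome metric, direction
# of the significant effect observed under systematic reuse).
findings = [
    ("S1", "problem density", "lower"), ("S2", "problem density", "lower"),
    ("S3", "problem density", "lower"), ("S4", "problem density", "lower"),
    ("S5", "problem density", "lower"),
    ("S6", "correction effort", "lower"), ("S7", "correction effort", "lower"),
    ("S8", "correction effort", "lower"),
    ("S9", "apparent productivity", "higher"),
    ("S10", "apparent productivity", "higher"),
    ("S11", "apparent productivity", "higher"),
]

# Vote counting: tally how many studies report each (metric, direction).
votes = Counter((metric, direction) for _, metric, direction in findings)
for (metric, direction), n in sorted(votes.items()):
    print(f"{n} studies: significantly {direction} {metric}")
```

Vote counting only aggregates the direction and significance of results, which is why the review calls for comparable metrics that would allow a proper meta-analysis instead.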

2.
As with all information technologies, there is a necessity to determine the profitability of investments in Radio Frequency Identification (RFID) ex ante. A particularly important aspect is the challenging task of evaluating the multi-faceted benefits of RFID deployments. While a large body of research on RFID benefits exists, our literature review indicates the absence of a comprehensive approach. We introduce a framework that combines the benefit evaluation steps of identification, forecasting and assessment. Based on insights gained in a 3-year research project with case studies in logistics, we refine a process-based IT-benefits classification and subsequently derive six types of RFID benefits that support the systematic identification of benefits, as well as the selection of forecast and assessment methods. We discuss how our framework can facilitate and enhance RFID investment decisions and guide future research activities.

3.

The idea of optimization can be regarded as an important basis of many disciplines and hence is extremely useful for a large number of research fields, particularly for artificial-intelligence-based advanced control design. Due to the difficulty of solving optimal control problems for general nonlinear systems, it is necessary to establish novel learning strategies with intelligent components. In addition, the rapid development of computing and networking techniques has promoted research on optimal control in the discrete-time domain. In this paper, the bases, the derivation, and recent progress of critic intelligence for discrete-time advanced optimal control design are presented, with an emphasis on the iterative framework. In particular, the so-called critic intelligence methodology, which integrates learning approximators and the reinforcement formulation, is highlighted.


4.
We develop a multi-objective economic model predictive control (m-econ MPC) framework to control and optimize a nonlinear mechanical pulping (MP) process. M-econ MPC interprets economic MPC as a multi-objective optimization problem that trades off economic and set-point tracking performance. This interpretation allows us to construct a stabilizing constraint that guarantees closed-loop stability. The framework infers unmeasured states of the MP process (associated with product consistency) by using a moving horizon estimator (MHE). The MP process dynamics are described by using a nonlinear Wiener model. Examples from a two-stage high-consistency MP process are employed to demonstrate that significant improvements in economic performance are achievable.
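The core trade-off in m-econ MPC, weighting an economic objective against set-point tracking over the prediction horizon, can be sketched as a scalarized cost. This is a minimal illustration with made-up names and numbers, not the paper's actual formulation (which also includes the stabilizing constraint, the MHE, and the Wiener model).

```python
# Minimal sketch: a weighted sum of an economic stage cost and a
# set-point tracking cost over a short prediction horizon.
# All variable names and numbers are illustrative.
def m_econ_mpc_cost(u, y, y_sp, price, alpha):
    """alpha in [0, 1] trades off economics (alpha=1) vs tracking (alpha=0)."""
    economic = sum(price * u_k for u_k in u)        # e.g. specific energy cost of inputs
    tracking = sum((y_k - y_sp) ** 2 for y_k in y)  # consistency tracking penalty
    return alpha * economic + (1 - alpha) * tracking

u = [1.0, 0.9, 0.8]       # planned refiner loads over the horizon
y = [0.95, 1.00, 1.02]    # predicted product consistency
cost = m_econ_mpc_cost(u, y, y_sp=1.0, price=2.0, alpha=0.5)
print(round(cost, 5))
```

In a real m-econ MPC this scalarized cost would be minimized over `u` subject to the plant model and the stabilizing constraint; here only the cost evaluation is shown.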

5.
We propose that considering four categories of task factors can facilitate knowledge elicitation efforts in the analysis of complex cognitive tasks: materials, strategies, knowledge characteristics, and goals. A study was conducted to examine the effects of altering aspects of two of these task categories on problem-solving behavior across skill levels: materials and goals. Two versions of an applied engineering problem were presented to expert, intermediate, and novice participants. Participants were to minimize the cost of running a steam generation facility by adjusting steam generation levels and flows. One version was cast in the form of a dynamic, computer-based simulation that provided immediate feedback on flows, costs, and constraint violations, thus incorporating key variable dynamics of the problem context. The other version was cast as a static computer-based model, with no dynamic components, cost feedback, or constraint checking. Experts performed better than the other groups across material conditions, and, when required, the presentation of the goal assisted the experts more than the other groups. The static group generated richer protocols than the dynamic group, but the dynamic group solved the problem in significantly less time. Little effect of feedback was found for intermediates, and none for novices. We conclude that demonstrating differences in performance in this task requires different materials than explicating underlying knowledge that leads to performance. We also conclude that substantial knowledge is required to exploit the information yielded by the dynamic form of the task or the explicit solution goal. This simple model can help to identify the contextual factors that influence elicitation and specification of knowledge, which is essential in the engineering of joint cognitive systems.

6.
Most authors suggest that the main benefits of Office Automation are the savings in work-time made possible by these new systems. The assumption underlying this approach is that the productivity of office work is currently low because, to a large extent, office activities are carried out manually. Thus, in this approach, office work is analyzed as a group of time-consuming activities. This traditional approach is limited; we present here a more realistic and powerful framework that considers the real nature of the different benefits resulting from the application of OA technology in the organization. In this approach, a categorization of benefits is presented as a useful guide for anyone who has to decide on investments in new office systems.

7.
DMOS: An Industrial Optimization Software Suite Based on Multiple Data Mining Algorithms
Based on our many years of experience in optimization work in the oil refining, chemical, and metallurgical industries, and with reference to the working models of leading international advanced control and optimization engineering companies, we have developed DMOS, a software suite for production process optimization, fault diagnosis, new product development and formula design. The DMOS suite comprises two categories, development software and application software. The former includes a data mining method library, in which various pattern recognition methods, support vector machine algorithms, linear and nonlinear regression, and artificial neural networks form a unified information-processing workflow; it can process user data and produce DMOS application software tailored to the user's needs. The latter includes a database, a model library, and a simplified method library, and can directly provide open-loop optimization guidance or online control of production. The DMOS software suite lays the groundwork for the engineering deployment of production process optimization in the chemical, oil refining, and steel industries.

8.
Quality is one of the main concerns in today's systems and software development and use. One important instrument in verification is the use of formal methods, which means that requirements and designs are analyzed formally to determine their relationships. Furthermore, since professional software design is to an increasing extent a distributed process, the issue of integrating different systems into a whole is of great importance in modern system development and design. Various candidates for formalizing system development and integration have emerged, but very often, particularly for dynamic conflict detection, these introduce non-standard objects and formalisms, leading to severe confusion regarding both the semantics and the computability. In contrast, we introduce a framework for defining requirement fulfillment by designs, detecting conflicts of various kinds, and integrating heterogeneous schemata. The framework transcends ordinary logical consequence, as it takes into account static and dynamic aspects of design consistency and, in particular, the specific features of the state space of a specification. Another feature of the approach is that it provides a unifying framework for design conflict analysis and schema integration.

9.
Context: Ubiquitous Computing (or UbiComp) represents a paradigm in which information processing is thoroughly integrated into everyday objects and activities. From a Software Engineering point of view, this development scenario brings new challenges in tailoring or building software processes, impacting current software technologies. However, it has not yet been explicitly shown how to characterize a software project from the perspective of ubiquitous computing.

Objective: This paper presents a conceptual framework to support the characterization of ubiquitous software projects according to their level of adherence to ubiquity. It also applies this characterization approach to a set of projects, aiming to observe their adherence to ubiquitous computing principles.

Method: We followed a research strategy based on systematic reviews and surveys to acquire UbiComp knowledge and organize a conceptual framework for ubiquitous computing that can be used to characterize UbiComp software projects, and we demonstrate its application by characterizing several software projects.

Results: Ubiquitous computing encapsulates at least 11 different high-abstraction-level characteristics represented by 123 functional and 45 restrictive factors. Based on this, a checklist was organized to allow the characterization of ubiquitous software projects; it has been applied to 26 ubiquitous software projects from four different application domains (ambient intelligence, pervasive healthcare, u-learning, and urban spaces). No project was shown to support more than 65% of the characteristics set. Service omnipresence was observed in all of these projects. However, some characteristics, although identified as necessary in the checklist, were not identified in any of them.

Conclusion: There are characteristics that identify a software project as ubiquitous. However, a ubiquitous software project does not necessarily have to implement all of them. The application domain can influence which UbiComp characteristics appear in software projects, promoting an increase in their adherence to UbiComp and thus calling for additional software technologies to deal with these ubiquitous requirements.
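A checklist-based adherence score like the one described can be sketched as the fraction of UbiComp characteristics a project supports. The characteristic names below are illustrative stand-ins, not the paper's actual checklist items.

```python
# Hypothetical UbiComp characteristics checklist (11 items, matching the
# count reported above; the names themselves are made up for illustration).
UBICOMP_CHARACTERISTICS = [
    "service_omnipresence", "context_sensitivity", "adaptable_behavior",
    "experience_capture", "function_composition", "spontaneous_interoperability",
    "device_heterogeneity", "fault_tolerance", "invisibility",
    "privacy_and_trust", "quality_of_service",
]

def adherence(supported):
    """Fraction of the checklist characteristics present in `supported`."""
    hits = sum(1 for c in UBICOMP_CHARACTERISTICS if c in supported)
    return hits / len(UBICOMP_CHARACTERISTICS)

# A hypothetical project supporting 6 of the 11 characteristics.
project = {"service_omnipresence", "context_sensitivity", "adaptable_behavior",
           "device_heterogeneity", "invisibility", "privacy_and_trust"}
print(f"adherence: {adherence(project):.0%}")  # adherence: 55%
```

On this scale, the paper's finding that no project exceeded 65% would correspond to no project supporting more than 7 of the 11 characteristics.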

10.
The use of web services in industrial automation, e.g. in fully automated production processes like car manufacturing, promises simplified interaction among the manufacturing devices due to standardized protocols and increased flexibility with respect to process implementation and reengineering. Moreover, the adoption of web services as a seamless communication backbone within the overall industrial enterprise has additional benefits, such as simplified interaction with suppliers and customers (i.e. horizontal integration) and avoidance of a break in the communication paradigm within the enterprise (i.e. vertical integration). The Time-Constrained Services (TiCS) framework is a development and execution environment that empowers automation engineers to develop, deploy, publish, compose, and invoke time-constrained web services. TiCS consists of four functional layers—tool support layer, real-time infrastructural layer, real-time service layer, and hardware layer—which contain several components to meet the demands of a web service based automation infrastructure. This article gives an overview of the TiCS framework. More precisely, the general design considerations and an architectural blueprint of the TiCS framework are presented. Subsequently, selected key components of the TiCS framework are discussed in detail: the SOAP4PLC engine for equipping programmable logic controllers with a web service interface, the SOAP4IPC engine for processing web services in real-time on industrial PCs, the WS-TemporalPolicy language for describing time constraints, and the TiCS Modeler for composing time-constrained web services into a time-constrained BPEL4WS workflow.

11.
A novel approach to progressively improving the economic performance of model predictive control (MPC) systems is developed. The conventional LQG-based economic performance design yields only an estimate that may not be attainable by the controller, whereas the proposed approach produces a design performance that the controller can achieve. Optimal performance is obtained by solving the economic performance design (EPD) problem and optimizing the MPC performance iteratively, in contrast to the original EPD, which involves a nonlinear LQG curve relationship. Based on current operating data from the MPC, the EPD problem is transformed into a linear programming problem. With an iterative learning control (ILC) strategy, the EPD problem is solved at each trial to update the tuning parameter and the design condition; MPC is then run under the condition prescribed by EPD. The ILC strategy adjusts the tuning parameter based on sensitivity analysis, and the convergence of EPD under the proposed ILC is proved. The strategy can be applied to industrial processes to keep enhancing performance and to obtain the achievable optimal EPD. The performance of the proposed method is illustrated on a SISO numerical system as well as a MIMO industrial process.
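The trial-to-trial ILC update of a tuning parameter via sensitivity analysis can be sketched with a toy example. The quadratic benefit curve and the gain below are purely illustrative; the paper's EPD involves an LP solved from plant operating data, not this closed-form objective.

```python
# Hedged sketch of an ILC-style tuning loop: at each trial, estimate the
# sensitivity of an economic objective to the tuning parameter and move
# the parameter uphill. The objective here is a toy stand-in with its
# optimum at theta = 2.0; it is not the paper's EPD formulation.
def economic_objective(theta):
    return -(theta - 2.0) ** 2 + 10.0

def ilc_tune(theta0, gain=0.4, trials=30, eps=1e-4):
    theta = theta0
    for _ in range(trials):
        # finite-difference sensitivity analysis of the objective
        sens = (economic_objective(theta + eps)
                - economic_objective(theta - eps)) / (2 * eps)
        theta += gain * sens  # trial-to-trial update of the tuning parameter
    return theta

print(round(ilc_tune(0.0), 3))  # converges to the optimum at 2.0
```

Each loop iteration stands in for one "trial": solve for the sensitivity, update the tuning parameter, and re-run the controller under the updated design condition.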

12.

Robotic process automation is a disruptive technology to automate already digital yet manual tasks and subprocesses as well as whole business processes rapidly. In contrast to other process automation technologies, robotic process automation is lightweight and only accesses the presentation layer of IT systems to mimic human behavior. Due to the novelty of robotic process automation and the varying approaches when implementing the technology, there are reports that up to 50% of robotic process automation projects fail. To tackle this issue, we use a design science research approach to develop a framework for the implementation of robotic process automation projects. We analyzed 35 reports on real-life projects to derive a preliminary sequential model. Then, we performed multiple expert interviews and workshops to validate and refine our model. The result is a framework with variable stages that offers guidelines with enough flexibility to be applicable in complex and heterogeneous corporate environments as well as for small and medium-sized companies. It is structured by the three phases of initialization, implementation, and scaling. They comprise eleven stages relevant during a project and as a continuous cycle spanning individual projects. Together they structure how to manage knowledge and support processes for the execution of robotic process automation implementation projects.


13.
Requirements Engineering - Organizations have increasingly applied agile project management; however, they face challenges in scaling up this approach to large projects. Thus, this study...

14.
Literature tends to discuss software (and system) requirements quality control, which includes validation and verification, as a heterogeneous process using a great variety of relatively independent techniques. Also, process-oriented thinking prevails. In this paper, we attempt to promote the point that this important activity must be studied as a coherent entity. It cannot be seen as a rather mechanical process of checking documents either. Validation, especially, is more an issue of communicating requirements, as constructed by the analysts, back to the stakeholders whose goals those requirements are supposed to meet, and to all those other stakeholders, with whose goals those requirements may conflict. The main problem, therefore, is that of achieving a sufficient level of understanding of the stated requirements by a particular stakeholder, which may be hindered by, for example, lack of technical expertise. In this paper, we develop a unifying framework for requirements quality control. We reorganize the existing knowledge around the issue of communicating requirements to all the different stakeholders, instead of just focusing on some techniques and processes. We hope that this framework could clarify thinking in the area, and make future research a little more focused.

15.
Planning and scheduling activities have a significant impact on the performance of manufacturing enterprises. Throughout the 1980s there was a belief that computer-led solutions would “solve” complex industrial planning and scheduling problems. However, in most manufacturing organizations, planning and scheduling still require significant human support to ensure effective performance. Although the contribution of these human resources is often highly valued, we are only beginning to develop a coherent body of knowledge that can contribute toward the successful integration of human and computer-based planning and scheduling systems. Here we examine the state of knowledge in this domain and identify the need for field investigations. We present a framework to facilitate research in human and organizational issues in planning and scheduling in manufacturing. A structured and detailed set of research questions is developed to underpin field studies. The framework focuses on understanding the scheduling environment, the process of scheduling, and related performance issues. The application of the framework is illustrated using our own field studies, where a number of specific research questions of practical importance have been identified: what scheduling is, who carries it out, what influences scheduling practice and performance, how schedulers actually schedule, what makes a good scheduler and schedule, and what support is needed. The framework makes a valuable contribution to advancing knowledge in an area of real practical benefit to contemporary manufacturing industry. © 2001 John Wiley & Sons, Inc.

16.
Shop floor controllers have been applied widely in the integration and control of the information flow in computer integrated manufacturing systems. Among the developed control frameworks, hierarchical control structure is the most popular and practical. One of the major arguments about this framework is that computers with strong computing power should be applied in the central controller, and computers with less computing power can be applied in the local controllers. However, very few experimental analyses support this argument. In this research, the information loading for each level of controller is used to examine this argument. A coloured stochastic Petri nets model was developed to describe the information flow in a hierarchical structure. The developed model was simulated and the information loading for each controller can be estimated on the basis of the traffic of the tokens in the net. The simulation results have confirmed the above argument. In addition, the required computation resources for certain functions in hierarchical frameworks were also estimated. These data can be used for replanning the functions performed in each level of the controllers in order to improve the total performance of a hierarchical shop floor controller system.

17.
The daily operation of wastewater treatment plants (WWTPs) in unitary sewer systems of industrialized areas is of special concern. Severe problems can occur due to the characteristics of the incoming flow. Guidelines and regulations exist to avoid decisions that lead to hazardous situations. However, there is still no gold standard by which to decide a priori whether a WWTP can cope with critical discharges. Strict adherence to regulations may not always be convenient, since special circumstances may motivate operators to accept discharges that are above established thresholds or to reject discharges that comply with guidelines. Nonetheless, such decisions must be well justified. This paper proposes an argumentation-based model with which to formulate a flexible decision-making process. An example of the model's application describes how experts deliberate on the safety of a discharge and adapt each decision to the particular characteristics of the industrial discharge and the WWTP.

18.
The Cray T3D and T3E are non-cache-coherent (NCC) computers with a NUMA structure. They have been shown to exhibit very stable and scalable performance for a variety of application programs. Considerable evidence suggests that they are more stable and scalable than many other shared-memory multiprocessors. However, the principal drawback of these machines is a lack of programmability, caused by the absence of the global cache coherence that is necessary to provide a convenient shared view of memory in hardware. This forces the programmer to keep careful track of where each piece of data is stored, a complication that is unnecessary when a pure shared-memory view is presented to the user. We believe that a remedy for this problem is advanced compiler technology. In this paper, we present our experience with a compiler framework for automatic parallelization and communication generation that has the potential to reduce the time-consuming hand-tuning that would otherwise be necessary to achieve good performance with this type of machine. From our experiments, we learned that our compiler performs well for a variety of applications on the T3D and T3E, and we identified a few sophisticated techniques that could improve performance even more once they are fully implemented in the compiler.

19.
A key objective of industrial advanced process control (APC) projects is to stabilize the process operation. In order to justify the cost associated with the introduction of new APC technologies to a process, the benefits have to be quantified in economic terms. In the past, economic assessment methods have been developed that link the variation of key controlled process variables to economic performance quantities. This paper reviews these methods and incorporates them in a framework for the economic evaluation of APC projects. A web-based survey on the economic assessment of process control has been completed by over 60 industrial APC experts. The results give information about the state-of-the-art assessment of economic benefits of advanced process control.
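A common device in the benefit-assessment literature the survey covers is to value variance reduction: a better controller reduces the standard deviation of a key variable, which allows the set point to move closer to a quality or safety limit, and that shift is priced. The sketch below uses this standard textbook idea with made-up numbers; it is not the paper's specific method.

```python
# Hedged illustration: economic benefit of reduced variation of a key
# controlled variable. With tighter control, the set point can move
# k standard deviations closer to an operating limit, and the shift is
# valued economically. All numbers and the factor k are illustrative.
def benefit_from_variance_reduction(sigma_before, sigma_after, value_per_unit, k=2.0):
    setpoint_shift = k * (sigma_before - sigma_after)  # room gained toward the limit
    return setpoint_shift * value_per_unit

# e.g. standard deviation halved from 0.8 to 0.4, each unit of set-point
# shift worth $50,000 per year in throughput or energy savings
benefit = benefit_from_variance_reduction(0.8, 0.4, 50_000)
print(round(benefit, 2))
```

This is the kind of calculation that links "variation of key controlled process variables to economic performance quantities" in the ex-ante evaluation of an APC project.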

20.
A maximum likelihood framework for determining moving edges
The determination of moving edges in an image sequence is discussed. An approach is proposed that relies on modeling principles and likelihood hypothesis-testing techniques. A spatiotemporal edge in an image sequence is modeled as a surface patch in a 3-D spatiotemporal space. A likelihood ratio test enables its detection as well as simultaneous estimation of its related attributes. It is shown that the computation of this test leads to convolving the image sequence with a set of predetermined masks. The emphasis is on a restricted but widely relevant and useful case of surface patch, namely the planar one. In addition, an implementation of the procedure whose computational cost is merely equivalent to a spatial gradient operator is presented. This method can be of interest for motion-analysis schemes, not only for supplying spatiotemporal segmentation but also for extracting local motion information. Moreover, it can cope with occlusion contours and large displacement magnitudes. Experiments have been carried out with both synthetic and real images.
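The idea of detecting a moving edge by convolving the sequence with predetermined spatiotemporal masks can be sketched on a toy 1-D scanline sequence. The simple two-tap difference masks below are illustrative and far cruder than the paper's planar surface-patch masks.

```python
# Minimal sketch: a moving step edge in a sequence of 1-D scanlines
# responds to both a spatial-difference mask and a temporal-difference
# mask. Data and the [-1, 1] mask are illustrative only.
def diff_conv(signal, mask=(-1, 1)):
    """Correlate a 1-D signal with a 2-tap mask."""
    return [mask[0] * a + mask[1] * b for a, b in zip(signal, signal[1:])]

# A step edge moving one pixel to the right per frame.
seq = [[0, 1, 1, 1, 1],
       [0, 0, 1, 1, 1],
       [0, 0, 0, 1, 1]]

spatial = [diff_conv(frame) for frame in seq]     # spatial response per frame
temporal = [diff_conv(col) for col in zip(*seq)]  # temporal response per pixel

edge_positions = [g.index(max(g)) for g in spatial]
print(edge_positions)  # the spatial response tracks the moving edge: [0, 1, 2]
```

In the paper's framework, a bank of such masks corresponds to candidate planar surface patches in (x, y, t), and the likelihood ratio test compares their responses to detect the edge and estimate its attributes.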


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号