20 similar documents found; search time: 0 ms
1.
Many modern organizations integrate enterprise resource planning (ERP) and supply chain management (SCM) systems, as the two work in a complementary fashion. This integration often raises technical and organizational challenges. Neway, a Chinese organization, recently went through this complex process, which required efficient procurement and management of hardware, software, and human resources for successful completion. The integrated system was found to improve operations, foster a paperless environment, and provide efficient inventory tracking and picking. It also delivered several tangible benefits, including reduced lead time and improved inventory accuracy. The integration of ERP and SCM systems is still a novel concept for Chinese manufacturing organizations. Our case study details the organization's experience, identifies the challenges that were faced, and describes the solutions adopted to overcome them.
2.
This paper presents a quantitative framework for early prediction of resource usage and load in distributed real-time systems
(DRTS). The prediction is based on an analysis of UML 2.0 sequence diagrams, augmented with timing information, to extract
timed-control flow information. It is aimed at improving the early predictability of a DRTS by offering a systematic approach
to predict, at the design phase, system behavior at each time instant during its execution. Since behavioral models such as
sequence diagrams are available in early design phases of the software life cycle, the framework enables resource analysis
at a stage when design decisions are still easy to change. Though we provide a general framework, we use network traffic as
an example resource type to illustrate how the approach is applied. We also indicate how usage and load analysis of other
types of resources (e.g., CPU and memory) can be performed in a similar fashion. A case study illustrates the feasibility
of the approach.
Yvan Labiche
3.
Uliana Corazza, Roberto Filippini, Roberto Setola 《Computer methods and programs in biomedicine》2011,102(3):305-316
Proton therapy is a type of particle therapy that uses a beam of protons to irradiate diseased tissue. The main difference with respect to conventional radiotherapy (X-rays, γ-rays) is its capability to target tumors with extreme precision, which makes it possible to treat deep-seated tumors and tumors affecting sensitive tissues such as the brain and eyes. However, proton therapy requires high-energy cyclotrons, and these demand sophisticated control and supervision schemes to guarantee not only the prescribed performance but also the safety of patients and operators. In this paper we present the modeling and simulation of the irradiation process of the PROSCAN facility at the Paul Scherrer Institut. This is a challenging task because of the complexity of the operation scenario, which consists of deterministic and stochastic processes resulting from the coordination and interaction among diverse entities such as distributed automatic control systems, safety protection systems, and human operators.
4.
Development and integration of a reactive real-time decision support system in the aluminum industry
Benoît Saenz de Ugarte, Adnène Hajji, Robert Pellerin, Abdelhakim Artiba 《Engineering Applications of Artificial Intelligence》2009,22(6):897-905
This paper aims at providing real-time decision support in reaction to disruptive events in manufacturing environments. More precisely, we demonstrate how this approach can support the rescheduling process in enterprise resource planning (ERP)-controlled environments. Our demonstrative example, based on a real scenario in the aluminum industry, illustrates how a genetic algorithm and a real-time discrete event simulation model can be integrated within common enterprise information systems.
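The coupling this abstract describes, a genetic algorithm whose fitness comes from a discrete event simulation, can be sketched in miniature. This is an illustrative toy, not the authors' system: the "simulation" here is only a greedy two-machine makespan evaluation, and all parameters (population size, generations, mutation rate) are arbitrary assumptions.

```python
import random

def simulate_makespan(sequence, durations, machines=2):
    # Tiny discrete-event stand-in: assign each job, in the given
    # sequence, to the machine that frees up first.
    free = [0.0] * machines
    for job in sequence:
        m = free.index(min(free))
        free[m] += durations[job]
    return max(free)

def ga_reschedule(durations, pop_size=30, generations=60, seed=1):
    # Toy GA over job permutations, scored by the simulated makespan.
    rng = random.Random(seed)
    jobs = list(range(len(durations)))
    pop = [rng.sample(jobs, len(jobs)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda s: simulate_makespan(s, durations))
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, len(jobs))
            # Order crossover: keep a prefix of one parent, fill the
            # rest in the other parent's order.
            child = a[:cut] + [j for j in b if j not in a[:cut]]
            if rng.random() < 0.2:  # swap mutation
                i, k = rng.sample(range(len(jobs)), 2)
                child[i], child[k] = child[k], child[i]
            children.append(child)
        pop = survivors + children
    best = min(pop, key=lambda s: simulate_makespan(s, durations))
    return best, simulate_makespan(best, durations)
```

In a real rescheduling loop, `simulate_makespan` would be replaced by a call to the plant's discrete event simulation model, re-seeded with the current shop-floor state after each disruptive event.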
5.
Binoy Ravindran Peng Li Tamir Hegazy 《Journal of Parallel and Distributed Computing》2003,63(12):1219-1242
We present two proactive resource allocation algorithms, RBA*-FT and OBA-FT, for fault-tolerant asynchronous real-time distributed systems. The algorithms consider an application model where task timeliness is specified by Jensen's benefit functions and the anticipated application workload during future time intervals is described by adaptation functions. In addition, we assume that reliability functions of processors are available a priori. Given these models, our objective is to maximize aggregate task benefit and minimize aggregate missed deadline ratio in the presence of processor failures. Since determining the optimal solution is computationally intractable, the algorithms heuristically compute sub-optimal resource allocations, but in polynomial time. Experimental results reveal that RBA*-FT and OBA-FT outperform their non-fault-tolerant counterparts in the presence of processor failures. Furthermore, RBA*-FT performs better than OBA-FT, although OBA-FT incurs better worst-case and amortized computational costs. Finally, we observe that both algorithms robustly withstand errors in the estimation of anticipated failures.
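The flavor of such polynomial-time heuristics can be illustrated with a toy greedy allocator. This is not RBA*-FT or OBA-FT, only a hedged sketch of the underlying idea: rank tasks by benefit density and assign each to the feasible processor that maximizes benefit weighted by processor reliability. All names and numbers below are invented for illustration.

```python
def greedy_allocate(tasks, processors):
    """Greedy sketch: assign each task to the processor maximizing
    expected benefit = benefit * reliability, subject to capacity.
    `tasks` is a list of (name, benefit, load); `processors` is a
    list of (name, reliability, capacity)."""
    load = {p[0]: 0.0 for p in processors}
    assignment = {}
    # Consider tasks in decreasing benefit density (benefit per unit load).
    for name, benefit, need in sorted(tasks, key=lambda t: t[1] / t[2],
                                      reverse=True):
        best, best_val = None, -1.0
        for pname, rel, cap in processors:
            if load[pname] + need <= cap and benefit * rel > best_val:
                best, best_val = pname, benefit * rel
        if best is not None:  # drop the task if nothing fits
            assignment[name] = best
            load[best] += need
    return assignment
```

The real algorithms additionally account for time-varying workload (adaptation functions) and processor failure probabilities over future intervals; the sketch keeps only the greedy, benefit-maximizing skeleton.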
6.
Software performance is an important non-functional quality attribute, and software performance evaluation is an essential activity in the software development process. Especially in embedded real-time systems, software design and evaluation are driven by the need to optimize limited resources, to respect time deadlines and, at the same time, to produce the best experience for end-users. Software product family architectures add additional requirements to the evaluation process: in this case, the evaluation includes the analysis of optimizations and tradeoffs across all products in the family. Performance evaluation of software product family architectures requires knowledge and a clear understanding of several domains: software architecture assessment, software performance, and software product family architecture. We have used a scenario-driven approach to evaluate performance and dynamic memory management efficiency in one Nokia software product family architecture. In this paper we present two case studies. Furthermore, we discuss the implications and tradeoffs of software performance against evolvability and maintainability in software product family architectures.
7.
The integration of Management Information Bases (MIBs) is one of the challenges of integrated network management. It is made more difficult by the existence of many different ways of structuring the MIB and defining managed objects. This paper addresses the issue of integrating GDMO-based MIBs created on the basis of different Management Information Models (MIMs). Three MIMs (NMF Library Release 1.1, ITU-T M.3100 and ETSI GOM) are analyzed with the help of a simple network configuration, and some MIM comparison criteria are proposed. The criteria can be used to assess the difficulty of integrating MIBs based on those models.
8.
A process computer was installed in a large integrated nylon plant in 1976. This dedicated chilled water management system was designed to optimize the operation of chillers and to reduce their energy costs. The computer system was also configured to provide maximum visibility of the operating parameters, machine efficiencies, instrumentation integrity checks, historical data, and energy usage accounting. This paper describes details of the control and optimization strategies, including the overall benefits of the system based on five years of successful operating experience.
9.
Knowledge hoarding and user acceptance of online discussion board systems in eLearning: A case study
This paper aims to reveal the determinants of the effectiveness of online discussion board systems (ODBSs) in eLearning environments for fostering interactions among learners and/or instructors. A case in which an ODBS failed to foster interactions among learners and instructors for knowledge sharing is introduced, and hypotheses to explain the failure are developed based on a thorough literature review on the technology acceptance model (TAM) and knowledge hoarding. The hypotheses are tested via statistical analysis of data collected from a questionnaire survey of the students who were actually involved in the case. The results show that the students' low perceived usefulness of the ODBS played a major role in the failure of the system. The findings also hint that network externalities, as an intrinsic motivator, are more effective than extrinsic motivators at increasing students' activity on the ODBS. Finally, the paper provides designers of eLearning systems with advice for the successful operation of an ODBS in eLearning.
10.
Hajime Nagahama 《AI & Society》1998,12(4):251-263
Japan's educational system has some major problems. The most important among these concerns is the basic concept of the educational process and the goal of education. The old concept of public educational systems has become outdated in today's Japanese society, although this concept had supported social and spiritual faith, economic success and selfless devotion to one's country for more than 100 years. Now, Japanese people need a new concept of the educational process and the goal of education for the twenty-first century. The paper proposes a value chain of educational and learning systems aimed at building a network consisting of multiple fields for fostering future human resources.
11.
12.
The Cognitive Work Analysis Design Toolkit (CWA-DT) is a recently developed approach that provides guidance and tools for applying the outputs of CWA to design processes that incorporate the values and principles of sociotechnical systems theory. In this paper, the CWA-DT is evaluated based on an application to improve safety at rail level crossings (RLXs). The evaluation considered the extent to which the CWA-DT met pre-defined methodological criteria and aligned with sociotechnical values and principles. Both process and outcome measures were taken, based on the ratings of workshop participants and human factors experts. Overall, workshop participants were positive about the process and indicated that it met the methodological criteria and sociotechnical values. However, expert ratings suggested that the CWA-DT achieved only limited success in producing RLX designs that fully aligned with the sociotechnical approach. A discussion of the appropriateness of the sociotechnical approach in a public safety context is provided.
Practitioner Summary: Human factors and ergonomics practitioners need evidence of the effectiveness of methods. A design toolkit for cognitive work analysis, incorporating values and principles from sociotechnical systems theory, was applied to create innovative designs for rail level crossings. Evaluation results based on the application are provided and discussed.
13.
Alan W. Brown David J. Carney Alan M. Christie Ed J. Morris Paul E. Zarrella W. Michael Caldwell 《Journal of Systems Integration》1994,4(3):195-218
In order to investigate the capabilities of various types of integration infrastructure, the CASE Environments project at the Software Engineering Institute has conducted a series of studies integrating a variety of tools using framework technologies. This paper discusses one of these studies, in which a Software Engineering Environment was first modeled using a number of process notations and then constructed using control- and data-oriented frameworks. Public domain, commercial, and custom tools were integrated in support of the defined process scenario.
14.
Many school systems, in both the developed and developing world, are implementing educational technology to assist in student learning. However, there is no clear consensus on how to evaluate these new technologies. This paper proposes a comprehensive methodology for estimating the value of a new educational technology in three steps: benefit analysis, through the administration of a well-designed experiment; cost analysis, which incorporates costs to weigh against the benefits; and feasibility analysis, which introduces real-world concerns that may affect the ability to actually implement the technology. To illustrate the methodology, a case study from Chile is used where portable educational video games were introduced into first and second grade classrooms with the aim of improving learning in mathematics and language. This paper demonstrates the importance of all three steps in the evaluation process and provides a framework for future analyses.
15.
The recent advances in sensor and communication technologies can provide the foundations for linking the physical world of manufacturing facilities and machines to the cyber world of Internet applications. The coupled manufacturing cyber-physical system is envisioned to handle the actual operations in the physical world while simultaneously monitoring them in the cyber world, with the help of advanced data processing and simulation models at both the manufacturing process and system operational levels. Moreover, a sensor-packed manufacturing system in which each process or piece of equipment makes event and status information available, coupled with market research, seems to provide the right ingredients for advanced Big Data analytics, event response selection, and operation virtualization. As a drawback, the resulting manufacturing cyber-physical system will be vulnerable to the cyber-attacks that are, unfortunately, so common for software and Internet-based systems. This reality makes the penetration of cybersecurity into the manufacturing domain a need that goes uncontested among researchers and practitioners. This work provides a review of the current status of virtualization and cloud-based services for manufacturing systems, and of the use of Big Data analytics for planning and control of manufacturing operations. Building on already developed cloud business solutions, cloud manufacturing is expected to offer improved enterprise manufacturing and business decision support. Based on the current state-of-the-art cloud manufacturing solutions and Big Data applications, this work also proposes a framework for the development of predictive manufacturing cyber-physical systems that include capabilities for attaching to the Internet of Things, for complex event processing, and for Big Data algorithmic analytics.
16.
Packages are important high-level organizational units for large object-oriented systems. Package-level metrics characterize attributes of packages such as size, complexity, and coupling. There is a need for empirical evidence to support collecting these metrics and using them as early indicators of important external software quality attributes. In this paper, three suites of package-level metrics (Martin, MOOD, and CK) are evaluated and compared empirically in predicting the number of pre-release faults and the number of post-release faults in packages. Eclipse, one of the largest open source systems, is used as a case study. The results indicate that prediction models based on the Martin suite are more accurate than those based on the MOOD and CK suites across releases of Eclipse.
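The kind of metric-based fault prediction evaluated in this study can be sketched with a single-predictor least-squares model. The data below are invented for illustration (they are not Eclipse measurements), and a real study would use multivariate models and proper cross-release validation.

```python
def fit_linear(xs, ys):
    # Ordinary least squares for y = a + b*x (single predictor),
    # the simplest form of a metric-based fault prediction model.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Hypothetical data: efferent coupling (Ce, one of the Martin metrics)
# per package versus post-release fault counts.
ce =     [2, 5, 8, 12, 15, 20]
faults = [0, 1, 2,  4,  5,  7]
a, b = fit_linear(ce, faults)
predicted = a + b * 10  # expected faults for a package with Ce = 10
```

A positive slope `b` is what "coupling predicts faults" means operationally; comparing such models fitted on the Martin, MOOD, and CK suites is the kind of evaluation the paper performs at much larger scale.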
17.
18.
A contribution to the development of strategic control and planning instruments: An acquisition case study (Total citations: 1; self-citations: 0; citations by others: 1)
A. Chevalier P. L. Kunsch J. P. Brans 《International Transactions in Operational Research》2004,11(2):155-168
The present paper is part of the efforts made by the authors in recent years to develop strategic control and planning instruments in corporations using OR techniques such as system dynamics, control theory, and group multicriteria decision aid. A more general framework called 'adaptive control methodology' (ACM) combines all these techniques and has been presented in several papers. The objective of the present analysis is to calibrate this instrument and to tune it to corporate needs by analysing real-world applications. More specifically, several case studies have been investigated in large multinational organisations in the food sector. An acquisition case has been used for the calibration; it is analysed in the paper from the ACM perspective to provide additional material for revisiting and improving the methodology.
19.
Danielle Lottridge Mark Chignell Sharon E. Straus 《International Journal of Industrial Ergonomics》2011,41(3):208-218
Requirements analysis defines the goals and evaluation criteria of system design. We introduce a methodology for requirements analysis for customization based on large sample interactive testing, with the premise that analysis of user behaviour with prototypes leads to requirements discoveries. The methodology uses a relatively large sample (1) to identify relevant user subgroups, (2) to observe significant empirically determined group differences in the context of task and tool use and (3) to estimate the groups’ different requirements and derive design implications. Between 20 and 50 participants are used per test, rather than the three to five often recommended for user testing. Statistical relationships are investigated between subgroups in terms of background variables, questionnaire items, performance data, and coded verbal statements. Customization requirements are inferred from the significant differences observed between empirically determined groups. The methodological framework is illustrated in a case study involving the use of clinical resources on handheld devices by three groups of physicians. The groups were found to have different needs and preferences for evidence-based resources and device form factor, implying opportunities and necessities for group customization requirements.
Relevance to industry
In safety-critical domains such as health care, it is essential to assess user needs and preferences regarding devices and systems to inform appropriate customizations. We present a methodological framework and case study that demonstrate how large sample user testing can supplement typical methods of requirements analysis to provide contextualized, quantitative accounts of group differences and customization requirements.
20.
This paper proposes an approach to modular modelling and simulation of complex time-critical systems. The modelling language is represented by Merlin and Farber's Time Petri Nets (TPNs) augmented with inhibitor arcs and modular constructs borrowed from the Petri Net Markup Language (PNML) interchange format. Analysis techniques depend on Temporal Uncertainty Time Warp (TUTW), a time warp algorithm capable of exploiting temporal uncertainty in general optimistic simulations over a networked context. A key feature of the approach is that TPN models naturally exhibit a certain degree of temporal uncertainty, which the TUTW control engine can exploit to achieve good speedup without a loss in the accuracy of the simulation results. The developed TUTW/TPN kernel is demonstrated by modelling and simulating a real-time system example. A preliminary version of this paper was presented at the 38th SCS Annual Simulation Symposium, April 4–6, 2005, San Diego (CA), IEEE Computer Society, pp. 233–240.
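The basic token-game semantics that this TPN formalism extends can be sketched as follows. This is an illustrative fragment, not the TUTW/TPN kernel: it omits the firing intervals [eft, lft] that make the nets "time" Petri nets, and all place/transition names are invented.

```python
def enabled(marking, inputs, inhibitors):
    """A transition is enabled when every input place holds enough
    tokens and every inhibitor place is below its inhibitor-arc
    threshold (an inhibitor arc blocks firing when tokens are present)."""
    return all(marking.get(p, 0) >= w for p, w in inputs.items()) and \
           all(marking.get(p, 0) < w for p, w in inhibitors.items())

def fire(marking, inputs, outputs):
    # Consume input tokens, produce output tokens. In a TPN the
    # firing instant would additionally be chosen within [eft, lft].
    m = dict(marking)
    for p, w in inputs.items():
        m[p] = m[p] - w
    for p, w in outputs.items():
        m[p] = m.get(p, 0) + w
    return m
```

For example, with marking `{"p1": 1, "p2": 0}`, a transition with input arc `{"p1": 1}` and inhibitor arc `{"p2": 1}` is enabled, and becomes disabled as soon as `p2` receives a token.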
Franco Cicirelli received a PhD in computer science from the University of Calabria (Unical), DEIS (Department of Electronics, Informatics and Systems Science). As a postdoc, his research covers agent and service paradigms for the development of distributed systems, parallel simulation, Petri nets, and distributed measurement systems. He is a member of ACM.
Angelo Furfaro, PhD, is an assistant professor of computer science at Unical, DEIS, where he teaches object-oriented programming. His research interests are centred on multi-agent systems, modeling and analysis of time-dependent systems, Petri nets, parallel simulation, verification of real-time systems, and distributed measurement systems. He is a member of ACM.
Libero Nigro is a full professor of computer science at Unical, DEIS, where he teaches courses on object-oriented programming, software engineering, and real-time systems. He directs the Software Engineering Laboratory (www.lis.deis.unical.it). His current research interests include software engineering of time-dependent and distributed systems, real-time systems, Petri nets, modeling and parallel simulation of complex systems, and distributed measurement systems. Prof. Nigro is a member of ACM and IEEE.