Similar Articles
20 similar articles found.
1.
Business process management plays an important role in the management of organizations. More and more organizations describe their operations as business processes. It is common for organizations to have collections of thousands of business processes, but for reasons of confidentiality these collections are often not, or only partially, available to researchers. On the other hand, research on techniques for managing process model collections, such as techniques for process retrieval, requires large collections for evaluation purposes. Therefore, this paper proposes a technique to generate such collections of process models, based on the properties of real-world collections. Where existing techniques focus on the structure of the process models, the technique proposed in this paper also generates task labels that consist of words from real-life task labels, and considers semantic information of node and edge types. We evaluate our technique by applying it to generate two synthetic collections of process models of over 60,000 and over 2,000 models, respectively. We show that the generated synthetic collections have properties similar to those of the original collections. To the best of our knowledge, this is the first technique that can generate synthetic BPMN models, thus enabling experimentation with process collections that have laboratory-set quantitative parameters and qualitative properties based on real-world process model collections.
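The label-generation idea can be sketched as follows — a minimal, hypothetical illustration in which word pools stand in for vocabulary mined from real-life task labels (the pools and the verb-object label pattern are assumptions, not the paper's actual algorithm):

```python
import random

# Hypothetical word pools, as if mined from the task labels of a real collection.
VERBS = ["check", "approve", "send", "register", "archive"]
OBJECTS = ["invoice", "order", "claim", "request", "report"]

def generate_task_label(rng):
    """Compose a verb-object label from words observed in real-life labels."""
    return f"{rng.choice(VERBS)} {rng.choice(OBJECTS)}"

def generate_collection(n_models, tasks_per_model, seed=0):
    """Generate n_models synthetic models, each reduced here to a list of task labels."""
    rng = random.Random(seed)
    return [[generate_task_label(rng) for _ in range(tasks_per_model)]
            for _ in range(n_models)]

collection = generate_collection(3, 4)
```

Seeding the generator makes the synthetic collection reproducible, which matters when the collection serves as a shared evaluation benchmark.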

2.
Modern large new product developments (NPD) are typically characterized by many uncertainties and frequent changes. Often the embedded software development projects working on such products face many problems compared to traditional, placid project environments. One of the major project management decisions is then the selection of the project's software process model. An appropriate process model helps in coping with the challenges and prevents many potential project problems. On the other hand, an unsuitable process choice causes additional problems. This paper investigates software process model selection in the context of large market-driven embedded software product development for new telecommunications equipment. Based on a quasi-formal comparison of publicly known software process models, including modern agile methodologies, we propose a process model selection frame, which the project manager can use as a systematic guide for (re)choosing the project's process model. A novel feature of this comparative selection model is that we make the comparison against typical software project problem issues. Some real-life project case examples are examined against this model. The selection matrix expresses how different process models answer different questions, and indeed there is not a single process model that would answer all the questions. On the contrary, some of the seeds of the project problems are in the process models themselves. However, being conscious of these problems and pitfalls when steering a project enables the project manager to master the situation.

3.
Thin shells are crucially dependent on their shape in order to obtain proper structural performance. In this context, the optimal shape will guarantee performance and safety requirements, while minimizing the use of materials, as well as construction/maintenance costs. Thin shell design is a team-based, multidisciplinary, and iterative process, which requires a high level of interaction between the various parties involved, especially between the Architecture and Engineering teams. As a result of technological development, novel concepts and tools become available to support this process. On the one hand, concepts like Integrated Project Delivery (IPD) show the potential to have a high impact on multidisciplinary environments such as the one in question, supporting the early decision-making process with the availability of as much information as possible. On the other hand, optimization techniques and tools should be highlighted, as they fit the needs and requirements of both the shell shape definition process and the IPD concept. These can be used not only to support advanced design stages, but also to facilitate the initial formulation of shape during the early interactions between architect and structural engineer from an IPD point of view. This paper proposes a methodology aimed at enhancing the interactive and iterative process associated with the early stages of thin shell design, supported by an integrated framework. The latter is based on several tools, namely Rhinoceros 3D, Grasshopper, and Robot Structural Analysis. In order to achieve full integration of the support tools, a custom-devised module was developed to allow interoperability between Grasshopper and Robot Structural Analysis. The system resorts to various technologies targeted at improving the shell shape definition process, such as form-finding techniques, parametric and generative models, as well as shape optimization techniques that leverage multi-criteria evolutionary algorithms.
The proposed framework is applied in a set of fictitious scenarios, in which the best thin reinforced concrete shell structures are sought according to given design requirements. Results stemming from this implementation emphasize its interoperability, flexibility, and capability to promote interaction between the elements of the design team, ultimately outputting a set of diverse and creative shell shapes, and thus supporting the pre-design process.

4.
Organizations that adopt process modeling often maintain several co-existing models of the same business process. These models target different abstraction levels and stakeholder perspectives. Maintaining consistency among these models has become a major challenge for such organizations. Although several academic works have discussed this challenge, little empirical investigation exists on how people perform process model consistency management in practice. This paper aims to address this gap by presenting an in-depth empirical study of a business-driven engineering process deployed at a large company in the banking sector. We analyzed more than 70 business process models developed by the company, including their change history, with over 1,000 change requests. We also interviewed 9 business and IT practitioners and surveyed 23 such practitioners to understand concrete difficulties in consistency management, the rationales for the specification-to-implementation refinements found in the models, strategies that the practitioners use to detect and fix inconsistencies, and how tools could help with these tasks. Our contribution is a set of eight empirical findings, some of which confirm or contradict previous works on process model consistency management found in the literature. The findings provide empirical evidence of (1) how business process models are created and maintained, including a set of recurrent patterns used to refine business-level process specifications into IT-level models; (2) what types of inconsistencies occur, how they are introduced, and what problems they cause; and (3) what stakeholders expect from tools to support consistency management.

5.
6.
Many companies have adopted Process-aware Information Systems (PAIS) to support their business processes in some form. On the one hand, these systems typically log events (e.g., in transaction logs or audit trails) related to the actual business process executions. On the other hand, explicit process models describing how the business process should (or is expected to) be executed are frequently available. Together with the data recorded in the log, this situation raises the interesting question “Do the model and the log conform to each other?”. Conformance checking, also referred to as conformance analysis, aims at the detection of inconsistencies between a process model and its corresponding execution log, and at their quantification through metrics. This paper proposes an incremental approach to check the conformance of a process model and an event log. First, the fitness between the log and the model is measured (i.e., “Does the observed process comply with the control flow specified by the process model?”). Second, the appropriateness of the model can be analyzed with respect to the log (i.e., “Does the model describe the observed process in a suitable way?”). Appropriateness can be evaluated from both a structural and a behavioral perspective. To operationalize the ideas presented in this paper, a Conformance Checker has been implemented within the ProM framework, and it has been evaluated using artificial and real-life event logs.
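A drastically simplified sketch of the fitness idea (not the paper's actual Petri-net token replay): the model is reduced to a directly-follows relation, and fitness is the fraction of observed steps the model allows. The toy model and log below are hypothetical.

```python
# Hypothetical model as a successor relation: which activity may directly
# follow which; "start" and "end" are artificial boundary activities.
MODEL = {
    "start": {"A"},
    "A": {"B", "C"},
    "B": {"end"},
    "C": {"end"},
}

def trace_fitness(trace, model):
    """Fraction of the trace's directly-follows steps allowed by the model."""
    steps = list(zip(["start"] + trace, trace + ["end"]))
    ok = sum(1 for a, b in steps if b in model.get(a, set()))
    return ok / len(steps)

def log_fitness(log, model):
    """Average fitness over all traces in the event log."""
    return sum(trace_fitness(t, model) for t in log) / len(log)

log = [["A", "B"], ["A", "C"], ["A", "D"]]   # the third trace deviates
```

A fully conforming trace scores 1.0; the deviating trace `["A", "D"]` drags the log-level fitness below 1, quantifying the mismatch.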

7.
The increased adoption of business process management approaches, tools, and practices has led organizations to accumulate large collections of business process models. These collections can easily comprise from a hundred to a thousand models, especially in the context of multinational corporations or as a result of organizational mergers and acquisitions. A concrete problem is thus how to maintain these large repositories in such a way that their complexity does not hamper their practical usefulness as a means to describe and communicate business operations. This paper proposes a technique to automatically infer suitable names for business process models and fragments thereof. This technique is useful for model abstraction scenarios, for instance when user-specific views of a repository are required, or as part of a refactoring initiative aimed at reducing the repository's complexity. The technique is grounded in an adaptation of the theory of meaning to the realm of business process models. We implemented the technique in a prototype tool and conducted an extensive evaluation using three process model collections from practice and a case study involving process modelers with different experience.
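A crude stand-in for the meaning-based naming technique described above (the real method is grounded in the theory of meaning; word frequency over activity labels is an assumption used purely for illustration):

```python
from collections import Counter

STOPWORDS = {"the", "a", "an", "to", "of"}

def propose_name(fragment_labels, n_words=2):
    """Derive a fragment name from the most frequent meaningful words
    in its activity labels."""
    words = [w for label in fragment_labels
             for w in label.lower().split() if w not in STOPWORDS]
    common = [w for w, _ in Counter(words).most_common(n_words)]
    return " ".join(common).title()

labels = ["check invoice", "check invoice amount", "check claim"]
name = propose_name(labels)
```

On this hypothetical fragment the dominant words "check" and "invoice" yield the candidate name "Check Invoice".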

8.
Workflow management systems are becoming a relevant support for a large class of business applications, and many workflow models as well as commercial products are currently available. While the large availability of tools facilitates the development and the fulfilment of customer requirements, workflow application development still requires methodological guidelines that drive the developers in the complex task of rapidly producing effective applications. In fact, it is necessary to identify and model the business processes, to design the interfaces towards existing cooperating systems, and to manage implementation aspects in an integrated way. This paper presents the WIRES methodology for developing workflow applications under a uniform modelling paradigm (UML modelling tools with some extensions) that covers the entire life cycle of these applications: from conceptual analysis to implementation. High-level analysis is performed under different perspectives, including a business and an organisational perspective. Distribution, interoperability and cooperation with external information systems are considered in this early stage. A set of “workflowability” criteria is provided in order to identify which candidate processes are suited to be implemented as workflows. Non-functional requirements receive particular emphasis, in that they are among the most important criteria for deciding whether workflow technology can actually be useful for implementing the business process at hand. The design phase tackles aspects of concurrency and cooperation, distributed transactions and exception handling. Reuse of component workflows, available in a repository as workflow fragments, is a distinguishing feature of the method.
Implementation aspects are presented in terms of rules that guide the selection of a commercial workflow management system suitable for supporting the designed processes, coupled with guidelines for mapping the designed workflows onto the model offered by the selected system.

9.
This paper investigates the use of 64-bit ARM cores to improve the processing efficiency of upcoming HPC systems. It describes a set of available tools, models and platforms, and their combination into an efficient methodology for the design space exploration of large manycore computing clusters. Experiments on representative benchmarks allow us to define an exploration approach for evaluating essential design options at the micro-architectural level while scaling to a large number of cores. We then apply this methodology to examine the validity of SoC partitioning as an alternative to using large SoC designs, based on coherent multi-SoC models and the proposed SoC Coherent Interconnect (SCI).

10.
This paper investigates some reasoning issues involved in developing an integrated modeling environment called a model management system. A model management system is a computer-based environment designed to support effective development and utilization of quantitative decision models. Since the construction of decision models is a knowledge-intensive process, reasoning plays a critical role. Reasoning is particularly important when automated model integration is needed in a large-scale system. In this case, heuristics are required to reduce the complexity of the process. This paper examines the planning and automated formulation of complex models from smaller building blocks. First, a hierarchy of abstractions that integrates previous contributions in model management is presented. Then, a modeling process is formulated as a planning process by which a set of operators are scheduled to achieve a specific goal. This process involves searches for alternatives that can eliminate the difference between the initial state and the goal state. Various reasoning strategies and heuristic evaluation functions that can be used to improve the efficiency of developing a master plan for model integration are discussed.
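The planning formulation can be illustrated with a minimal state-space search: each operator is a building-block model that, given its required inputs, produces a new quantity, and the planner searches for a sequence that bridges the initial and goal states. The operator names and quantities below are hypothetical, and breadth-first search stands in for the heuristic strategies the paper discusses.

```python
from collections import deque

# Hypothetical building-block models: required inputs -> produced output.
OPERATORS = {
    "demand_model":  ({"price"}, "demand"),
    "cost_model":    ({"volume"}, "cost"),
    "profit_model":  ({"demand", "cost"}, "profit"),
}

def plan(initial, goal):
    """Breadth-first search for an operator sequence turning the initial
    state (known quantities) into one that contains the goal quantity."""
    frontier = deque([(frozenset(initial), [])])
    seen = {frozenset(initial)}
    while frontier:
        state, steps = frontier.popleft()
        if goal in state:
            return steps
        for name, (needs, out) in OPERATORS.items():
            if needs <= state and out not in state:
                nxt = state | {out}
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, steps + [name]))
    return None   # no integration plan exists
```

When the initial state lacks an input that no operator can supply, the search exhausts and reports that no master plan exists.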

11.
Research Progress and Prospects of Remote Sensing Estimation of Crop Chlorophyll Content
Chlorophyll is one of the important biochemical parameters in crop growth; its content is significant for crop growth monitoring, pest and disease monitoring, and maturity prediction. This paper introduces the basic principles and methods of existing remote sensing models for monitoring crop chlorophyll content, summarizes the main research achievements in this field at home and abroad, classifies the models, and discusses empirical, physical, and coupled models in detail, analyzing the advantages, disadvantages, and applicable scope of each. Based on the problems in current research on remote sensing estimation of crop chlorophyll content, the paper offers an outlook on future development trends for estimation models.

12.
Knowledge is an inherently dynamic entity, continuously changing and evolving. In many cases, the coexistence of different versions of the same core knowledge is a necessity. So is the availability of the proper environment and tools to deal with knowledge versioning. In this paper, a framework for knowledge versioning management is proposed and implemented for hybrid knowledge representation models using frames and rules. This framework facilitates knowledge version handling and maintenance, improving, in parallel, knowledge sharing and reuse. Knowledge components are stored in a set of tables and handled as data under the auspices of a database management system. The proper structure of tables and their relationships allows the creation of independent knowledge modules. Several knowledge modules can be assembled to construct higher-level modules, which finally form versions of knowledge. Corresponding knowledge base versions consist of several knowledge modules that are easy to handle and process in various application areas. The proposed framework has been implemented and thoroughly examined in pest management, an application area of great importance.

13.
Context: Although SPEM 2.0 has great potential for software process modeling, it does not provide concepts or formalisms for precise modeling of process behavior. Indeed, SPEM fails to address process simulation, execution, monitoring and analysis, which are important activities in process management. On the other hand, BPMN 2.0 is a widely used notation to model business processes that has associated tools and techniques to facilitate the aforementioned process management activities. Using BPMN to model software development processes can leverage BPMN’s infrastructure to improve the quality of these processes. However, BPMN lacks an important feature to model software processes: a mechanism to represent process tailoring.
Objective: This paper proposes BPMNt, a conservative extension to BPMN that aims at creating a tailoring representation mechanism similar to the one found in SPEM 2.0.
Method: We have used the BPMN 2.0 extensibility mechanism to include the representation of specific tailoring relationships, namely suppression, local contribution, and local replacement, which establish links between process elements (as in the case of SPEM). Moreover, this paper also presents some rules to ensure the consistency of BPMN models when using tailoring relationships.
Results: In order to evaluate our proposal, we have implemented a tool to support the BPMNt approach and have applied it to represent real process adaptations in the context of an academic management system development project. Results of this study showed that the approach and its support tool can successfully be used to adapt BPMN-based software processes in real scenarios.
Conclusion: We have proposed an approach to enable reuse and adaptation of BPMN-based software process models, as well as derivation traceability between models, through tailoring relationships. We believe that bringing such capabilities into BPMN will open new perspectives to software process management.
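Two of the tailoring relationships named in the abstract can be sketched on a dict-based stand-in for a BPMN model; the element ids, task names, and the (tailored-model, trace-link) return shape are all hypothetical, not the BPMNt metamodel.

```python
def suppress(elements, element_id):
    """Tailored copy with the element removed, plus a derivation trace link."""
    assert element_id in elements, "cannot suppress a missing element"
    tailored = dict(elements)
    del tailored[element_id]
    return tailored, ("suppression", element_id)

def replace_local(elements, element_id, new_task):
    """Tailored copy with the element swapped out, plus a trace link."""
    tailored = dict(elements)
    tailored[element_id] = new_task
    return tailored, ("local-replacement", element_id)

# Hypothetical base process: element id -> task label.
BASE = {"t1": "check invoice", "t2": "approve invoice", "t3": "archive invoice"}
adapted, rel = suppress(BASE, "t3")
```

Keeping the base model immutable and recording each tailoring relationship is what enables the derivation traceability between models that the paper aims for.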

14.
This paper presents a modeling approach for the development of software for electronic control units in the automotive domain. The approach supports the development of two related architecture models in the overall development process: the logical architecture provides a graphical, quite abstract representation of a typically large set of automotive functions. On this abstraction level no design decisions are taken. The technical architecture provides a software and a hardware representation in separated views: the software architecture describes the software realization of functions as software components, whereas the hardware architecture models hardware entities, on which the software components are deployed. Logical as well as technical architectures only model structural information, not behavioural information. A tight integration of both architecture levels (on the conceptual and on the tool level) with related development phases such as requirements engineering, behaviour modeling, code generation, and version and configuration management is presented, resulting in a seamless overall development process. This architecture modeling approach has been developed within a safety-relevant project at BMW Group. Positive as well as negative experiences with the application of this approach are described.

15.
The evaluation of corporate financial distress has attracted significant global attention as a result of the increasing number of worldwide corporate failures. There is an immediate and compelling need for more effective financial distress prediction models. This paper presents a novel method to predict bankruptcy. The proposed method combines partial least squares (PLS) based feature selection with a support vector machine (SVM) for information fusion. PLS can successfully identify the complex nonlinearity and correlations among the financial indicators. The experimental results demonstrate its superior predictive ability. On the one hand, the proposed model can select the most relevant financial indicators to predict bankruptcy and at the same time identify the role of each variable in the prediction process. On the other hand, the proposed model's high levels of prediction accuracy can translate into benefits for financial organizations through activities such as credit approval and the management of loan portfolios and securities.
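The PLS-based selection step can be hinted at with a pure-Python sketch: for a single PLS component, the feature weights reduce to the covariance between each centered indicator and the class label, and ranking indicators by weight magnitude gives a minimal stand-in for the paper's feature selection (the data and the one-component simplification are assumptions; the actual method couples full PLS with an SVM).

```python
def pls_weights(X, y):
    """First-component PLS weights: covariance of each centered column of X with y."""
    n = len(y)
    means = [sum(col) / n for col in zip(*X)]
    ybar = sum(y) / n
    return [sum((row[j] - means[j]) * (yi - ybar)
                for row, yi in zip(X, y))
            for j in range(len(means))]

def select_top(X, y, k):
    """Indices of the k indicators with the largest absolute weight."""
    w = pls_weights(X, y)
    order = sorted(range(len(w)), key=lambda j: abs(w[j]), reverse=True)
    return order[:k]

# Hypothetical data: indicator 0 tracks the label, indicator 1 is
# anti-correlated with it, indicator 2 is noise.
X = [[1.0, 5.0, 0.3], [2.0, 4.0, 0.1], [3.0, 3.0, 0.2], [4.0, 2.0, 0.4]]
y = [0, 0, 1, 1]
```

The signed weights also expose the role of each variable, mirroring the interpretability benefit the abstract claims for PLS.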

16.
This paper is devoted to the establishment of reasonable levels of critical power equipment inventory using available historical data and strict legal requirements for reliable electrical power supply. Several models, including an Extreme Value Theory model and a Homogeneous Poisson model, are used and their results compared to provide a probabilistic formulation of the suggested inventory. Several procedures are provided to establish proper inventory levels of critical power equipment. Simplifications are suggested for automating the calculation process for large samples. The presented approaches have a wide scope of industrial applicability and can be applied to different types of critical equipment. Mathematical background is provided to assist practitioners who do not have extensive knowledge of probabilistic tools such as Extreme Value Theory or Homogeneous Poisson Process models.
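Under the homogeneous Poisson model, one natural probabilistic formulation of the inventory level is the smallest stock covering all failures over a planning horizon with a target service probability. The sketch below illustrates this; the failure rate, horizon, and service level are hypothetical inputs, not values from the paper.

```python
import math

def poisson_cdf(k, lam):
    """P(N <= k) for a Poisson(lam) failure count."""
    return sum(math.exp(-lam) * lam**i / math.factorial(i)
               for i in range(k + 1))

def required_inventory(failure_rate, horizon_years, service_level):
    """Smallest stock s such that the probability of covering all failures
    over the horizon is at least the target service level."""
    lam = failure_rate * horizon_years
    s = 0
    while poisson_cdf(s, lam) < service_level:
        s += 1
    return s
```

For example, with an expected two failures per year and a one-year horizon, raising the service level from 50% to 95% more than doubles the required stock, which is the kind of trade-off the probabilistic formulation makes explicit.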

17.
Congestion management for the Transmission Control Protocol (TCP) is of utmost importance to prevent packet loss within a network. This necessitates strategies for active queue management. The most widely applied active queue management strategies have inherent disadvantages which lead to suboptimal performance and even instability in the case of large round trip time and/or external disturbance. This paper presents an internal model control robust queue management scheme with two degrees of freedom in order to restrict the undesired effects of large and small round trip times and parameter variations on queue management. Conventional approaches such as proportional integral and random early detection procedures lead to unstable behaviour due to large delay. Moreover, the internal model control–Smith scheme suffers from large oscillations due to the large round trip time. On the other hand, other schemes such as internal model control–proportional integral and derivative show excessively sluggish performance for small round trip time values. To overcome these shortcomings, we introduce a system entailing two individual controllers for queue management and disturbance rejection, simultaneously. Simulation results based on Matlab/Simulink and also Network Simulator 2 (NS2) demonstrate the effectiveness of the procedure and verify the analytical approach.

18.
The application of Grid computing has been broadening day by day. An increasing number of users has created the need for a job scheduling process that can benefit them by optimizing their utility functions. On the other hand, resource providers are exploring strategies suitable for economically efficient resource allocation so that they can maximize their profit by satisfying more users. In such a scenario, economic-based resource management strategies (economic models) have been found to be compelling for both communities. However, existing research has identified that different economic models are suitable for different scenarios in Grid computing. Grid application and resource models are typically very dynamic, making it challenging for a particular model to deliver stable performance all the time. In this work, our focus is to develop an adaptive resource management architecture capable of dealing with multiple models based on the models' domains of strengths (DOS). Our preliminary results show promising outcomes when we consider multiple models rather than relying on a single model throughout the life cycle of a Grid.

19.
The paper considers connection admission control (CAC), which is a key resource management procedure, and proposes a solution based on modelling and control methodologies. The CAC problem is formulated as an optimal control problem subject to a set of constraints. The proposed controller, which models the CAC mechanism, computes the control variables so that (i) a set of proper constraints, which model the quality of service (QoS) requirements (link availability, blocking probability and dropping probability), are respected, and (ii) a proper performance index, which models the exploitation degree of the available bandwidth, is maximized. The proposed CAC compares favourably with other CACs proposed in the literature, and in particular significantly extends the upper limit of the accepted traffic rate.
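The constraint side of such a formulation can be illustrated with the classical Erlang B blocking-probability model: a new connection is admitted only while the blocking-probability constraint holds. This is a generic teletraffic sketch under assumed parameters, not the paper's optimal controller.

```python
def erlang_b(load, channels):
    """Blocking probability for an offered load (in Erlangs) on a link with
    the given number of channels, via the recursive Erlang B formula."""
    b = 1.0
    for n in range(1, channels + 1):
        b = load * b / (n + load * b)
    return b

def admit(current_calls, channels, offered_load, max_blocking):
    """Admit a new call only if capacity remains and the QoS constraint
    on blocking probability is satisfied."""
    return (current_calls < channels and
            erlang_b(offered_load, channels) <= max_blocking)
```

An optimal CAC would additionally maximize a bandwidth-exploitation index over such feasible admission decisions, which is the step the paper's controller adds.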

20.
On the one hand, data models decrease the complexity of information system development. On the other hand, data models cause additional complexity. Recently, structural analogies have been discussed as an instrument for reducing the complexity of data models. This piece of research presents a procedure to identify structural analogies in data models and demonstrates its performance by analyzing Scheer's reference model for industrial enterprises (Y-CIM-model). The proposed procedure is based on formalizing data models within set theory and uses a quantitative similarity measure. The obtained results show both identical and very similar information structures within the Y-CIM-model. Furthermore, ways of dealing with the identified structural analogies are discussed from an analysis and software design perspective.
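The set-theoretic formalization invites a simple quantitative similarity measure; the sketch below uses Jaccard similarity over attribute sets as a stand-in (the paper's actual measure and the toy entity types shown are assumptions for illustration):

```python
def jaccard(a, b):
    """Quantitative similarity of two attribute sets."""
    return len(a & b) / len(a | b) if a or b else 1.0

def structural_analogies(model, threshold):
    """All pairs of information structures whose similarity reaches the
    threshold -- candidates for structural analogy."""
    names = sorted(model)
    return [(x, y, jaccard(model[x], model[y]))
            for i, x in enumerate(names)
            for y in names[i + 1:]
            if jaccard(model[x], model[y]) >= threshold]

# Hypothetical fragment of a data model: entity type -> attribute set.
MODEL = {
    "PurchaseOrder": {"id", "date", "partner", "item", "quantity"},
    "SalesOrder":    {"id", "date", "partner", "item", "quantity"},
    "Employee":      {"id", "name", "department"},
}
```

A similarity of 1.0 flags identical structures (candidates for unification), while values just below the threshold point at the "very similar" structures the Y-CIM analysis reports.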


Copyright©北京勤云科技发展有限公司  京ICP备09084417号