Similar Documents
20 similar documents found
1.
Contemporary information systems (e.g., WfM, ERP, CRM, SCM, and B2B systems) record business events in so-called event logs. Business process mining takes these logs to discover process, control, data, organizational, and social structures. Although many researchers are developing new and more powerful process mining techniques and software vendors are incorporating these in their software, few of the more advanced process mining techniques have been tested on real-life processes. This paper describes the application of process mining in one of the provincial offices of the Dutch National Public Works Department, responsible for the construction and maintenance of the road and water infrastructure. Using a variety of process mining techniques, we analyzed the processing of invoices sent by the various subcontractors and suppliers from three different perspectives: (1) the process perspective, (2) the organizational perspective, and (3) the case perspective. For this purpose, we used some of the tools developed in the context of the ProM framework. The goal of this paper is to demonstrate the applicability of process mining in general and our algorithms and tools in particular.
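As an illustration of the three perspectives mentioned above, the following minimal Python sketch derives directly-follows relations (process perspective), handover-of-work counts (organizational perspective), and case durations (case perspective) from an invented toy invoice log. It is not the ProM tooling used in the case study; all event data and names are hypothetical.

```python
from collections import Counter
from datetime import datetime

# Toy event log: (case_id, activity, resource, timestamp). Illustrative only.
log = [
    ("c1", "register invoice", "alice", "2023-01-02 09:00"),
    ("c1", "check invoice",    "bob",   "2023-01-03 10:00"),
    ("c1", "pay invoice",      "carol", "2023-01-05 16:00"),
    ("c2", "register invoice", "alice", "2023-01-02 11:00"),
    ("c2", "pay invoice",      "carol", "2023-01-04 12:00"),
]

# Group events per case and order them by timestamp.
cases = {}
for case_id, activity, resource, ts in log:
    cases.setdefault(case_id, []).append(
        (datetime.fromisoformat(ts), activity, resource))
for events in cases.values():
    events.sort()

directly_follows = Counter()   # process perspective
handover_of_work = Counter()   # organizational perspective
case_durations = {}            # case perspective

for case_id, events in cases.items():
    for (t1, a1, r1), (t2, a2, r2) in zip(events, events[1:]):
        directly_follows[(a1, a2)] += 1
        handover_of_work[(r1, r2)] += 1
    case_durations[case_id] = events[-1][0] - events[0][0]

print(directly_follows)
print(handover_of_work)
print(case_durations)
```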

2.
Performing business process analysis in healthcare organizations is particularly difficult due to the highly dynamic, complex, ad hoc, and multi-disciplinary nature of healthcare processes. Process mining is a promising approach to obtain a better understanding about those processes by analyzing event data recorded in healthcare information systems. However, not all process mining techniques perform well in capturing the complex and ad hoc nature of clinical workflows. In this work we introduce a methodology for the application of process mining techniques that leads to the identification of regular behavior, process variants, and exceptional medical cases. The approach is demonstrated in a case study conducted at a hospital emergency service. For this purpose, we implemented the methodology in a tool that integrates the main stages of process analysis. The tool is specific to the case study, but the same methodology can be used in other healthcare environments.
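As a rough illustration of how regular behavior and exceptional cases can be separated in an event log, the sketch below groups invented emergency-department traces into variants by their activity sequence and splits them by a frequency threshold. The threshold and data are hypothetical; this is not the paper's methodology or tool.

```python
from collections import Counter

# Toy traces (ordered activity sequences per patient case); illustrative only.
traces = [
    ("triage", "examination", "discharge"),
    ("triage", "examination", "discharge"),
    ("triage", "examination", "lab test", "examination", "discharge"),
    ("triage", "resuscitation", "surgery", "icu admission"),
]

variants = Counter(traces)
total = sum(variants.values())

# Variants covering at least 30% of cases are treated as "regular behavior";
# the rest are candidate process variants / exceptional cases. The threshold
# is an arbitrary illustration, not a value prescribed by the methodology.
regular = {v: n for v, n in variants.items() if n / total >= 0.3}
exceptional = {v: n for v, n in variants.items() if n / total < 0.3}

print("regular:", regular)
print("exceptional:", exceptional)
```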

3.
It is increasingly common to see computer-based simulation used as a vehicle to model and analyze business processes for process management and improvement. While a number of business process management (BPM) and business process simulation (BPS) methodologies, approaches, and tools are available, it is desirable to have a systematic BPS approach for operational decision support, from constructing process models based on historical data to simulating processes for typical and common problems. In this paper, we propose a generic BPS approach for operational decision support which includes business process modeling and workflow simulation with the generated models. Processes are modeled with event graphs through process mining from workflow logs that integrate comprehensive information about the control-flow, data, and resource aspects of a business process. A case study of a credit card application process is presented to illustrate the steps involved in constructing an event graph, and the evaluation is reported in terms of precision, generalization, and robustness. Based on the constructed event graph model, we simulate the process under different scenarios and analyze the simulation logs for three generic problems in the case study: (1) a suitable resource allocation plan for different case arrival rates; (2) teamwork performance under different case arrival rates; and (3) evaluation and prediction of individual performance. Our experimental results show that the proposed approach is able to model business processes using event graphs and to simulate the processes for common operational decision support, which collectively play an important role in process management and improvement.
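The scenario analysis described above (resource allocation under different case arrival rates) can be approximated even with a very crude queueing simulation. The sketch below is such a stand-in, not the event-graph simulation of the paper: exponential inter-arrival and service times, a single queue, and a configurable number of resources, all invented for illustration.

```python
import random
import heapq

def simulate(arrival_rate, num_resources, num_cases=1000, mean_service=1.0):
    """Crude single-queue, multi-server simulation: returns the mean flow time."""
    random.seed(42)
    free_at = [0.0] * num_resources            # when each resource becomes free
    heapq.heapify(free_at)
    t, total_flow = 0.0, 0.0
    for _ in range(num_cases):
        t += random.expovariate(arrival_rate)          # case arrival
        start = max(t, heapq.heappop(free_at))         # wait for a free resource
        finish = start + random.expovariate(1.0 / mean_service)
        heapq.heappush(free_at, finish)
        total_flow += finish - t
    return total_flow / num_cases

# Compare resource allocation plans under different case arrival rates.
for rate in (0.5, 1.0, 2.0):
    for resources in (1, 2, 4):
        print(rate, resources, round(simulate(rate, resources), 2))
```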

4.
Process mining allows for the automated discovery of process models from event logs. These models provide insights and enable various types of model-based analysis. This paper demonstrates that the discovered process models can be extended with information to predict the completion time of running instances. There are many scenarios where it is useful to have reliable time predictions. For example, when a customer phones her insurance company for information about her insurance claim, she can be given an estimate for the remaining processing time. In order to do this, we provide a configurable approach to construct a process model, augment this model with time information learned from earlier instances, and use this to predict e.g., the completion time. To provide meaningful time predictions we use a configurable set of abstractions that allow for a good balance between “overfitting” and “underfitting”. The approach has been implemented in ProM and through several experiments using real-life event logs we demonstrate its applicability.
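A minimal sketch of the prediction idea, assuming a simple state abstraction (the set of activities seen so far) and invented historical traces: remaining times observed in history are grouped per abstracted state and averaged to predict the remaining time of a running case. The real approach in ProM supports a configurable family of abstractions; this is only an illustration.

```python
from collections import defaultdict
from statistics import mean

# Historical traces: lists of (activity, timestamp in hours); illustrative data.
history = [
    [("register", 0), ("assess", 5), ("decide", 30)],
    [("register", 0), ("assess", 8), ("decide", 20)],
    [("register", 0), ("decide", 12)],
]

def abstraction(prefix_activities):
    """State abstraction: the set of activities seen so far.
    Coarser or finer abstractions trade off over- vs. underfitting."""
    return tuple(sorted(prefix_activities))

# Learn remaining-time observations per abstracted state.
remaining = defaultdict(list)
for trace in history:
    end = trace[-1][1]
    acts = [a for a, _ in trace]
    for i, (_, ts) in enumerate(trace):
        remaining[abstraction(acts[: i + 1])].append(end - ts)

def predict(running_prefix):
    """Predict remaining time for a running case, or None for unseen states."""
    state = abstraction([a for a, _ in running_prefix])
    observations = remaining.get(state)
    return mean(observations) if observations else None

# A running case that has just finished "assess" at hour 6.
print(predict([("register", 0), ("assess", 6)]))
```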

5.
Process mining techniques allow for extracting information from event logs. For example, the audit trails of a workflow management system or the transaction logs of an enterprise resource planning system can be used to discover models describing processes, organizations, and products. Traditionally, process mining has been applied to structured processes. In this paper, we argue that process mining can also be applied to less structured processes supported by computer supported cooperative work (CSCW) systems. In addition, the ProM framework is described. Using ProM, a wide variety of process mining activities is supported, ranging from process discovery and verification to conformance checking and social network analysis.

6.
Business process parameters, constraints, and needs evolve continuously in ways that are hard to foresee initially, which requires continuous design support from business process management systems. In this article, we develop a reactive design approach based on process log analysis that supports process re-engineering and execution reliability. We propose to analyse workflow logs to discover the transactional behaviour of workflows and to subsequently improve and correct the related recovery mechanisms. Our approach starts by collecting workflow logs. Then, using statistical analysis techniques, we build an intermediate representation specifying elementary dependencies between activities. These dependencies are refined to mine the transactional workflow model. The analysis of the discrepancies between the discovered model and the initially designed model enables us to detect design gaps, particularly concerning the recovery mechanisms. Based on this mining step, we then apply a set of rules to the initially designed workflow to improve its reliability. The work presented in this paper was partially supported by the EU under the SUPER project (FP6-026850) and by the Lion project supported by Science Foundation Ireland under Grant No. SFI/02/CE1/I131.
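To make the "elementary dependencies between activities" step concrete, the sketch below computes a Heuristics-Miner-style dependency score from directly-follows counts in an invented log. It is a generic stand-in, not the statistical analysis or transactional-behaviour mining proposed in the article.

```python
from collections import Counter

# Toy workflow log as activity sequences per case; illustrative only.
traces = [
    ["start", "reserve", "pay", "confirm"],
    ["start", "reserve", "cancel", "compensate"],
    ["start", "reserve", "pay", "confirm"],
]

# Count directly-follows occurrences.
follows = Counter()
for trace in traces:
    for a, b in zip(trace, trace[1:]):
        follows[(a, b)] += 1

def dependency(a, b):
    """Heuristics-Miner-style dependency score between -1 and 1."""
    ab, ba = follows[(a, b)], follows[(b, a)]
    return (ab - ba) / (ab + ba + 1)

# Strong positive scores suggest causal dependencies; "cancel" followed by
# "compensate" hints at transactional (recovery) behaviour in the log.
for (a, b) in follows:
    print(f"{a} -> {b}: {dependency(a, b):.2f}")
```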

7.
In an inter-organizational setting, the manual construction of process models is challenging because the different people involved have to put together their partial knowledge about the overall process. Process mining, an automated technique to discover and analyze process models, can facilitate the construction of inter-organizational process models. This paper presents a technique to merge the input data of the different partners of an inter-organizational process in order to serve as input for process mining algorithms. The technique consists of a method for configuring and executing the merge and an algorithm that searches for links between the data of the different partners and that suggests rules to the user on how to merge the data. Tool support is provided in the open-source process mining framework ProM. The method and the algorithm are tested using two artificial and three real-life datasets that confirm their effectiveness and efficiency.
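A toy sketch of the merge step, assuming the partners' events can be linked on a shared attribute (here a hypothetical order_id): events from both sources are unioned, grouped per linked case, and ordered by timestamp. The actual algorithm additionally searches for such link rules and lets the user configure them; that part is not shown.

```python
# Toy event tables from two partners; in practice the link rule (here: a shared
# "order_id") is what the merge algorithm would propose to the user.
partner_a = [
    {"order_id": "o1", "activity": "create order",  "time": "2023-01-01T10:00"},
    {"order_id": "o1", "activity": "send order",    "time": "2023-01-01T12:00"},
]
partner_b = [
    {"order_id": "o1", "activity": "receive order", "time": "2023-01-01T13:00"},
    {"order_id": "o1", "activity": "ship goods",    "time": "2023-01-02T09:00"},
]

def merge_logs(logs, link_attribute):
    """Union the partners' events and group them into one trace per linked case."""
    merged = {}
    for source, log in logs.items():
        for event in log:
            key = event[link_attribute]
            merged.setdefault(key, []).append({**event, "source": source})
    for events in merged.values():
        events.sort(key=lambda e: e["time"])      # ISO timestamps sort lexically
    return merged

merged = merge_logs({"A": partner_a, "B": partner_b}, "order_id")
for case, events in merged.items():
    print(case, [(e["source"], e["activity"]) for e in events])
```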

8.
Most private and public organizations have recently turned their attention to the processes by which they operate, to improve service and product quality and customer satisfaction. To support business process reengineering, methods and tools for process modeling and analysis are required. The paper presents the ARTEMIS methodology and associated tool environment for business process analysis for reengineering. In the ARTEMIS methodological framework, business processes are modeled as workflows and are analyzed according to an organizational structure perspective and an operational structure perspective. With these two perspectives, the analyst can plan reengineering interventions based on the degree of autonomy/dependency of organization units, in terms of coupling, and on the inter-process semantic correspondences, in terms of data and operation similarity, respectively. The ARTEMIS methodology and associated tool environment have been conceived and applied in the framework of the PROGRESS research project. In the paper, we report on a reengineering case study of this project involving the Italian Ministry of Justice.

9.
A business process (BP) consists of a set of activities which are performed in coordination in an organizational and technical environment and which jointly realize a business goal. In this context, BP management (BPM) can be seen as supporting BPs using methods, techniques, and software in order to design, enact, control, and analyze operational processes involving humans, organizations, applications, and other sources of information. Since the accurate management of BPs is receiving increasing attention, conformance checking, i.e., verifying whether the observed behavior matches a modelled behavior, is becoming more and more critical. Moreover, declarative languages are increasingly used to provide greater flexibility. However, whereas solid conformance checking techniques exist for imperative models, little work has been conducted for declarative models. Furthermore, usually only the control-flow perspective is considered, although other perspectives (e.g., data) are crucial. In addition, most approaches exclusively check conformance without providing any related diagnostics. To enhance the accurate management of flexible BPs, this work presents a constraint-based approach for conformance checking over declarative BP models (including both the control-flow and data perspectives). In addition, two constraint-based proposals for providing related diagnostics are detailed. To demonstrate both the effectiveness and the efficiency of the proposed approaches, different performance measures have been analyzed over a wide and diversified set of test models of varying complexity.
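To illustrate what conformance checking over a declarative model with both control-flow and data perspectives can look like, the sketch below checks a Declare-style response constraint and a simple data condition over invented traces and reports basic diagnostics. The constraint set, data, and diagnostics format are illustrative and not the paper's constraint-based formulation.

```python
# Toy declarative model: a control-flow constraint ("every 'order' is eventually
# followed by 'pay'") plus a data constraint ("'pay' only with amount > 0").
# Both the constraints and the traces are invented for illustration.
traces = {
    "t1": [("order", {"amount": 100}), ("pay", {"amount": 100})],
    "t2": [("order", {"amount": 50})],
    "t3": [("order", {"amount": 0}), ("pay", {"amount": 0})],
}

def check_response(trace, a, b):
    """response(a, b): every occurrence of a is eventually followed by a b."""
    acts = [activity for activity, _ in trace]
    return all(b in acts[i + 1:] for i, act in enumerate(acts) if act == a)

def check_data(trace, activity, predicate):
    """Data perspective: the predicate must hold for every occurrence of activity."""
    return all(predicate(data) for act, data in trace if act == activity)

for name, trace in traces.items():
    diagnostics = []
    if not check_response(trace, "order", "pay"):
        diagnostics.append("response(order, pay) violated")
    if not check_data(trace, "pay", lambda d: d["amount"] > 0):
        diagnostics.append("data condition on 'pay' violated")
    print(name, diagnostics or ["conforming"])
```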

10.
A fundamental assumption of Business Process Management (BPM) is that redesign delivers refined and improved versions of business processes. This assumption, however, does not necessarily hold, and any required compensatory action may be delayed until a new round in the BPM life-cycle completes. Current approaches to process redesign face this problem in one way or another, which makes rapid process improvement a central research problem of BPM today. In this paper, we address this problem by integrating concepts from process execution with ideas from DevOps. More specifically, we develop a methodology called AB-BPM that offers process improvement validation in two phases: simulation and AB tests. Our simulation technique extracts decision probabilities and metrics from the event log of an existing process version and generates traces for the new process version based on this knowledge. The results of the simulation guide us towards AB testing, where two versions (A and B) are operational in parallel and any new process instance is routed to one of them. The routing decision is made at runtime on the basis of the achieved results for the registered performance metrics of each version. Our routing algorithm provides for ultimate convergence towards the best-performing version, whether it is the old or the new one. We demonstrate the efficacy of our methodology and techniques by conducting an extensive evaluation based on both synthetic and real-life data.
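A minimal stand-in for the runtime routing idea, using an epsilon-greedy rule over observed cycle times instead of the paper's routing algorithm: most new instances go to the version that currently looks better, with occasional exploration. All performance numbers below are simulated and hypothetical.

```python
import random
from statistics import mean

random.seed(7)

# Simulated cycle times (lower is better) for two process versions; the true
# performance gap is invented for illustration.
def run_instance(version):
    return random.gauss(10.0, 2.0) if version == "A" else random.gauss(8.0, 2.0)

observed = {"A": [], "B": []}
EPSILON = 0.1   # exploration rate; a stand-in for the paper's routing policy

def route():
    """Route a new process instance: mostly to the currently better version."""
    if random.random() < EPSILON or not (observed["A"] and observed["B"]):
        return random.choice(["A", "B"])
    return min(observed, key=lambda v: mean(observed[v]))

for _ in range(500):
    version = route()
    observed[version].append(run_instance(version))

# Over time, most instances should end up on the better-performing version.
for v in observed:
    print(v, len(observed[v]), round(mean(observed[v]), 2))
```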

11.
Business processes leave trails in a variety of data sources (e.g., audit trails, databases, and transaction logs). Hence, every process instance can be described by a trace, i.e., a sequence of events. Process mining techniques are able to extract knowledge from such traces and provide a welcome extension to the repertoire of business process analysis techniques. Recently, process mining techniques have been adopted in various commercial BPM systems (e.g., BPM|one, Futura Reflect, ARIS PPM, Fujitsu Interstage, Businesscape, Iontas PDF, and QPR PA). Unfortunately, traditional process discovery algorithms have problems dealing with less structured processes. The resulting models are difficult to comprehend or even misleading. Therefore, we propose a new approach based on trace alignment. The goal is to align traces in such a way that event logs can be explored easily. Trace alignment can be used to explore the process in the early stages of analysis and to answer specific questions in later stages of analysis. Hence, it complements existing process mining techniques focusing on discovery and conformance checking. The proposed techniques have been implemented as plugins in the ProM framework. We report the results of trace alignment on one synthetic and two real-life event logs, and show that trace alignment has significant promise in process diagnostic efforts.
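Trace alignment builds on sequence alignment as known from bioinformatics. The sketch below aligns just two invented traces with a dynamic-programming (edit-distance style) procedure and gap symbols; the paper's technique generalizes this to aligning many traces at once, which is not shown here.

```python
def align(trace1, trace2, gap="-"):
    """Pairwise trace alignment by dynamic programming (edit-distance style)."""
    n, m = len(trace1), len(trace2)
    cost = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        cost[i][0] = i
    for j in range(m + 1):
        cost[0][j] = j
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            match = 0 if trace1[i - 1] == trace2[j - 1] else 2   # mismatch penalty
            cost[i][j] = min(cost[i - 1][j - 1] + match,
                             cost[i - 1][j] + 1,                 # gap in trace2
                             cost[i][j - 1] + 1)                 # gap in trace1
    # Backtrack to recover the two aligned sequences.
    a1, a2, i, j = [], [], n, m
    while i > 0 or j > 0:
        diag = (0 if i > 0 and j > 0 and trace1[i - 1] == trace2[j - 1] else 2)
        if i > 0 and j > 0 and cost[i][j] == cost[i - 1][j - 1] + diag:
            a1.append(trace1[i - 1]); a2.append(trace2[j - 1]); i -= 1; j -= 1
        elif i > 0 and cost[i][j] == cost[i - 1][j] + 1:
            a1.append(trace1[i - 1]); a2.append(gap); i -= 1
        else:
            a1.append(gap); a2.append(trace2[j - 1]); j -= 1
    return list(reversed(a1)), list(reversed(a2))

# Two hypothetical traces; shared activities line up, the rest become gaps.
print(align(["register", "check", "pay"], ["register", "pay", "archive"]))
```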

12.
Evaluating workflow process designs using cohesion and coupling metrics (total citations: 1; self-citations: 0; citations by others: 1)
Irene, Hajo A., Wil M.P. Computers in Industry, 2008, 59(5): 420-437

13.
Given a model of the expected behavior of a business process and given an event log recording its observed behavior, the problem of business process conformance checking is that of identifying and describing the differences between the process model and the event log. A desirable feature of a conformance checking technique is that it should identify a minimal yet complete set of differences. Existing conformance checking techniques that fulfill this property exhibit limited scalability when confronted with large and complex process models and event logs. One reason for this limitation is that existing techniques compare each execution trace in the log against the process model separately, without reusing computations made for one trace when processing subsequent traces. Yet, the execution traces of a business process typically share common fragments (e.g. prefixes and suffixes). A second reason is that these techniques do not integrate mechanisms to tackle the combinatorial state explosion inherent to process models with high levels of concurrency. This paper presents two techniques that address these sources of inefficiency. The first technique starts by transforming the process model and the event log into two automata. These automata are then compared based on a synchronized product, which is computed using an A* search with an admissible heuristic function, thus guaranteeing that the resulting synchronized product captures all differences and is minimal in size. The synchronized product is then used to extract optimal (minimal-length) alignments between each trace of the log and the closest corresponding trace of the model. By representing the event log as a single automaton, this technique allows computations for shared prefixes and suffixes to be made only once. The second technique decomposes the process model into a set of automata, known as S-components, such that the product of these automata is equal to the automaton of the whole process model. A product automaton is computed for each S-component separately. The resulting product automata are then recomposed into a single product automaton capturing all the differences between the process model and the event log, but without minimality guarantees. An empirical evaluation using 40 real-life event logs shows that, used in tandem, the proposed techniques outperform state-of-the-art baselines in terms of execution times in a vast majority of cases, with improvements ranging from several-fold to one order of magnitude. Moreover, the decomposition-based technique leads to optimal trace alignments for the vast majority of datasets and close to optimal alignments for the remaining ones.
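A much-simplified sketch of the first technique: the model is given directly as a small hand-written automaton, the trace is a plain sequence, and a shortest-path search over their synchronized product yields a minimal-cost alignment with synchronous, log, and model moves. The heuristic is set to zero (plain Dijkstra) for brevity, whereas the paper relies on an admissible A* heuristic and on representing the whole log as one automaton; all model and trace data are invented.

```python
import heapq

# A small model automaton (a stand-in for the automaton derived from a process
# model): states, labelled transitions, and accepting states. Illustrative only.
transitions = {
    "s0": {"register": "s1"},
    "s1": {"check": "s2", "pay": "s3"},
    "s2": {"pay": "s3"},
    "s3": {},
}
accepting = {"s3"}

def optimal_alignment(trace, start="s0"):
    """Shortest-path search over the synchronized product of trace and model.
    Synchronous moves cost 0; log-only and model-only moves cost 1 each, so the
    cheapest path to an accepting state is a minimal-cost alignment."""
    frontier = [(0, 0, start, ())]        # (cost, trace position, state, moves)
    seen = set()
    while frontier:
        cost, pos, state, moves = heapq.heappop(frontier)
        if (pos, state) in seen:
            continue
        seen.add((pos, state))
        if pos == len(trace) and state in accepting:
            return cost, list(moves)
        if pos < len(trace):
            act = trace[pos]
            if act in transitions[state]:                 # synchronous move
                heapq.heappush(frontier, (cost, pos + 1, transitions[state][act],
                                          moves + ((act, act),)))
            heapq.heappush(frontier, (cost + 1, pos + 1, state,   # log move
                                      moves + ((act, ">>"),)))
        for act, nxt in transitions[state].items():               # model moves
            heapq.heappush(frontier, (cost + 1, pos, nxt, moves + ((">>", act),)))
    return None

# A trace with a superfluous second "pay": the optimal alignment has cost 1.
print(optimal_alignment(["register", "pay", "pay"]))
```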

14.
15.
In the domain of supply chain management (SCM), various software packages have been developed for planning business strategies. To solve the problem of system productivity in applying planning packages, we propose a solution concept, business process integration (BPI), which fuses workflow and enterprise application integration (EAI) technology. Two characteristic policies are included in BPI. The first is to design the minimum set of business processes for real-time information sharing with planning packages without changing other processes. The second is to integrate several systems with EAI technology and to manage their execution with a workflow tool. Based on these policies, we propose various design templates and integration adapters. Our evaluation shows that, using BPI, a target system can be developed with less manpower, in less time, and with higher quality than with previous methods.

16.
The standardization of processes and the identification of shared business services in a service-oriented architecture (SOA) are currently widely discussed. In practice, however, there is still a lack of appropriate instruments to support these tasks. In this paper, an approach for a process map is introduced that allows for a systematic and as complete as possible presentation of the processes in an enterprise (or division). After the processes have been consistently refined by means of aggregation/disaggregation and generalization/specialization relations, respectively, it is possible to identify primarily functional similarities among the detailed sub-processes. The application of the process map at a financial service provider (FSP) highlights how these similarities can be taken as a basis to standardize processes and to identify shared services.

17.
A key aspect of any process-oriented organisation is the evaluation of process performance for the achievement of its strategic and operational goals. Process Performance Indicators (PPIs) are a key asset for carrying out this evaluation, and, therefore, having an appropriate definition of these PPIs is crucial. After a careful review of the related literature and a study of the current situation in different real organisations, we conclude that no existing proposal allows PPIs to be defined in a way that is unambiguous and highly expressive, understandable by both technical and non-technical users, and traceable to the Business Process (BP). In addition, like other activities carried out during the BP lifecycle, the management of PPIs is considered time-consuming and error-prone. Therefore, providing automated support for it is very appealing from a practical point of view.
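One way to make a PPI definition unambiguous and machine-checkable is to express it as structured data and evaluate it against an event log, as in the hypothetical sketch below (average cycle time with a target of 48 hours). The structure shown is illustrative and is not the notation proposed in the paper.

```python
from datetime import datetime
from statistics import mean

# A PPI expressed as data rather than free text, so that it stays unambiguous
# and traceable to the process. This structure is invented for illustration.
ppi = {
    "id": "PPI-1",
    "measure": "cycle_time_hours",        # duration from first to last event
    "aggregation": "average",
    "target": {"operator": "<=", "value": 48},
    "process": "order handling",
}

# Toy event log: case id mapped to the ordered ISO timestamps of its events.
cases = {
    "c1": ["2023-01-01T08:00", "2023-01-02T10:00"],
    "c2": ["2023-01-01T09:00", "2023-01-04T09:00"],
}

def cycle_time_hours(timestamps):
    start = datetime.fromisoformat(timestamps[0])
    end = datetime.fromisoformat(timestamps[-1])
    return (end - start).total_seconds() / 3600

value = mean(cycle_time_hours(ts) for ts in cases.values())
fulfilled = value <= ppi["target"]["value"]
print(ppi["id"], round(value, 1), "hours; target met:", fulfilled)
```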

18.
The Office Document Architecture (ODA) is an International Standard developed by TC 97/SC 18 of the International Organization for Standardization (ISO) in close collaboration with CCITT's Study Group VIII and with ECMA. This paper describes the current state of a formal specification of the ODA document structures by mathematical means, and its use for conformance specification and conformance testing.

19.
It is common for large organizations to maintain repositories of business process models in order to document and to continuously improve their operations. Given such a repository, this paper deals with the problem of retrieving those models in the repository that most closely resemble a given process model or fragment thereof. Up to now, there has been a notable research gap in comparing different approaches to this problem and evaluating them in the same setting. Therefore, this paper presents three similarity metrics that can be used to answer queries on process repositories: (i) node matching similarity that compares the labels and attributes attached to process model elements; (ii) structural similarity that compares element labels as well as the topology of process models; and (iii) behavioral similarity that compares element labels as well as causal relations captured in the process model. These metrics are experimentally evaluated in terms of precision and recall. The results show that all three metrics yield comparable results, with structural similarity slightly outperforming the other two metrics. Also, all three metrics outperform text-based search engines when it comes to searching through a repository for similar business process models.
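A rough sketch of the first metric only: node matching similarity approximated by greedily pairing the most similar node labels (string similarity via difflib) and normalizing by model size. The paper's metrics are defined differently and also cover structure and behaviour; the labels and threshold here are invented.

```python
from difflib import SequenceMatcher

def label_similarity(a, b):
    """String similarity between two node labels, in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def node_matching_similarity(labels1, labels2, threshold=0.5):
    """Greedy node matching: pair the most similar labels above a threshold and
    score the matched pairs against the size of both models."""
    pairs = sorted(((label_similarity(a, b), a, b)
                    for a in labels1 for b in labels2), reverse=True)
    used1, used2, score = set(), set(), 0.0
    for sim, a, b in pairs:
        if sim < threshold:
            break
        if a not in used1 and b not in used2:
            used1.add(a); used2.add(b)
            score += sim
    return 2 * score / (len(labels1) + len(labels2))

# Two hypothetical process models described by their node labels.
model_a = ["Receive order", "Check credit", "Ship goods"]
model_b = ["Order received", "Credit check", "Goods shipped", "Archive"]
print(round(node_matching_similarity(model_a, model_b), 2))
```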

20.
This paper introduces the ideas behind BPML, the business process modelling language published by BPMI. BPML provides a process-centric (as opposed to a data-centric) metalanguage and execution model for business systems. It is underpinned by a strong mathematical foundation, the pi-calculus. The current paper is derived from supplementary appendices to a book which describes a ‘third wave’ approach to business process management [Business Process Management: The Third Wave, 2003]. The aim is to model business processes directly in an executable form, so that the mobility and mutability inherent in business behaviour are reflected and supported in the corresponding IT systems, erasing the present IT-business divide.

