Similar Documents (20 results)
1.
Process mining can be seen as the “missing link” between data mining and business process management. The lion's share of process mining research has been devoted to the discovery of procedural process models from event logs. However, often there are predefined constraints that (partially) describe the normative or expected process, e.g., “activity A should be followed by B” or “activities A and B should never both be executed”. A collection of such constraints is called a declarative process model. Although it is possible to discover such models based on event data, this paper focuses on aligning event logs and predefined declarative process models. Discrepancies between log and model are mediated such that observed log traces are related to paths in the model. The resulting alignments provide sophisticated diagnostics that pinpoint where deviations occur and how severe they are. Moreover, selected parts of the declarative process model can be used to clean and repair the event log before applying other process mining techniques. Our alignment-based approach for preprocessing and conformance checking using declarative process models has been implemented in ProM and has been evaluated using both synthetic logs and real-life logs from a Dutch hospital.
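As a minimal illustration of the declarative style (not the paper's alignment algorithm, which mediates whole traces against the model), the sketch below naively checks a single Declare-style “response” constraint over one trace; the activity names and trace are hypothetical:

```python
# Minimal sketch: check a Declare-style "response" constraint --
# every occurrence of `a` must eventually be followed by `b`.
def response_violations(trace, a, b):
    """Return indices of occurrences of `a` with no later `b` in the trace."""
    violations = []
    for i, event in enumerate(trace):
        if event == a and b not in trace[i + 1:]:
            violations.append(i)
    return violations

trace = ["A", "C", "A", "B", "A"]            # hypothetical log trace
print(response_violations(trace, "A", "B"))  # -> [4]: the last A is never followed by B
```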

2.
The practical relevance of process mining is increasing as more and more event data become available. Process mining techniques aim to discover, monitor and improve real processes by extracting knowledge from event logs. The two most prominent process mining tasks are: (i) process discovery: learning a process model from example behavior recorded in an event log, and (ii) conformance checking: diagnosing and quantifying discrepancies between observed behavior and modeled behavior. The increasing volume of event data provides both opportunities and challenges for process mining. Existing process mining techniques have problems dealing with large event logs referring to many different activities. Therefore, we propose a generic approach to decompose process mining problems. The decomposition approach is generic and can be combined with different existing process discovery and conformance checking techniques. It is possible to split computationally challenging process mining problems into many smaller problems that can be analyzed easily and whose results can be combined into solutions for the original problems.
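A hedged sketch of the simplest form of the decomposition idea: project each trace onto the parts of an activity partition, producing sublogs that can be mined or checked independently. The partition, the log, and the helper name are illustrative assumptions, not the paper's API:

```python
# Sketch: project an event log onto subsets of activities, one sublog per part.
def project(log, activities):
    keep = set(activities)
    return [[e for e in trace if e in keep] for trace in log]

log = [["register", "check", "pay", "ship"],
       ["register", "pay", "check", "ship"]]
parts = [{"register", "check"}, {"pay", "ship"}]
sublogs = [project(log, p) for p in parts]
print(sublogs[0])  # [['register', 'check'], ['register', 'check']]
```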

3.
Process mining techniques relate observed behavior (i.e., event logs) to modeled behavior (e.g., a BPMN model or a Petri net). Process models can be discovered from event logs and conformance checking techniques can be used to detect and diagnose differences between observed and modeled behavior. Existing process mining techniques can only uncover these differences, but the actual repair of the model is left to the user and is not supported. In this paper we investigate the problem of repairing a process model w.r.t. a log such that the resulting model can replay the log (i.e., conforms to it) and is as similar as possible to the original model. To solve the problem, we use an existing conformance checker that aligns the runs of the given process model to the traces in the log. Based on this information, we decompose the log into several sublogs of non-fitting subtraces. For each sublog, either a loop is discovered that can replay the sublog or a subprocess is derived that is then added to the original model at the appropriate location. The approach is implemented in the process mining toolkit ProM and has been validated on logs and models from several Dutch municipalities.
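To make the sublog-extraction step concrete, here is an illustrative sketch (the data shapes are assumptions; the real approach works on alignments produced by a conformance checker): group the activities of “log-only” moves by the model location where they occur, yielding one sublog per candidate repair site:

```python
# Sketch: derive sublogs of non-replayable behavior from precomputed alignments.
from collections import defaultdict

def sublogs_of_log_moves(alignments):
    """Group activities of log-only moves by the model position where they occur."""
    subs = defaultdict(list)
    for moves in alignments:                    # one alignment per trace
        for kind, activity, position in moves:  # kind: "sync" | "log" | "model"
            if kind == "log":                   # observed but not replayable
                subs[position].append(activity)
    return dict(subs)

# hypothetical alignment of one trace: X and Y cannot be replayed after position 2
alignments = [[("sync", "a", 1), ("log", "X", 2), ("log", "Y", 2), ("sync", "b", 3)]]
print(sublogs_of_log_moves(alignments))  # {2: ['X', 'Y']}
```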

4.
An exponential growth of event data can be witnessed across all industries. Devices connected to the internet (internet of things), social interaction, mobile computing, and cloud computing provide new sources of event data and this trend will continue. The omnipresence of large amounts of event data is an important enabler for process mining. Process mining techniques can be used to discover, monitor and improve real processes by extracting knowledge from observed behavior. However, unprecedented volumes of event data also provide new challenges and often state-of-the-art process mining techniques cannot cope. This paper focuses on “conformance checking in the large” and presents a novel decomposition technique that partitions larger process models and event logs into smaller parts that can be analyzed independently. The so-called Single-Entry Single-Exit (SESE) decomposition not only helps to speed up conformance checking, but also provides improved diagnostics. The analyst can zoom in on the problematic parts of the process. Importantly, the conditions under which the conformance of the whole can be assessed by verifying the conformance of the SESE parts are described, which enables the decomposition and distribution of large conformance checking problems. All the techniques have been implemented in ProM, and experimental results are provided.
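A toy rendering of the divide-and-conquer claim (not the SESE decomposition algorithm itself): under the conditions described in the paper, a trace fits the whole model if and only if each of its projections fits the corresponding fragment. Here `fits` and the set-based “submodels” are stand-ins for a real replay check:

```python
# Sketch: assess conformance of the whole by checking each fragment separately.
def fits_decomposed(trace, fragments, fits):
    """fragments: list of (activity_set, submodel); fits(subtrace, submodel) -> bool."""
    return all(
        fits([e for e in trace if e in acts], sub)
        for acts, sub in fragments
    )

# toy stand-in: a "submodel" is just the set of subtraces it can replay
toy_fits = lambda subtrace, submodel: tuple(subtrace) in submodel
fragments = [({"a", "b"}, {("a", "b")}), ({"c"}, {("c",), ()})]
print(fits_decomposed(["a", "c", "b"], fragments, toy_fits))  # True
```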

5.
There seems to be a never ending stream of new process modeling notations. Some of these notations are foundational and have been around for decades (e.g., Petri nets). Other notations are vendor specific, incremental, or are only popular for a short while. Discussions on the various competing notations concealed the more important question “What makes a good process model?”. Fortunately, large scale experiences with process mining allow us to address this question. Process mining techniques can be used to extract knowledge from event data, discover models, align logs and models, measure conformance, diagnose bottlenecks, and predict future events. Today’s processes leave many trails in data bases, audit trails, message logs, transaction logs, etc. Therefore, it makes sense to relate these event data to process models independent of their particular notation. Process models discovered based on the actual behavior tend to be very different from the process models made by humans. Moreover, conformance checking techniques often reveal important deviations between models and reality. The lessons that can be learned from process mining shed a new light on process model quality. This paper discusses the role of process models and lists seven problems related to process modeling, drawing on our experiences in over 100 process mining projects. Moreover, we show that these problems can be addressed by exposing process models and modelers to event data.

6.
Process-aware information systems (PAIS) are systems relying on processes, which involve human and software resources to achieve concrete goals. There is a need to develop approaches for modeling, analyzing, improving and monitoring processes within PAIS. These approaches include process mining techniques used to discover process models from event logs, find log and model deviations, and analyze performance characteristics of processes. The representational bias (a way to model processes) plays an important role in process mining. The BPMN 2.0 (Business Process Model and Notation) standard is widely used and makes it possible to build conventional and understandable process models. In addition to the flat control flow perspective, subprocesses, data flows, and resources can be integrated within one BPMN diagram. This makes BPMN very attractive for both process miners and business users, since the control flow perspective can be integrated with data and resource perspectives discovered from event logs. In this paper, we describe and justify robust control flow conversion algorithms, which provide the basis for more advanced BPMN-based discovery and conformance checking algorithms. Thus, on the basis of these conversion algorithms, low-level models (such as Petri nets, causal nets and process trees) discovered from event logs using existing approaches can be represented in terms of BPMN. Moreover, we establish behavioral relations between Petri nets and BPMN models and use them to adopt existing conformance checking and performance analysis techniques in order to visualize conformance and performance information within a BPMN diagram. We believe that the results presented in this paper can be used for a wide variety of BPMN mining and conformance checking algorithms. We also provide metrics for the processes discovered before and after the conversion to BPMN structures. Cases for which conversion algorithms produce more compact or more complicated BPMN models in comparison with the initial models are identified.
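As a small, hypothetical illustration of the control-flow conversion idea (the paper's algorithms cover Petri nets, causal nets, and process trees with more constructs), the sketch below maps a process tree with sequence and XOR operators onto dict-based stand-ins for BPMN tasks, gateways, and sequence flows:

```python
# Sketch: convert a tiny process tree to BPMN-like nodes and sequence flows.
import itertools

_ids = itertools.count()

def fresh(kind):
    return f"{kind}_{next(_ids)}"

def tree_to_bpmn(node, nodes, flows):
    """node: ("task", name) | ("seq", [children]) | ("xor", [children]).
    Returns (entry_id, exit_id) of the translated fragment."""
    op = node[0]
    if op == "task":
        nid = fresh("task")
        nodes[nid] = node[1]
        return nid, nid
    if op == "seq":                      # chain children with sequence flows
        entry, last_exit = None, None
        for child in node[1]:
            c_in, c_out = tree_to_bpmn(child, nodes, flows)
            if entry is None:
                entry = c_in
            else:
                flows.append((last_exit, c_in))
            last_exit = c_out
        return entry, last_exit
    if op == "xor":                      # exclusive split/join gateways
        split, join = fresh("xor_split"), fresh("xor_join")
        nodes[split] = nodes[join] = "XOR"
        for child in node[1]:
            c_in, c_out = tree_to_bpmn(child, nodes, flows)
            flows.append((split, c_in))
            flows.append((c_out, join))
        return split, join
    raise ValueError(op)

nodes, flows = {}, []
tree = ("seq", [("task", "register"), ("xor", [("task", "approve"), ("task", "reject")])])
tree_to_bpmn(tree, nodes, flows)
print(nodes, flows, sep="\n")
```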

7.
Process mining techniques allow for extracting information from event logs. For example, the audit trails of a workflow management system or the transaction logs of an enterprise resource planning system can be used to discover models describing processes, organizations, and products. Traditionally, process mining has been applied to structured processes. In this paper, we argue that process mining can also be applied to less structured processes supported by computer supported cooperative work (CSCW) systems. In addition, the ProM framework is described. Using ProM, a wide variety of process mining activities is supported, ranging from process discovery and verification to conformance checking and social network analysis.

8.
9.
A business process (BP) consists of a set of activities which are performed in coordination in an organizational and technical environment and which jointly realize a business goal. In such a context, BP management (BPM) can be seen as supporting BPs using methods, techniques, and software in order to design, enact, control, and analyze operational processes involving humans, organizations, applications, and other sources of information. Since the accurate management of BPs is receiving increasing attention, conformance checking, i.e., verifying whether the observed behavior matches a modelled behavior, is becoming more and more critical. Moreover, declarative languages are more frequently used to provide increased flexibility. However, whereas there exist solid conformance checking techniques for imperative models, little work has been conducted for declarative models. Furthermore, usually only the control-flow perspective is considered, although other perspectives (e.g., data) are crucial. In addition, most approaches exclusively check the conformance without providing any related diagnostics. To enhance the accurate management of flexible BPs, this work presents a constraint-based approach for conformance checking over declarative BP models (including both control-flow and data perspectives). In addition, two constraint-based proposals for providing related diagnosis are detailed. To demonstrate both the effectiveness and the efficiency of the proposed approaches, the analysis of different performance measures related to a widely diversified set of test models of varying complexity has been performed.
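For flavor, a toy data-aware check in the spirit of this multi-perspective setting (the paper encodes such checks as constraint problems; the activity names, attribute, and threshold below are invented):

```python
# Sketch of a data-aware declarative check: "every 'approve' event
# with amount > limit must later be followed by 'audit'".
def data_aware_response(trace, amount_limit=1000):
    """trace: list of (activity, attributes) pairs; returns offending indices."""
    bad = []
    for i, (act, attrs) in enumerate(trace):
        if act == "approve" and attrs.get("amount", 0) > amount_limit:
            if not any(a == "audit" for a, _ in trace[i + 1:]):
                bad.append(i)
    return bad

trace = [("approve", {"amount": 5000}), ("ship", {})]
print(data_aware_response(trace))  # [0]: large approval never audited
```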

10.
Nowadays, business processes are increasingly supported by IT services that produce massive amounts of event data during the execution of a process. These event data can be used to analyze the process using process mining techniques to discover the real process, measure conformance to a given process model, or to enhance existing models with performance information. Mapping the produced events to activities of a given process model is essential for conformance checking, annotation and understanding of process mining results. In order to accomplish this mapping with low manual effort, we developed a semi-automatic approach that maps events to activities using insights from behavioral analysis and label analysis. The approach extracts Declare constraints from both the log and the model to build matching constraints to efficiently reduce the number of possible mappings. These mappings are further reduced using techniques from natural language processing, which allow for a matching based on labels and external knowledge sources. The evaluation with synthetic and real-life data demonstrates the effectiveness of the approach and its robustness toward non-conforming execution logs.
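A minimal sketch of the label-analysis step alone: score event/activity label pairs by token overlap (Jaccard) and keep plausible candidates. The paper's approach additionally prunes mappings via Declare behavioral constraints and external knowledge sources; the labels and threshold here are assumptions:

```python
# Sketch: prune event-to-activity mapping candidates by label similarity.
def jaccard(a, b):
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb)

def candidate_mappings(event_labels, activity_labels, threshold=0.4):
    return {
        e: [a for a in activity_labels if jaccard(e, a) >= threshold]
        for e in event_labels
    }

events = ["create purchase order", "order payment received"]
activities = ["Create Order", "Payment Received"]
print(candidate_mappings(events, activities))
# {'create purchase order': ['Create Order'], 'order payment received': ['Payment Received']}
```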

11.
Over the past decade process mining has emerged as a new analytical discipline able to answer a variety of questions based on event data. Event logs have a very particular structure; events have timestamps, refer to activities and resources, and need to be correlated to form process instances. Process mining results tend to be very different from classical data mining results, e.g., process discovery may yield end-to-end process models capturing different perspectives rather than decision trees or frequent patterns. A process-mining tool like ProM provides hundreds of different process mining techniques ranging from discovery and conformance checking to filtering and prediction. Typically, a combination of techniques is needed and, for every step, there are different techniques that may be very sensitive to parameter settings. Moreover, event logs may be huge and may need to be decomposed and distributed for analysis. These aspects make it very cumbersome to analyze event logs manually. Process mining should be repeatable and automated. Therefore, we propose a framework to support the analysis of process mining workflows. Existing scientific workflow systems and data mining tools are not tailored towards process mining and the artifacts used for analysis (process models and event logs). This paper structures the basic building blocks needed for process mining and describes various analysis scenarios. Based on these requirements we implemented RapidProM, a tool supporting scientific workflows for process mining. Examples illustrating the different scenarios are provided to show the feasibility of the approach.

12.
Business processes leave trails in a variety of data sources (e.g., audit trails, databases, and transaction logs). Hence, every process instance can be described by a trace, i.e., a sequence of events. Process mining techniques are able to extract knowledge from such traces and provide a welcome extension to the repertoire of business process analysis techniques. Recently, process mining techniques have been adopted in various commercial BPM systems (e.g., BPM|one, Futura Reflect, ARIS PPM, Fujitsu Interstage, Businesscape, Iontas PDF, and QPR PA). Unfortunately, traditional process discovery algorithms have problems dealing with less structured processes. The resulting models are difficult to comprehend or even misleading. Therefore, we propose a new approach based on trace alignment. The goal is to align traces in such a way that event logs can be explored easily. Trace alignment can be used to explore the process in the early stages of analysis and to answer specific questions in later stages of analysis. Hence, it complements existing process mining techniques focusing on discovery and conformance checking. The proposed techniques have been implemented as plugins in the ProM framework. We report the results of trace alignment on one synthetic and two real-life event logs, and show that trace alignment has significant promise in process diagnostic efforts.
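The paper aligns many traces simultaneously, in the spirit of multiple sequence alignment; as a hedged, minimal stand-in, the sketch below aligns just two traces with gap symbols using classic edit-distance dynamic programming:

```python
# Sketch: pairwise trace alignment via edit-distance DP with traceback.
def align(s, t, gap="-"):
    n, m = len(s), len(t)
    d = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        d[i][0] = i
    for j in range(m + 1):
        d[0][j] = j
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d[i][j] = min(d[i - 1][j] + 1, d[i][j - 1] + 1,
                          d[i - 1][j - 1] + (s[i - 1] != t[j - 1]))
    a, b, i, j = [], [], n, m          # traceback from the bottom-right corner
    while i or j:
        if i and j and d[i][j] == d[i - 1][j - 1] + (s[i - 1] != t[j - 1]):
            a.append(s[i - 1]); b.append(t[j - 1]); i -= 1; j -= 1
        elif i and d[i][j] == d[i - 1][j] + 1:
            a.append(s[i - 1]); b.append(gap); i -= 1
        else:
            a.append(gap); b.append(t[j - 1]); j -= 1
    return a[::-1], b[::-1]

print(align("acdb", "abdb"))  # (['a','c','d','b'], ['a','b','d','b'])
```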

13.
Process mining is the research domain that is dedicated to the a posteriori analysis of business process executions. The techniques developed within this research area are specifically designed to provide profound insight by exploiting the untapped reservoir of knowledge that resides within event logs of information systems. Process discovery is one specific subdomain of process mining that entails the discovery of control-flow models from such event logs. Assessing the quality of discovered process models is an essential element, both for conducting process mining research and for using process mining in practice. In this paper, a multi-dimensional quality assessment is presented in order to comprehensively evaluate process discovery techniques. In contrast to previous studies, the major contribution of this paper is the use of eight real-life event logs. For instance, we show that evaluation based on real-life event logs significantly differs from the traditional approach to assess process discovery techniques using artificial event logs. In addition, we provide an extensive overview of available process discovery techniques and we describe how discovered process models can be assessed regarding both accuracy and comprehensibility. The results of our study indicate that the HeuristicsMiner algorithm is especially suited in a real-life setting. However, it is also shown that, particularly for highly complex event logs, knowledge discovery from such data sets can become a major problem for traditional process discovery techniques.

14.
Traditional process mining techniques offer limited possibilities to analyze business processes operating in dynamic environments with low predictability. Recently, to close this gap, declarative process models have been introduced to represent process mining results, since they allow for describing complex behaviors as a compact set of business rules. However, in this context, activities of a business process are still considered atomic/instantaneous events. This is a strong limitation because often, in realistic environments, process activities are not instantaneous but are executed across a time interval, passing through a sequence of lifecycle states. This paper investigates how the existing techniques for the discovery of declarative process models can be adapted when the business process under analysis contains non-atomic activities. In particular, we base our proposed approach on the use of discriminative rule mining to determine how the characteristics of the activity lifecycles in a business process influence the validity of a business rule in that process. The approach has been implemented as a plug-in of the process mining tool ProM and validated on synthetic logs and on a real-life log recorded by an incident and problem management system called VINST in use at Volvo IT Belgium.
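To ground the notion of non-atomic activities, a small sketch (the attribute layout is an assumption) that pairs start/complete lifecycle events into activity instances with durations, the raw material for lifecycle-aware rules:

```python
# Sketch: reconstruct non-atomic activity instances from lifecycle events.
def activity_instances(trace):
    """trace: list of (activity, lifecycle, timestamp); pairs start/complete events."""
    open_starts, instances = {}, []
    for act, phase, ts in trace:
        if phase == "start":
            open_starts.setdefault(act, []).append(ts)
        elif phase == "complete" and open_starts.get(act):
            begin = open_starts[act].pop(0)
            instances.append((act, begin, ts, ts - begin))  # (activity, start, end, duration)
    return instances

trace = [("triage", "start", 0), ("triage", "complete", 30),
         ("treat", "start", 30), ("treat", "complete", 90)]
print(activity_instances(trace))  # durations: triage 30, treat 60
```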

15.
Given a model of the expected behavior of a business process and given an event log recording its observed behavior, the problem of business process conformance checking is that of identifying and describing the differences between the process model and the event log. A desirable feature of a conformance checking technique is that it should identify a minimal yet complete set of differences. Existing conformance checking techniques that fulfill this property exhibit limited scalability when confronted with large and complex process models and event logs. One reason for this limitation is that existing techniques compare each execution trace in the log against the process model separately, without reusing computations made for one trace when processing subsequent traces. Yet, the execution traces of a business process typically share common fragments (e.g. prefixes and suffixes). A second reason is that these techniques do not integrate mechanisms to tackle the combinatorial state explosion inherent to process models with high levels of concurrency. This paper presents two techniques that address these sources of inefficiency. The first technique starts by transforming the process model and the event log into two automata. These automata are then compared based on a synchronized product, which is computed using A* search with an admissible heuristic function, thus guaranteeing that the resulting synchronized product captures all differences and is minimal in size. The synchronized product is then used to extract optimal (minimal-length) alignments between each trace of the log and the closest corresponding trace of the model. By representing the event log as a single automaton, this technique allows computations for shared prefixes and suffixes to be made only once. The second technique decomposes the process model into a set of automata, known as S-components, such that the product of these automata is equal to the automaton of the whole process model. A product automaton is computed for each S-component separately. The resulting product automata are then recomposed into a single product automaton capturing all the differences between the process model and the event log, but without minimality guarantees. An empirical evaluation using 40 real-life event logs shows that, used in tandem, the proposed techniques outperform state-of-the-art baselines in terms of execution times in a vast majority of cases, with improvements ranging from several-fold to one order of magnitude. Moreover, the decomposition-based technique leads to optimal trace alignments for the vast majority of datasets and close to optimal alignments for the remaining ones.
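A compact sketch of the synchronized-product idea: search the joint space of (trace position, model state), where synchronous moves are free and log-only/model-only moves cost 1. Plain Dijkstra stands in here for the paper's A* with an admissible heuristic, and the automaton encoding is an assumption:

```python
# Sketch: minimal alignment cost between a trace and a model automaton.
import heapq

def alignment_cost(trace, delta, start, finals):
    """delta: {state: {activity: next_state}}; returns minimal alignment cost."""
    pq, seen = [(0, 0, start)], set()
    while pq:
        cost, i, s = heapq.heappop(pq)
        if (i, s) in seen:
            continue
        seen.add((i, s))
        if i == len(trace) and s in finals:
            return cost
        if i < len(trace):
            nxt = delta.get(s, {}).get(trace[i])
            if nxt is not None:                       # synchronous move, free
                heapq.heappush(pq, (cost, i + 1, nxt))
            heapq.heappush(pq, (cost + 1, i + 1, s))  # log-only move
        for nxt in delta.get(s, {}).values():         # model-only moves
            heapq.heappush(pq, (cost + 1, i, nxt))
    return float("inf")

# toy model: a -> b -> c
delta = {0: {"a": 1}, 1: {"b": 2}, 2: {"c": 3}}
print(alignment_cost(["a", "c"], delta, 0, {3}))  # 1: b is skipped (model-only move)
```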

16.
Service processes, for example in transportation, telecommunications or the health sector, are the backbone of today's economies. Conceptual models of service processes enable operational analysis that supports, e.g., resource provisioning or delay prediction. In the presence of event logs containing recorded traces of process execution, such operational models can be mined automatically. In this work, we target the analysis of resource-driven, scheduled processes based on event logs. We focus on processes for which there exists a pre-defined assignment of activity instances to resources that execute activities. Specifically, we approach the questions of conformance checking (how to assess the conformance of the schedule and the actual process execution) and performance improvement (how to improve the operational process performance). The first question is addressed based on a queueing network for both the schedule and the actual process execution. Based on these models, we detect operational deviations and then apply statistical inference and similarity measures to validate the scheduling assumptions, thereby identifying root-causes for these deviations. These results are the starting point for our technique to improve the operational performance. It suggests adaptations of the scheduling policy of the service process to decrease the tardiness (non-punctuality) and lower the flow time. We demonstrate the value of our approach based on a real-world dataset comprising clinical pathways of an outpatient clinic that have been recorded by a real-time location system (RTLS). Our results indicate that the presented technique enables localization of operational bottlenecks along with their root-causes, while our improvement technique yields a decrease in median tardiness and flow time by more than 20%.
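For concreteness, a toy computation of the two performance notions used here, with field names as assumptions: tardiness = max(0, actual end − scheduled end) and flow time = actual end − arrival:

```python
# Sketch: median tardiness and flow time over a set of cases.
from statistics import median

def tardiness_and_flow(cases):
    """cases: list of dicts with arrival, scheduled_end, actual_end timestamps."""
    tard = [max(0, c["actual_end"] - c["scheduled_end"]) for c in cases]
    flow = [c["actual_end"] - c["arrival"] for c in cases]
    return median(tard), median(flow)

cases = [{"arrival": 0, "scheduled_end": 30, "actual_end": 45},
         {"arrival": 10, "scheduled_end": 40, "actual_end": 38}]
print(tardiness_and_flow(cases))  # (7.5, 36.5)
```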

17.
Many companies have adopted Process-aware Information Systems (PAIS) to support their business processes in some form. On the one hand these systems typically log events (e.g., in transaction logs or audit trails) related to the actual business process executions. On the other hand explicit process models describing how the business process should (or is expected to) be executed are frequently available. Together with the data recorded in the log, this situation raises the interesting question “Do the model and the log conform to each other?”. Conformance checking, also referred to as conformance analysis, aims at the detection of inconsistencies between a process model and its corresponding execution log, and their quantification by the formation of metrics. This paper proposes an incremental approach to check the conformance of a process model and an event log. First of all, the fitness between the log and the model is measured (i.e., “Does the observed process comply with the control flow specified by the process model?”). Second, the appropriateness of the model can be analyzed with respect to the log (i.e., “Does the model describe the observed process in a suitable way?”). Appropriateness can be evaluated from both a structural and a behavioral perspective. To operationalize the ideas presented in this paper a Conformance Checker has been implemented within the ProM framework, and it has been evaluated using artificial and real-life event logs.
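As a sketch of how such a fitness metric can be operationalized (following the token-based formulation commonly used in this line of work, with per-trace replay counts assumed precomputed): f = ½(1 − Σm/Σc) + ½(1 − Σr/Σp), where m, c, r, p are missing, consumed, remaining, and produced tokens:

```python
# Sketch: aggregate token-based fitness over replayed traces.
def token_fitness(replays):
    """replays: list of (missing, consumed, remaining, produced) per trace."""
    M = sum(m for m, _, _, _ in replays)
    C = sum(c for _, c, _, _ in replays)
    R = sum(r for _, _, r, _ in replays)
    P = sum(p for _, _, _, p in replays)
    return 0.5 * (1 - M / C) + 0.5 * (1 - R / P)

print(token_fitness([(0, 5, 0, 5), (1, 5, 1, 5)]))  # 0.9: one of two traces deviates
```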

18.
The prevailing approach to conformance checking with event logs is to replay the task sequences of the log on the process model and to judge the degree of conformance by counting the task sequences the model can reproduce and the number of tasks the model could trigger during replay that do not belong to the executed sequence. This judgment, however, is not complete: if the model contains many choice constructs, then even a log generated by the model itself will be misjudged as poorly conforming, because replaying its tasks enables many tasks outside the current sequence; conversely, if the model contains only a few concurrent constructs and mostly sequential ones, then even a log that covers only the sequential parts and does not actually belong to the model will be misjudged as highly conforming, because during replay only a few tasks block the model while most tasks remain executable. To address these weaknesses, this paper proposes a bidirectional checking criterion, using the log content to verify the structural correctness of the model and the model structure to verify the completeness of the log content, together with a new type of log, the Token Log, whose content features correspond one-to-one to the structural features of the model. Applied to conformance checking between process models and system logs, it makes the checking and judgment procedure clearer and more concise, and its results more accurate.
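A toy illustration of the weakness described above (the enabled-sets here are hypothetical): counting activities that were enabled but never fired penalizes choice-heavy models even for perfectly fitting traces:

```python
# Sketch: count enabled-but-unfired activities during replay of a fitting trace.
def extra_enabled(trace, enabled_after):
    """enabled_after: maps replay position -> set of enabled activities."""
    return sum(len(enabled_after[i] - {act}) for i, act in enumerate(trace))

trace = ["a", "b"]
enabled = {0: {"a", "x", "y"}, 1: {"b", "z"}}  # hypothetical XOR alternatives
print(extra_enabled(trace, enabled))           # 3: x, y, z never fired
```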

19.
An automated process discovery technique generates a process model from an event log recording the execution of a business process. For it to be useful, the generated process model should be as simple as possible, while accurately capturing the behavior recorded in, and implied by, the event log. Most existing automated process discovery techniques generate flat process models. When confronted with large event logs, these approaches lead to overly complex or inaccurate process models. An alternative is to apply a divide-and-conquer approach by decomposing the process into stages and discovering one model per stage. It turns out, however, that existing divide-and-conquer process discovery approaches often produce less accurate models than flat discovery techniques, when applied to real-life event logs. This article proposes an automated method to identify business process stages from an event log and an automated technique to discover process models based on a given stage-based process decomposition. An experimental evaluation shows that: (i) relative to existing automated process decomposition methods in the field of process mining, the proposed method leads to stage-based decompositions that are closer to decompositions derived by human experts; and (ii) the proposed stage-based process discovery technique outperforms existing flat and divide-and-conquer discovery techniques with respect to well-accepted measures of accuracy and achieves comparable results in terms of model complexity.
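A sketch of the stage-splitting step only, with the stage boundaries given (finding them automatically is the paper's contribution): cut each trace by an ordered partition of activities and collect one sublog per stage:

```python
# Sketch: split an event log into per-stage sublogs given stage boundaries.
def split_by_stages(log, stages):
    """stages: ordered list of activity sets; returns one sublog per stage."""
    sublogs = [[] for _ in stages]
    for trace in log:
        for k, acts in enumerate(stages):
            sublogs[k].append([e for e in trace if e in acts])
    return sublogs

log = [["apply", "screen", "interview", "offer"]]
stages = [{"apply", "screen"}, {"interview", "offer"}]
print(split_by_stages(log, stages))
# [[['apply', 'screen']], [['interview', 'offer']]]
```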

20.