20 similar documents found (search time: 0 ms)
1.
Business alignment: using process mining as a tool for Delta analysis and conformance testing  Cited: 1 (self-citations: 0, by others: 1)
W. M. P. van der Aalst 《Requirements Engineering》2005,10(3):198-211
Increasingly, business processes are being controlled and/or monitored by information systems. As a result, many business processes leave their “footprints” in transactional information systems, i.e., business events are recorded in so-called event logs. Process mining aims at exploiting these logs by providing techniques and tools for discovering process, control, data, organizational, and social structures from event logs; the basic idea of process mining is to diagnose business processes by mining event logs for knowledge. In this paper we focus on the potential use of process mining for measuring business alignment, i.e., comparing the real behavior of an information system or its users with the intended or expected behavior. We identify two ways to create and/or maintain the fit between business processes and supporting information systems: Delta analysis and conformance testing. Delta analysis compares the discovered model (i.e., an abstraction derived from the actual process) with some predefined process model (e.g., the workflow model or reference model used to configure the system). Conformance testing attempts to quantify the “fit” between the event log and some predefined process model. In this paper, we show that Delta analysis and conformance testing can be used to analyze business alignment as long as the actual events are logged and users have some control over the process.
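As a rough illustration of the conformance-testing idea (quantifying how well an event log fits a process model), here is a minimal Python sketch. The model representation, a set of allowed directly-follows pairs, and the fitness measure, the fraction of fully fitting traces, are simplifications chosen for brevity rather than the metrics van der Aalst proposes:

```python
# Toy conformance check: what fraction of logged traces only use
# transitions allowed by a (hypothetical) process model?
# The model here is simply a set of allowed directly-follows pairs.

def trace_fits(trace, allowed_pairs):
    """Return True if every consecutive activity pair in the trace is allowed."""
    return all((a, b) in allowed_pairs for a, b in zip(trace, trace[1:]))

def conformance(log, allowed_pairs):
    """Fraction of traces in the event log that fit the model (0.0 to 1.0)."""
    fitting = sum(trace_fits(t, allowed_pairs) for t in log)
    return fitting / len(log)

# Hypothetical model: register -> check -> (approve | reject)
model = {("register", "check"), ("check", "approve"), ("check", "reject")}
log = [
    ["register", "check", "approve"],   # conforms
    ["register", "check", "reject"],    # conforms
    ["register", "approve"],            # skips "check": a deviation
]
print(conformance(log, model))  # 0.666... -> imperfect business alignment
```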
2.
Process analytical technologies and real time process control: a review of some spectroscopic issues and challenges  Cited: 1 (self-citations: 0, by others: 1)
Process analytical technologies (PAT) are increasingly being explored and adopted by pharmaceutical and industrial biotechnology companies for enhanced process understanding, Quality by Design (QbD) and Real Time Release (RTR). To achieve these aspirations there is a critical need to extract the most information, and hence understanding, from complex and often ‘messy’ spectroscopic data. This contribution reviews a number of new approaches that have been shown to overcome the limitations of existing calibration/modelling methodologies and describes a practical system which would enhance robustness of the closed loop process control system and overall ‘control strategy’. Application studies are described of the use of on-line spectroscopy for the monitoring and control of a downstream solvent recovery column, batch cooling crystallization and pharmaceutical fermentation.
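For a sense of the calibration/modelling step that such spectroscopic monitoring depends on, here is a minimal sketch using partial least squares (PLS), a workhorse of chemometric calibration. The synthetic spectra, the component count, and the train/test split are illustrative assumptions, not data or methods from the review:

```python
# Minimal PLS calibration sketch on synthetic "spectra": predict an
# analyte concentration from many correlated wavelength channels.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 40, 200

# Synthetic ground truth: concentration drives a broad spectral peak plus noise.
conc = rng.uniform(0.0, 1.0, n_samples)
peak = np.exp(-0.5 * ((np.arange(n_wavelengths) - 100) / 15.0) ** 2)
spectra = np.outer(conc, peak) + 0.01 * rng.standard_normal((n_samples, n_wavelengths))

pls = PLSRegression(n_components=2)       # component count is illustrative
pls.fit(spectra[:30], conc[:30])          # calibrate on 30 samples
pred = pls.predict(spectra[30:]).ravel()  # predict held-out samples
print(np.round(pred - conc[30:], 3))      # small residuals => usable calibration
```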
3.
The Business Process Execution Language for Web Services (BPEL4WS, or BPEL for short) is an XML-based workflow definition language that can serve as a foundation for enterprise workflow modeling and for implementing workflow management systems. This paper first introduces the basic concepts of workflow and BPEL4WS, then walks through a BPEL4WS process using a concrete example, and finally presents an implementation of a workflow management system based on BPEL4WS.
4.
Daniel Méndez Fernández Stefan Wagner Klaus Lochmann Andrea Baumann Holger de Carne 《Information and Software Technology》2012,54(2):162-178
Context
Requirements Engineering (RE) is a critical discipline mostly driven by uncertainty, since it is influenced by the customer domain or by the development process model used. Volatile project environments restrict the choice of methods and the decision about which artefacts to produce in RE.
Objective
We aim to investigate RE processes in successful project environments to discover characteristics and strategies that allow us to elaborate RE tailoring approaches in the future.
Method
We perform a field study on a set of projects at one company. First, we investigate by content analysis which RE artefacts were produced in each project and to what extent they were produced. Second, we perform qualitative analysis of semi-structured interviews to discover project parameters that relate to the produced artefacts. Third, we use cluster analysis to infer artefact patterns and probable RE execution strategies, which are the responses to specific project parameters. Fourth, we investigate by statistical tests the effort spent in each strategy in relation to the effort spent in change requests to evaluate the efficiency of execution strategies.
Results
We identified three artefact patterns and corresponding execution strategies. Each strategy covers different project parameters that impact the creation of certain artefacts. The effort analysis shows that the strategies have no significant differences in their effort and efficiency.
Conclusions
In contrast to our initial assumption that an increased effort in requirements engineering lowers the probability of change requests or project failures in general, our results show no statistically significant difference between the efficiency of the strategies. In addition, it turned out that many parameters considered as the main causes for project failures can be successfully handled. Hence, practitioners can apply the artefact patterns and related project parameters to tailor the RE process according to individual project characteristics.
5.
Walid Gaaloul Khaled Gaaloul Sami Bhiri Armin Haller Manfred Hauswirth 《Distributed and Parallel Databases》2009,25(3):193-240
A continuous evolution of business process parameters, constraints and needs, hardly foreseeable initially, requires continuous design support from business process management systems. In this article we are interested in developing a reactive design through process log analysis ensuring process re-engineering and execution reliability. We propose to analyse workflow logs to discover workflow transactional behaviour and to subsequently improve and correct related recovery mechanisms. Our approach starts by collecting workflow logs. Then, we build, by statistical analysis techniques, an intermediate representation specifying elementary dependencies between activities. These dependencies are refined to mine the transactional workflow model. The analysis of the discrepancies between the discovered model and the initially designed model enables us to detect design gaps, concerning particularly the recovery mechanisms. Thus, based on this mining step, we apply a set of rules to the initially designed workflow to improve workflow reliability.
The work presented in this paper was partially supported by the EU under the SUPER project (FP6-026850) and by the Lion project supported by Science Foundation Ireland under Grant No. SFI/02/CE1/I131.
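To make the intermediate representation of elementary dependencies concrete, the following toy sketch counts directly-follows frequencies in a workflow log. It stands in for, and is much simpler than, the statistical analysis techniques the authors use:

```python
# Toy log analysis: build an intermediate representation of elementary
# dependencies between activities as directly-follows frequencies.
from collections import Counter

def directly_follows(log):
    """Count how often activity a is immediately followed by activity b."""
    counts = Counter()
    for trace in log:
        for a, b in zip(trace, trace[1:]):
            counts[(a, b)] += 1
    return counts

log = [
    ["book", "pay", "ship"],
    ["book", "pay", "cancel"],   # a compensation path the model may miss
    ["book", "pay", "ship"],
]
for (a, b), n in sorted(directly_follows(log).items()):
    print(f"{a} -> {b}: {n}")
# Comparing these observed dependencies against the designed workflow
# reveals gaps, e.g. an unmodelled pay -> cancel recovery path.
```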
6.
During the last years a new generation of process-aware information systems has emerged, which enables process model configurations at build time as well as process instance changes during runtime. Respective model adaptations result in a large number of model variants that are derived from the same process model, but slightly differ in structure. Generally, such model variants are expensive to configure and maintain. In this paper we address two scenarios for learning from process model adaptations and for discovering a reference model out of which the variants can be configured with minimum effort. The first one is characterized by a reference process model and a collection of related process variants. The goal is to improve the original reference process model such that it fits better to the variant models. The second scenario comprises a collection of process variants, while the original reference model is unknown; i.e., the goal is to “merge” these variants into a new reference process model. We suggest two algorithms that are applicable in both scenarios, but have their pros and cons. We provide a systematic comparison of the two algorithms and further contrast them with conventional process mining techniques. Comparison results indicate good performance of our algorithms and also show that specific techniques are needed for learning from process configurations and adaptations. Finally, we provide results from a case study in the automotive industry in which we successfully applied our algorithms.
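A toy sketch of the second scenario may help fix ideas: if each variant is reduced to a set of directly-follows edges, a naive majority vote produces a candidate reference model. This vote is an assumption made purely for illustration; it is not either of the paper's two algorithms:

```python
# Toy illustration of the second scenario: "merge" process variants into a
# candidate reference model. Variants are reduced to sets of directly-follows
# edges; an edge enters the reference model if a majority of variants use it,
# reflecting the intuition that the reference model should minimise the
# configuration effort needed to derive each variant.

def merge_variants(variants):
    threshold = len(variants) / 2
    all_edges = set().union(*variants)
    return {e for e in all_edges
            if sum(e in v for v in variants) > threshold}

v1 = {("A", "B"), ("B", "C")}
v2 = {("A", "B"), ("B", "C"), ("C", "D")}
v3 = {("A", "B"), ("B", "D")}
print(sorted(merge_variants([v1, v2, v3])))  # [('A', 'B'), ('B', 'C')]
```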
7.
Lei Shi 《Computational Statistics & Data Analysis》2012,56(1):202-208
Deletion, replacement and the mean-shift model are three approaches frequently used to detect influential observations and outliers. For the general linear model with known covariance matrix, it is known that these three approaches lead to the same update formulae for the estimates of the regression coefficients. However, if the covariance matrix is indexed by some unknown parameters which also need to be estimated, the situation is unclear. In this paper, we show for a common subclass of linear mixed models that the three approaches are no longer equivalent. For maximum likelihood estimation, replacement is equivalent to the mean-shift model but neither is equivalent to case deletion. For restricted maximum likelihood estimation, the mean-shift model is equivalent to case deletion but neither is equivalent to replacement. We also demonstrate with real data that misuse of replacement and the mean-shift model in place of case deletion can lead to incorrect results.
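The classical equivalence referred to for the known-covariance case can be checked numerically in the ordinary-least-squares special case. The following numpy demo (an illustration, not the paper's mixed-model setting) confirms that a mean-shift dummy reproduces case deletion exactly:

```python
# Numerical check of the classical OLS equivalence: adding a mean-shift
# dummy for observation i yields the same coefficient estimates as
# deleting observation i. (The paper shows this equivalence breaks down
# for linear mixed models with estimated covariance parameters.)
import numpy as np

rng = np.random.default_rng(1)
n, i = 20, 5
X = np.column_stack([np.ones(n), rng.standard_normal((n, 2))])
y = X @ np.array([1.0, 2.0, -0.5]) + rng.standard_normal(n)
y[i] += 10.0  # plant an outlier

# Case deletion: drop observation i and refit.
mask = np.arange(n) != i
beta_del, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)

# Mean-shift model: keep all observations, add a dummy column for i.
d = np.zeros(n); d[i] = 1.0
beta_shift, *_ = np.linalg.lstsq(np.column_stack([X, d]), y, rcond=None)

print(np.allclose(beta_del, beta_shift[:3]))  # True: identical estimates
```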
8.
Scalable, parallel computers: Alternatives, issues, and challenges  Cited: 3 (self-citations: 0, by others: 3)
Gordon Bell 《International Journal of Parallel Programming》1994,22(1):3-46
The 1990s will be the era of scalable computers. By giving up uniform memory access, computers can be built that scale over a range of several thousand. These provide high peak announced performance (PAP) by using powerful, distributed CMOS microprocessor-primary memory pairs interconnected by a high-performance switch (network). The parameters that determine these structures and their utility include: whether hardware (a multiprocessor) or software (a multicomputer) is used to maintain a distributed, or shared virtual memory (DSM) environment; the power of computing nodes (these improve at 60% per year); the size and scalability of the switch; distributability (the ability to connect to geographically dispersed computers including workstations); and all forms of software to exploit their inherent parallelism. To a great extent, viability is determined by a computer's generality—the ability to efficiently handle a range of work that requires varying processing (from serial to fully parallel), memory, and I/O resources. A taxonomy and evolutionary time line outlines the next decade of computer evolution, including distributed workstations, based on scalability and parallelism. Workstations can be the best scalables.
9.
《Information & Management》2016,53(2):183-196
This study presents a new concept called information systems control alignment, which examines the degree to which the underlying characteristics of four main information systems (IS) control dimensions are mutually complementary. Using three case studies, our research uncovers two high-functioning control patterns – one with traditional characteristics and one with agile characteristics – that demonstrate positive alignment among the control environment, control mechanisms, socio-emotional behaviors, and execution of controls. By better understanding the circumstances that contribute to control conflicts, organizations can be increasingly mindful of cultivating a complementary relationship among the control dimensions when designing, implementing, monitoring and adjusting controls within IS processes.
10.
11.
With today’s global digital environment, the Internet is readily accessible anytime from everywhere, and so is digital image manipulation software; thus, digital data is easy to tamper with without notice. Under this circumstance, integrity verification has become an important issue in the digital world. The aim of this paper is to present an in-depth review and analysis of the methods of detecting image tampering. We introduce the notion of content-based image authentication and the features required to design an effective authentication scheme. We review major algorithms and frequently used security mechanisms found in the open literature. We also analyze and discuss the performance trade-offs and related security issues among existing technologies.
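As one concrete example of a content-based feature (chosen here for simplicity; the survey covers many schemes), a block-average hash tolerates mild processing while flagging localized tampering:

```python
# Toy content-based authentication feature: an "average hash" of a
# grayscale image, compared via Hamming distance. Unlike a cryptographic
# hash, small content-preserving changes typically give a small distance,
# while real tampering gives a larger one. Pure numpy; images are 2-D arrays.
import numpy as np

def average_hash(img, size=8):
    """Downsample to size x size by block averaging, threshold at the mean."""
    h, w = img.shape
    blocks = img[:h - h % size, :w - w % size]
    blocks = blocks.reshape(size, h // size, size, w // size).mean(axis=(1, 3))
    return (blocks > blocks.mean()).ravel()

def hamming(h1, h2):
    return int(np.sum(h1 != h2))

rng = np.random.default_rng(2)
original = rng.uniform(0, 255, (64, 64))
noisy = original + rng.normal(0, 2, original.shape)       # mild processing
tampered = original.copy(); tampered[20:40, 20:40] = 255  # spliced region

print(hamming(average_hash(original), average_hash(noisy)))     # small
print(hamming(average_hash(original), average_hash(tampered)))  # larger
```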
12.
《Expert Systems with Applications》2014,41(11):5340-5352
A business process (BP) consists of a set of activities which are performed in coordination in an organizational and technical environment and which jointly realize a business goal. In such a context, BP management (BPM) can be seen as supporting BPs using methods, techniques, and software in order to design, enact, control, and analyze operational processes involving humans, organizations, applications, and other sources of information. Since the accurate management of BPs is receiving increasing attention, conformance checking, i.e., verifying whether the observed behavior matches a modelled behavior, is becoming more and more critical. Moreover, declarative languages are more frequently used to provide increased flexibility. However, whereas there exist solid conformance checking techniques for imperative models, little work has been conducted for declarative models. Furthermore, usually only the control-flow perspective is considered, although other perspectives (e.g., data) are crucial. In addition, most approaches exclusively check conformance without providing any related diagnostics. To enhance the accurate management of flexible BPs, this work presents a constraint-based approach for conformance checking over declarative BP models (including both control-flow and data perspectives). In addition, two constraint-based proposals for providing related diagnosis are detailed. To demonstrate both the effectiveness and the efficiency of the proposed approaches, different performance measures have been analyzed over a widely diversified set of test models of varying complexity.
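To illustrate what checking a single declarative constraint involves, the sketch below tests the Declare-style response(a, b) constraint over logged traces. It covers only one control-flow constraint, whereas the paper's constraint-based approach handles full models, the data perspective, and diagnostics:

```python
# Toy conformance check for one declarative (Declare-style) constraint:
# response(a, b) -- every occurrence of a must eventually be followed by b.

def satisfies_response(trace, a, b):
    """True if every occurrence of 'a' in the trace has some later 'b'."""
    pending = False           # is there an 'a' still awaiting its 'b'?
    for event in trace:
        if event == a:
            pending = True
        elif event == b:
            pending = False   # this 'b' answers all earlier pending 'a's
    return not pending

log = [
    ["receive", "check", "notify"],   # response(check, notify) holds
    ["receive", "check", "archive"],  # violated: no notify after check
]
for trace in log:
    print(trace, satisfies_response(trace, "check", "notify"))
```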
13.
14.
Data mining techniques and their application in process monitoring  Cited: 2 (self-citations: 0, by others: 2)
As an emerging technology, data mining has been applied successfully in many fields. This paper surveys data mining from three angles: its definition and characteristics, the steps of the data mining process, and the classification of data mining techniques. It then discusses the application of data mining techniques in process monitoring.
15.
As graduates of a newly established university major, Internet of Things (IoT) students will enjoy favorable conditions such as national policy support and the rapid growth of IoT output value and talent demand, but they also face challenges: the faculty and talent-cultivation models of a brand-new major, the construction of IoT hardware facilities, students' growing sense of autonomy, and enterprises' rising requirements for talent. This paper analyzes the development direction and talent-cultivation goals of university IoT majors, innovation in school-enterprise cooperative training models, the integration of university resources, and the improvement of employment guidance services, in order to ensure the healthy development of employment for IoT students.
16.
Pervasive location acquisition technologies: Opportunities and challenges for geospatial studies  Cited: 5 (self-citations: 0, by others: 5)
The rapid development and increasing availability of various location acquisition technologies provide geospatial studies with both opportunities and challenges. These opportunities and challenges are discussed in this paper focusing on the following three aspects: the massive acquisition of location data and data quality, the analysis of massive location data and pattern discovery, and privacy protection for massive location data. This paper examines the current status of and the potential opportunities for geospatial research in these three areas and notes the major challenges. Finally, the development of this special issue is described, and the four articles included in this special issue are presented.
17.
Tatiana von Landesberger Sebastian Bremm Matthias Kirschner Stefan Wesarg Arjan Kuijper 《Expert Systems with Applications》2013,40(12):4934-4943
Segmentation of medical images is a prerequisite in clinical practice. Many segmentation algorithms use statistical shape models. Due to the lack of tools providing prior information on the data, standard models are frequently used. However, they do not necessarily describe the data in an optimal way. Model-based segmentation can be supported by Visual Analytics tools, which give the user a deeper insight into the correspondence between data and model result. By combining both approaches, better models for segmentation of organs in medical images are created. In this work, we identify the main tasks and problems in model-based image segmentation. As a proof of concept, we show that even small visual-interactive extensions can be very beneficial. Based on these results, we present research challenges for Visual Analytics in this area.
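A minimal sketch of the statistical shape model underlying such segmentation methods: PCA over aligned landmark vectors yields a mean shape plus eigenmodes, from which plausible shapes can be generated. The synthetic training shapes below are an assumption for illustration, not medical data:

```python
# Toy statistical shape model: PCA over aligned landmark vectors.
# New shapes are generated as mean + weighted eigenmodes; model-based
# segmentation fits such a model to image evidence.
import numpy as np

rng = np.random.default_rng(3)
n_shapes, n_landmarks = 30, 10

# Synthetic training set: a base contour plus one dominant deformation mode.
base = np.linspace(0, 1, 2 * n_landmarks)          # (x, y) pairs flattened
mode = np.sin(np.linspace(0, np.pi, 2 * n_landmarks))
shapes = np.array([base + rng.normal(0, 1) * mode for _ in range(n_shapes)])

mean = shapes.mean(axis=0)
_, s, vt = np.linalg.svd(shapes - mean, full_matrices=False)
variances = s ** 2 / (n_shapes - 1)

# A plausible new shape: mean plus a small multiple of the first eigenmode.
new_shape = mean + 2.0 * np.sqrt(variances[0]) * vt[0]
print(variances[:3].round(3))  # the first mode dominates the variance
```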
18.
Haseeb Touqeer Shakir Zaman Rashid Amin Mudassar Hussain Fadi Al-Turjman Muhammad Bilal 《The Journal of Supercomputing》2021,77(12):14053-14089
The Internet of Things is a rapidly evolving technology in which interconnected computing devices and sensors share data over the network to decipher different...
19.
Nigel Collier Ai Kawazoe Lihua Jin Mika Shigematsu Dinh Dien Roberto A. Barrero Koichi Takeuchi Asanee Kawtrakul 《Language Resources and Evaluation》2006,40(3-4):405-413
A lack of surveillance system infrastructure in the Asia-Pacific region is seen as hindering the global control of rapidly spreading infectious diseases such as the recent avian H5N1 epidemic. As part of improving surveillance in the region, the BioCaster project aims to develop a system based on text mining for automatically monitoring Internet news and other online sources in several regional languages. At the heart of the system is an application ontology which serves the dual purpose of enabling advanced searches on the mined facts and of allowing the system to make intelligent inferences for assessing the priority of events. However, it became clear early on in the project that existing classification schemes did not have the necessary language coverage or semantic specificity for our needs. In this article we present an overview of our needs and explore in detail the rationale and methods for developing a new conceptual structure and multilingual terminological resource that focusses on priority pathogens and the diseases they cause. The ontology is made freely available as an online database and downloadable OWL file.
20.
MPC: Current practice and challenges  Cited: 1 (self-citations: 0, by others: 1)
Linear Model Predictive Control (MPC) continues to be the technology of choice for constrained, multivariable control applications in the process industry. Successful deployment of MPC requires “getting right” multiple aspects of the control problem. This includes the design of the underlying regulatory controls, design of the MPC(s), test design for model identification, model development, and dealing with nonlinearities. Approaches and techniques that are successfully applied in practice are described, including the challenges involved in ensuring a successful MPC application. Academic contributions are highlighted and suggestions provided for improving MPC.
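As a bare-bones illustration of the receding-horizon mechanism at the core of MPC (without the constraints, disturbance models, and identification issues the review emphasizes), consider this minimal unconstrained linear sketch; the plant, horizon, and weights are illustrative assumptions:

```python
# Minimal unconstrained linear MPC sketch: at each step, solve a
# finite-horizon least-squares problem for the input sequence, apply
# only the first input, then re-solve at the next step (receding horizon).
import numpy as np

A = np.array([[1.0, 0.1], [0.0, 1.0]])   # double integrator (illustrative)
B = np.array([[0.0], [0.1]])
N, r = 10, 0.1                            # horizon and input weight

# Prediction matrices: X = F x0 + G U stacks the N predicted states.
F = np.vstack([np.linalg.matrix_power(A, i + 1) for i in range(N)])
G = np.zeros((2 * N, N))
for i in range(N):
    for j in range(i + 1):
        G[2*i:2*i+2, j:j+1] = np.linalg.matrix_power(A, i - j) @ B

def mpc_input(x):
    """First move of the input sequence minimising ||X||^2 + r * ||U||^2."""
    M = np.vstack([G, np.sqrt(r) * np.eye(N)])
    target = np.concatenate([-F @ x, np.zeros(N)])
    U, *_ = np.linalg.lstsq(M, target, rcond=None)
    return U[0]

x = np.array([1.0, 0.0])                  # start away from the origin
for _ in range(30):
    x = A @ x + (B * mpc_input(x)).ravel()
print(np.round(x, 3))                     # state driven toward the origin
```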