Robotic process automation is a disruptive technology for rapidly automating tasks and subprocesses that are already digital but still performed manually, as well as entire business processes. In contrast to other process automation technologies, robotic process automation is lightweight and accesses only the presentation layer of IT systems to mimic human behavior. Due to the novelty of robotic process automation and the varying approaches taken when implementing the technology, there are reports that up to 50% of robotic process automation projects fail. To tackle this issue, we use a design science research approach to develop a framework for the implementation of robotic process automation projects. We analyzed 35 reports on real-life projects to derive a preliminary sequential model. Then, we performed multiple expert interviews and workshops to validate and refine our model. The result is a framework with variable stages that offers guidelines with enough flexibility to be applicable in complex and heterogeneous corporate environments as well as in small and medium-sized companies. It is structured by the three phases of initialization, implementation, and scaling. These phases comprise eleven stages that are relevant both during a single project and as a continuous cycle spanning individual projects. Together they structure how to manage knowledge and support processes for the execution of robotic process automation implementation projects.
Worst-case execution time (WCET) analysis is concerned with computing a bound, as precise as possible, for the maximum time the execution of a program can take. This information is indispensable for developing safety-critical real-time systems, e.g., in the avionics and automotive fields. Starting with the initial works of Chen, Mok, Puschner, Shaw, and others in the mid and late 1980s, WCET analysis turned into a well-established and vibrant field of research and development in academia and industry. The increasing number and diversity of hardware and software platforms and the ongoing rapid technological advancement became drivers for the development of a wide array of distinct methods and tools for WCET analysis. The precision, generality, and efficiency of these methods and tools depend greatly on the expressiveness and usability of the annotation languages that are used to describe feasible and infeasible program paths. In this article we survey the annotation languages which we consider formative for the field. By investigating and comparing their individual strengths and limitations with respect to a set of pivotal criteria, we provide a coherent overview of the state of the art. Identifying open issues, we encourage further research. In this way, our approach is orthogonal and complementary to a recent approach of Wilhelm et al., who provide a thorough survey of WCET analysis methods and tools that have been developed and used in academia and industry.
In recent years, many usability evaluation methods (UEMs) have been employed to evaluate Web applications. However, many of these applications still do not meet most customers' usability expectations, and many companies have folded as a result of not considering Web usability issues. No studies currently exist regarding either the use of usability evaluation methods for the Web or the benefits they bring.
Objective
The objective of this paper is to summarize the current knowledge that is available as regards the usability evaluation methods (UEMs) that have been employed to evaluate Web applications over the last 14 years.
Method
A systematic mapping study was performed to assess the UEMs that have been used by researchers to evaluate Web applications and their relation to the Web development process. Systematic mapping studies are useful for categorizing and summarizing the existing information concerning a research question in an unbiased manner.
Results
The results show that around 39% of the papers reviewed reported the use of evaluation methods that had been specifically crafted for the Web. The results also show that the type of method most widely used was that of User Testing. The results identify several research gaps, such as the fact that around 90% of the studies applied evaluations during the implementation phase of the Web application development, which is the most costly phase in which to perform changes. A list of the UEMs that were found is also provided in order to guide novice usability practitioners.
Conclusions
From an initial set of 2703 papers, a total of 206 research papers were selected for the mapping study. The results obtained allowed us to reach conclusions concerning the state of the art of UEMs for evaluating Web applications. This allowed us to identify several research gaps, which subsequently provided us with a framework in which new research activities can be more appropriately positioned, and from which useful information for novice usability practitioners can be extracted.
Chvátal-Gomory cuts are among the most well-known classes of cutting planes for general integer linear programs (ILPs). In case the constraint multipliers are either 0 or 1/2, such cuts are known as {0, 1/2}-cuts. It has been proven by Caprara and Fischetti (Math. Program. 74:221–235, 1996) that separation of {0, 1/2}-cuts is NP-hard.
In this paper, we study ways to separate {0, 1/2}-cuts effectively in practice. We propose a range of preprocessing rules to reduce the size of the separation problem. The core of the preprocessing builds on a Gaussian-elimination-like procedure. To separate the most violated {0, 1/2}-cut, we formulate the (reduced) problem as an integer linear program. Some simple heuristic separation routines complete the algorithmic framework.
Computational experiments on benchmark instances show that combining preprocessing with exact and/or heuristic separation is a very effective way to generate strong generic cutting planes for integer linear programs and to reduce the overall computation times of state-of-the-art ILP solvers.
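The basic construction behind a {0, 1/2}-cut can be illustrated in a few lines. The sketch below is didactic only (it is not the paper's separation algorithm): it takes a fixed multiplier vector with entries in {0, 1/2}, forms the weighted combination of rows, and rounds both sides down, which remains valid for every nonnegative integer solution of Ax <= b.

```python
import math
from fractions import Fraction

def zero_half_cut(A, b, multipliers):
    """Chvatal-Gomory cut with multipliers in {0, 1/2}.

    For a pure ILP with constraints A x <= b and x >= 0 integer,
    rounding down the weighted row combination keeps the inequality
    valid for every integer feasible point.
    """
    m, n = len(A), len(A[0])
    coef = [sum(Fraction(multipliers[i]) * A[i][j] for i in range(m))
            for j in range(n)]
    rhs = sum(Fraction(multipliers[i]) * b[i] for i in range(m))
    return [math.floor(c) for c in coef], math.floor(rhs)

# classic odd-cycle example from the stable-set polytope:
# x1+x2 <= 1, x2+x3 <= 1, x1+x3 <= 1, all multipliers 1/2
A = [[1, 1, 0], [0, 1, 1], [1, 0, 1]]
b = [1, 1, 1]
cut = zero_half_cut(A, b, [Fraction(1, 2)] * 3)
# yields x1 + x2 + x3 <= 1, which cuts off the fractional
# point (1/2, 1/2, 1/2) that satisfies all three original rows
```

The hard part addressed by the paper is, of course, finding violated multiplier vectors rather than verifying a given one.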
Mainstream business process modelling techniques often promote a design paradigm wherein the activities that may be performed within a case, together with their usual execution order, form the backbone on top of which other aspects are anchored. This Fordist paradigm, while effective in standardised and production-oriented domains, breaks down when confronted with processes in which case-by-case variations and exceptions are the norm. We contend that the effective design of flexible processes calls for a substantially different modelling paradigm. Motivated by requirements from the human services domain, we explore the hypothesis that a framework consisting of a small set of coordination concepts, combined with established object-oriented modelling principles, provides a suitable foundation for designing highly flexible processes. Several human service delivery processes have been designed using this framework, and the resulting models have been used to realise a system to support these processes in a pilot environment.
This paper constructs multirate linear multistep time discretizations based on Adams-Bashforth methods. These methods are aimed at solving conservation laws and allow different timesteps to be used in different parts of the spatial domain. The proposed family of discretizations is second-order accurate in time and has conservation and linear and nonlinear stability properties under local CFL conditions. Multirate timestepping avoids the necessity to take small global timesteps (restricted by the largest value of the Courant number on the grid) and therefore results in more efficient computations. Numerical results obtained for the advection and Burgers' equations confirm the theoretical findings.
This work was supported by the National Science Foundation through award NSF CCF-0515170.
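For reference, the single-rate second-order Adams-Bashforth (AB2) scheme underlying the multirate construction can be sketched as follows. This is a minimal illustration of the base method only; the multirate variant of the paper additionally subdivides steps in regions where the local Courant number demands it. The Euler bootstrap for the first step is one common simple choice, not necessarily the paper's.

```python
import math

def ab2(f, y0, t0, t1, n):
    """Second-order Adams-Bashforth for y' = f(t, y) with n steps.

    Update rule: y_{k+1} = y_k + h * (3/2 f_k - 1/2 f_{k-1});
    the very first step is bootstrapped with forward Euler.
    """
    h = (t1 - t0) / n
    f_prev = f(t0, y0)
    y = y0 + h * f_prev          # Euler bootstrap step
    t = t0 + h
    for _ in range(n - 1):
        f_cur = f(t, y)
        y = y + h * (1.5 * f_cur - 0.5 * f_prev)
        f_prev = f_cur
        t += h
    return y

# decay test problem y' = -y, y(0) = 1; exact solution is exp(-t),
# so the value at t = 1 should be close to exp(-1)
approx = ab2(lambda t, y: -y, 1.0, 0.0, 1.0, 200)
```

Halving h should reduce the error by roughly a factor of four, consistent with the second-order accuracy claimed for the multirate family.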
Aggregate scattering operators (ASOs) describe the overall scattering behavior of an asset (i.e., an object or volume, or collection thereof) accounting for all orders of its internal scattering. We propose a practical way to precompute and compactly store ASOs and demonstrate their ability to accelerate path tracing. Our approach is modular, avoiding costly and inflexible scene-dependent precomputation. This is achieved by decoupling light transport within and outside of each asset, and precomputing on a per-asset level. We store the internal transport in a reduced-dimensional subspace tailored to the structure of the asset geometry, its scattering behavior, and typical illumination conditions, allowing the ASOs to maintain good accuracy with modest memory requirements. The precomputed ASO can be reused across all instances of the asset and across multiple scenes. We augment ASOs with functionality enabling multi-bounce importance sampling, fast short-circuiting of complex light paths, and compact caching, while retaining rapid progressive preview rendering. We demonstrate the benefits of our ASOs by efficiently path tracing scenes containing many instances of objects with complex inter-reflections or multiple scattering.
In this work, a method for fast design optimization of broadband antennas is considered. The approach is based on a feature-based optimization (FBO) concept in which the reflection characteristics of the structure at hand are formulated in terms of suitably defined feature points. Redefining the design problem reduces the design optimization cost, because the dependence of the feature point coordinates on antenna dimensions is less nonlinear than that of the original frequency characteristics (here, S-parameters). This results in faster convergence of the optimization algorithm. The cost of the design process is further reduced using variable-fidelity electromagnetic (EM) simulation models. In the case of UWB antennas, the feature points are defined, among others, as the levels of the reflection characteristic at its local in-band maxima, as well as the location of the frequency point which corresponds to acceptable reflection around the lower corner frequency of the UWB band. Also, the number of characteristic points depends on the antenna topology and its dimensions. The performance of FBO-based design optimization is demonstrated using two examples of planar UWB antennas. Moreover, the computational cost of the approach is compared with conventional optimization driven by a pattern search algorithm. Experimental validation of the numerical results is also provided.
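One class of feature points mentioned above, the levels of the reflection characteristic at its local in-band maxima, can be extracted from a sampled S11 curve with a simple scan. The sketch below is a plausible illustration only; the paper's exact feature-point definitions and handling of the lower corner frequency differ.

```python
def feature_points(freqs, s11_db):
    """Return (frequency, level) pairs at local maxima of |S11| in dB.

    A sample is a local maximum if it is strictly above its left
    neighbor and at least as high as its right neighbor (one simple
    tie-breaking convention; endpoints are ignored).
    """
    pts = []
    for i in range(1, len(s11_db) - 1):
        if s11_db[i] > s11_db[i - 1] and s11_db[i] >= s11_db[i + 1]:
            pts.append((freqs[i], s11_db[i]))
    return pts

# synthetic 5-sample reflection curve (frequencies in GHz, levels in dB)
pts = feature_points([1, 2, 3, 4, 5], [-20, -15, -18, -12, -25])
```

Tracking how these few coordinates move as antenna dimensions change is what makes the FBO response surface nearly linear compared to the full frequency sweep.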
The null controllable set of a system is the largest set of states that can be controlled to the origin. Control systems that have a region of attraction equal to the null controllable set are said to be maximally controllable closed-loop systems. In the case of open-loop unstable plants with amplitude-constrained control, it is well known that the null controllable set does not cover the entire state space. Further, the combination of input constraints and unstable system dynamics results in a set of state constraints which we call implicit constraints. It is shown that the simple inclusion of implicit constraints in a controller formulation results in a controller that achieves maximal controllability for a class of open-loop unstable systems.
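The limitation described above is easy to see on the scalar unstable plant xdot = a*x + u with |u| <= 1 and a > 0, whose null controllable set is the interval |x| < 1/a: outside it, even full control effort cannot overcome the unstable drift. The simulation below uses a bang-bang law u = -sign(x) purely as an illustration; it is not the controller formulation of the paper.

```python
def simulate(a, x0, dt=1e-3, steps=5000):
    """Forward-Euler simulation of xdot = a*x + u, |u| <= 1,
    under the saturated bang-bang control u = -sign(x)."""
    x = x0
    for _ in range(steps):
        u = -1.0 if x > 0 else 1.0   # full effort toward the origin
        x += dt * (a * x + u)
    return x

# a = 1, so the null controllable set is |x| < 1
inside = simulate(1.0, 0.9)    # starts inside: driven to the origin
outside = simulate(1.0, 1.1)   # starts outside: diverges despite full effort
```

The boundary |x| = 1/a is exactly the kind of implicit state constraint the abstract refers to: it is induced jointly by the input bound and the unstable dynamics, not stated explicitly in the problem data.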
The discovery of meaningful parts of a shape is required for many geometry processing applications, such as parameterization, shape correspondence, and animation. It is natural to consider primitives such as spheres, cylinders, and cones as the building blocks of shapes, and thus to discover parts by fitting such primitives to a given surface. This approach, however, will break down if primitive parts have undergone almost-isometric deformations, as is the case, for example, for articulated human models. We suggest that parts can be discovered instead by finding intrinsic primitives, which we define as parts that possess an approximate intrinsic symmetry. We employ the recently developed method of computing discrete approximate Killing vector fields (AKVFs) to discover intrinsic primitives by investigating the relationship between the AKVFs of a composite object and the AKVFs of its parts. We show how to leverage this relationship with a standard clustering method to extract k intrinsic primitives and the remaining asymmetric parts of a shape for a given k. We demonstrate the value of this approach for identifying the prominent symmetry generators of the parts of a given shape. Additionally, we show how our method can be modified slightly to segment an entire surface without marking asymmetric connecting regions, and we compare this approach to state-of-the-art methods using the Princeton Segmentation Benchmark.