Importance sampling is a technique commonly used to speed up Monte Carlo simulation of rare events. However, little is known about the design of efficient importance sampling algorithms in the context of queueing networks. The standard approach, which simulates the system under an a priori fixed change of measure suggested by large deviations analysis, has been shown to fail in even the simplest network settings. Estimating rare-event probabilities has long been a topic of great importance in queueing theory, and in applied probability at large. In this article, we analyse the performance of an importance sampling estimator for a rare event probability in a Jackson network. We apply strict deadlines to a two-node Jackson network with feedback whose arrival and service rates are modulated by an exogenous finite-state Markov process. We estimate the probability of network blocking for various parameter sets, as well as the probability of customers missing their deadlines under different loads and deadlines. Finally, we show that the probability of total population overflow is affected by the deadline values, service rates and arrival rates.
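The fixed-change-of-measure idea this abstract refers to can be illustrated on a toy rare-event problem (the tail of a sum of i.i.d. exponentials, not the paper's Jackson-network setting): samples are drawn from an exponentially tilted distribution and reweighted by the likelihood ratio, giving an unbiased estimate whose variance is far smaller than naive Monte Carlo's. All names and parameter values below are illustrative.

```python
import math
import random

def is_tail_prob(n=10, a=25.0, samples=20_000, seed=0):
    """Estimate P(S_n > a), where S_n is a sum of n i.i.d. Exp(1)
    variables, via importance sampling with an exponentially
    tilted proposal Exp(lam)."""
    rng = random.Random(seed)
    lam = n / a  # tilted rate chosen so the proposal mean of S_n equals a
    total = 0.0
    for _ in range(samples):
        s = sum(rng.expovariate(lam) for _ in range(n))
        if s > a:
            # likelihood ratio: prod_i e^{-x_i} / (lam e^{-lam x_i})
            #                 = e^{-(1-lam) s} / lam^n
            total += math.exp(-(1.0 - lam) * s) / lam ** n
    return total / samples
```

For n = 10 and a = 25 the exact probability is about 2.2 × 10⁻⁴, so naive simulation would need millions of runs to observe the event at all, whereas the tilted sampler hits the rare set on a large fraction of its draws.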
We introduce N‐PolyVector fields, a generalization of N‐RoSy fields for which the vectors are neither necessarily orthogonal nor rotationally symmetric. We formally define a novel representation for N‐PolyVectors as the root sets of complex polynomials and analyze their topological and geometric properties. A smooth N‐PolyVector field can be efficiently generated by solving a sparse linear system without integer variables. We exploit the flexibility of N‐PolyVector fields to design conjugate vector fields, offering an intuitive tool to generate planar quadrilateral meshes.
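The root-set representation can be sketched directly: N tangent vectors, written as complex numbers in a local frame, are encoded by the coefficients of the monic degree-N polynomial whose roots they are, and decoded by root finding. This shows only the representation step under assumed conventions; the paper's sparse linear solve for a smooth field is not reproduced here.

```python
import numpy as np

def polyvector_coeffs(vectors):
    """Encode N tangent vectors (given as complex numbers) as the
    coefficients of the monic polynomial whose root set they form."""
    return np.poly(np.asarray(vectors, dtype=complex))

def polyvector_roots(coeffs):
    """Recover the (unordered) vector set from the coefficients."""
    return np.roots(coeffs)
```

Because only the root *set* is stored, the representation is automatically invariant to reordering the vectors, which is what removes the integer matching variables mentioned in the abstract.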
We present a novel framework for polyhedral mesh editing with face‐based projective maps that preserves planarity by definition. Such meshes are essential in the field of architectural design and rationalization. By using homogeneous coordinates to describe vertices, we can parametrize the entire shape space of planarity‐preserving deformations with bilinear equations. The generality of this space allows polyhedral geometric processing methods to be conducted with ease. We demonstrate its usefulness in planar‐quadrilateral mesh subdivision, in a resulting multi‐resolution editing algorithm, and in novel shape‐space exploration with prescribed transformations. Furthermore, we show that our shape space is a discretization of a continuous space of conjugate‐preserving projective transformation fields on surfaces. We focus on planar‐quad meshes, which our shape space addresses directly, and we further show that our framework naturally extends to meshes with faces of more than four vertices.
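The planarity constraint underlying such meshes can be stated concretely: a quad face is planar exactly when the scalar triple product of its edge vectors vanishes. A minimal check (illustrative only; it is not the paper's homogeneous-coordinate parametrization of the deformation space):

```python
import numpy as np

def quad_is_planar(verts, tol=1e-9):
    """A quad face is planar iff the scalar triple product of the
    three edge vectors emanating from one corner vanishes."""
    v = np.asarray(verts, dtype=float)
    e1, e2, e3 = v[1] - v[0], v[2] - v[0], v[3] - v[0]
    return abs(np.dot(np.cross(e1, e2), e3)) < tol
```

A framework like the one described keeps this determinant identically zero under deformation, instead of checking it after the fact.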
This paper presents an empirical study of control logic specifications used to document industrial control logic code in manufacturing applications. More than one hundred input/output-related property specifications from ten different reusable function blocks were investigated. The main purpose of the study was to provide understanding of how specifications are expressed by industrial practitioners, in order to develop new tools and methods for specifying control logic software, as well as to evaluate existing ones. In this paper, the studied specifications are used to evaluate linear temporal logic in general and the specification language ST-LTL, tailored for function blocks, in particular. The study shows that most specifications are expressed as implications between input and output conditions that should always be fulfilled. Many of these implications are complex, since the input and output conditions may be mixed and involve sequences, timers and non-Boolean variables. Using ST-LTL, it was possible to represent all implications in this study. The few non-implication specifications could be expressed in ST-LTL as well, after being altered to suit the specification language. The paper demonstrates some advantages of ST-LTL compared to standard linear temporal logic and discusses possible improvements, such as support for automatic rewriting of complex specifications.
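The dominant specification pattern found in the study, an implication between input and output conditions that should always hold, corresponds to the LTL pattern G(p → q). A minimal finite-trace checker for this pattern (illustrative Python, not ST-LTL syntax):

```python
def always_implies(trace, antecedent, consequent):
    """Check the invariant-implication pattern G(p -> q) on a finite
    trace, where each state is a dict of variable values and p, q are
    predicates over a single state."""
    return all((not antecedent(s)) or consequent(s) for s in trace)
```

The sequencing and timer conditions mentioned in the abstract would need temporal operators over several states, which this single-state pattern deliberately does not cover.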
Regression via classification (RvC) is a method in which a regression problem is converted into a classification problem. A discretization process is used to convert the continuous target variable into classes, and the discretized data can then be used with classifiers as a classification problem. In this paper, we use a discretization method, Extreme Randomized Discretization, in which bin boundaries are created randomly, to build ensembles. We present an ensemble method for RvC problems and show theoretically, for a set of problems, that when the number of bins is three, the proposed ensembles for RvC perform better than RvC with the equal-width discretization method. We use these results to show that infinite-sized ensembles, consisting of finite-sized decision trees created by a purely randomized method (split points are chosen randomly), are not consistent. We also show theoretically, using a set of regression problems, that the performance of these ensembles depends on the size of the member decision trees.
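The RvC ensemble scheme can be sketched as follows: each member discretizes the target with random cut points, labels the training set with the resulting bins, classifies the query point, and predicts the mean target of the predicted bin; the ensemble averages the members. The sketch below substitutes a 1-nearest-neighbour rule for the decision trees used in the paper, so it illustrates only the random-discretization-and-averaging idea.

```python
import random
import statistics

def make_random_bins(y, k, rng):
    """Extreme Randomized Discretization: k-1 cut points drawn
    uniformly at random from the target's range."""
    lo, hi = min(y), max(y)
    return sorted(rng.uniform(lo, hi) for _ in range(k - 1))

def bin_index(value, cuts):
    return sum(value > c for c in cuts)

def rvc_ensemble_predict(X, y, x_query, n_members=50, k=3, seed=0):
    """RvC ensemble: each member classifies x_query into a random bin
    (here via 1-NN on scalar inputs) and outputs that bin's mean
    target; the ensemble averages the member predictions."""
    rng = random.Random(seed)
    preds = []
    for _ in range(n_members):
        cuts = make_random_bins(y, k, rng)
        labels = [bin_index(t, cuts) for t in y]
        # 1-nearest neighbour stands in for an arbitrary classifier
        nn = min(range(len(X)), key=lambda i: abs(X[i] - x_query))
        members = [t for t, lab in zip(y, labels) if lab == labels[nn]]
        preds.append(statistics.mean(members))
    return statistics.mean(preds)
```

Averaging over many random discretizations smooths out the coarse, piecewise-constant prediction that any single discretization produces.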
Statistical selectivity models were developed for four different Fischer–Tropsch synthesis product ranges, namely methane (CH4), light olefins (C2=–C4=), light paraffins (C2–C4) and long-chain hydrocarbons (C5+), based on experimental data obtained over thirteen γ-Al2O3-supported cobalt-based catalysts with different cobalt particle and pore sizes. The input variables are the cobalt metal particle size and the catalyst pore size. Cubic and quadratic polynomial equations were fitted to the experimental data; the resulting models were then subjected to model reduction to improve model adequacy, which was assessed through ANOVA. Multi-objective optimization revealed that the maximum C5+ selectivity (84.150%) could be achieved at a cobalt particle size of 14.764 nm and a pore size of 23.129 nm, while keeping the selectivity to the other hydrocarbon products at a minimum.
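The quadratic response-surface modelling described above can be sketched with an ordinary least-squares fit in the two input variables. The data below are synthetic, not the thirteen-catalyst dataset, and the model-reduction and ANOVA steps are omitted.

```python
import numpy as np

def fit_quadratic_surface(d_particle, d_pore, selectivity):
    """Least-squares fit of a full quadratic response surface
    s ~ b0 + b1*x + b2*y + b3*x^2 + b4*y^2 + b5*x*y,
    where x is particle size and y is pore size."""
    x = np.asarray(d_particle, dtype=float)
    y = np.asarray(d_pore, dtype=float)
    A = np.column_stack([np.ones_like(x), x, y, x**2, y**2, x * y])
    coef, *_ = np.linalg.lstsq(A, np.asarray(selectivity), rcond=None)
    return coef

def predict(coef, x, y):
    """Evaluate the fitted surface at a single (x, y) point."""
    return coef @ np.array([1.0, x, y, x**2, y**2, x * y])
```

Once fitted, the surface can be maximized over the feasible (particle size, pore size) region, which is the role the multi-objective optimization plays in the study.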
The interaction of cholesterol with ceramides containing α-hydroxy fatty acyl chains (hydroxyceramides) has been studied as a foundation for characterizing the lipid bilayers of the stratum corneum. A relatively large quantity of cerebrosides was obtained from bovine brain and converted to ceramides through removal of the carbohydrate side chain. The ceramides were separated based on the absence or presence of hydroxy fatty acyl chains. The lyophilized hydroxyceramides showed a broad melting region at 92°C. Hydroxyceramides dispersed in water produced a relatively narrow thermotropic transition at 75°C. The effect of cholesterol on this thermotropic phase transition of hydroxyceramides was determined by differential scanning calorimetry. With respect to the main transition, cholesterol at relatively low levels caused a broadening of the phase transition as well as a decrease in the peak transition temperature. The presence of cholesterol at levels in excess of 7 wt% gave rise to an additional low-temperature transition at 55°C. Upon immediate rescanning, this transition was exothermic, but with increasing incubation time the area under the excess heat capacity curve as a function of temperature became smaller; after two days or more, the transition observed was endothermic. At cholesterol levels between 40 and 50 wt%, multiple peaks were observed. From comparisons with related systems, the cooperative thermal transitions of hydroxyceramides with cholesterol are suggested to result from changes in hydrogen bonding or from phase separation. The composition of the isolated brain ceramides is compared with that reported for the stratum corneum.
Multimedia Tools and Applications - The High Efficiency Video Coding (HEVC) standard efficiently reduces the size of multimedia contents, but at the cost of high computational complexity. In order to make...