Similar Documents
19 similar documents found (search time: 15 ms)
1.
Mobile Ambients (MA) have acquired a fundamental role in modelling mobility in systems with mobile code and mobile devices, and in computation over administrative domains. We present a stochastic version of Mobile Ambients, called Stochastic Mobile Ambients (SMA), in which we extend MA with time and probabilities. Inspired by the earlier models PEPA and Sπ, we enhance capability prefixes with a rate and each ambient with a linear function that operates on the rates of processes executing inside it. The linear functions associated with ambients represent the delays imposed by particular administrative domains. We derive performance measures from the labelled transition semantics, as in standard models. We also define a strong Markov bisimulation in the style of the reduction-semantics equivalence known as barbed bisimulation. We argue that performance measures are of vital importance in designing any kind of distributed system, and that SMA can be useful in the design of complicated mobile systems.
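Rate-annotated prefixes such as SMA's induce an exponential race among enabled capabilities, as in other Markovian process calculi. Below is a minimal, illustrative sketch (not code from the paper) of such a race and of an ambient's linear rate function; the capability names, rates, and seed are assumptions:

```python
import random

def next_action(capabilities, rng=random.Random(0)):
    """Select the next capability to fire in a race of exponential delays.

    `capabilities` is a list of (name, rate) pairs; the winner fires with
    probability rate / total_rate, and the elapsed time is drawn from
    Exp(total_rate), as in standard CTMC simulation.
    """
    total = sum(rate for _, rate in capabilities)
    delay = rng.expovariate(total)          # holding time of the race
    u, acc = rng.random() * total, 0.0      # pick winner proportionally to rate
    for name, rate in capabilities:
        acc += rate
        if u <= acc:
            return name, delay
    return capabilities[-1][0], delay

def apply_ambient_delay(capabilities, scale, offset=0.0):
    """An ambient's linear function rescales the rates of enclosed processes."""
    return [(name, scale * rate + offset) for name, rate in capabilities]
```

A usage example: `apply_ambient_delay([("in n", 2.0)], 0.5)` halves the rate of the `in n` capability, modelling a slow administrative domain.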

2.
3.
In this paper we study Make-To-Stock manufacturing systems and seek on-line algorithms for determining optimal or near-optimal buffer capacities (hedging points) that balance inventory against stockout costs. Using a Stochastic Fluid Model (SFM), we derive sample derivatives (sensitivities) which, under very weak structural assumptions on the defining demand and service processes, are shown to be unbiased estimators of the sensitivities of a cost function with respect to these capacities. We show that these estimators simplify greatly when evaluated along the sample path of discrete-part systems, so they can easily be implemented and evaluated on line. Though the implementation on discrete-part systems does not necessarily preserve the unbiasedness property, simulation results show that stochastic approximation algorithms using such estimates do converge to optimal or near-optimal hedging points. Supported in part by the National Science Foundation under Grants EEC-0088073 and DMI-0330171, by AFOSR under contract F49620-01-0056, and by ARO under grant DAAD19-01-0610.
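The stochastic-approximation loop that drives such sample-derivative estimates toward an optimal hedging point can be sketched with a newsvendor-style toy cost in place of the SFM-based estimator; the cost function, demand distribution, and gain sequence below are assumptions for illustration only:

```python
import random

def sa_hedging_point(grad_estimate, theta0, steps, a):
    """Robbins-Monro recursion: theta <- max(0, theta - (a/k) * sampled gradient)."""
    theta = theta0
    for k in range(1, steps + 1):
        theta = max(0.0, theta - (a / k) * grad_estimate(theta))
    return theta

# Toy cost J(theta) = h*theta + b*E[(D - theta)^+], with D ~ Uniform(0, 100);
# an unbiased sample derivative is h - b*1{D > theta}, and the minimiser is
# the critical fractile theta* = 100 * b / (h + b) = 90.
def make_toy_gradient(h=1.0, b=9.0, seed=1):
    rng = random.Random(seed)
    def grad(theta):
        d = rng.uniform(0.0, 100.0)
        return h - b * (1.0 if d > theta else 0.0)
    return grad

theta_hat = sa_hedging_point(make_toy_gradient(), theta0=50.0, steps=5000, a=20.0)
```

With decaying gains a/k, the iterates settle near the critical fractile despite the noisy one-sample gradients.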

4.
A number of important problems in production and inventory control involve optimization of multiple threshold levels or hedging points. We address the problem of finding such levels in a stochastic system whose dynamics can be modelled using generalized semi-Markov processes (GSMP). The GSMP framework enables us to compute several performance measures and their sensitivities from a single simulation run for a general system with several states and fairly general state transitions. We then use a simulation-based optimization method, sample-path optimization, for finding optimal hedging points. We report numerical results for systems with more than twenty hedging points and service-level type probabilistic constraints. In these numerical studies, our method performed quite well on problems which are considered very difficult by current standards. Some applications falling into this framework include designing manufacturing flow controllers, using capacity options and subcontracting strategies, and coordinating production and marketing activities under demand uncertainty.

5.
This paper presents a stochastic model for the normalized least-mean-square (NLMS) algorithm operating in a nonstationary environment with complex-valued Gaussian input data. To derive this model, several approximations commonly used in the modeling of algorithms with normalized step size are avoided, thus giving rise to very accurate model expressions describing the algorithm behavior in both transient and steady-state phases. Such accuracy comes mainly from the strategy used for computing the normalized autocorrelation-like matrices arising from the model development, for which analytical solutions are also derived here. In addition, based on the proposed model expressions, the impact of the algorithm parameters on its performance is discussed, clarifying the tracking properties of the NLMS algorithm in a nonstationary environment. Through simulation results, the effectiveness of the proposed model is assessed for different operating scenarios.
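For readers unfamiliar with the recursion being modelled, here is a real-valued sketch of the NLMS update (the paper treats complex Gaussian inputs; the 2-tap plant and noiseless data below are assumptions for illustration):

```python
import random

def nlms_identify(x, d, order, mu=0.5, eps=1e-8):
    """NLMS adaptive filter: w <- w + mu * e * u / (eps + ||u||^2),
    where u is the current regressor and e = d - w.u is the a priori error."""
    w = [0.0] * order
    for n in range(order - 1, len(x)):
        u = [x[n - k] for k in range(order)]   # [x[n], x[n-1], ...]
        y = sum(wi * ui for wi, ui in zip(w, u))
        e = d[n] - y
        norm = eps + sum(ui * ui for ui in u)  # normalization of the step size
        w = [wi + mu * e * ui / norm for wi, ui in zip(w, u)]
    return w

# Tiny demo: identify an assumed 2-tap plant h = [0.5, -0.3] from its output.
rng = random.Random(0)
x = [rng.gauss(0.0, 1.0) for _ in range(2000)]
d = [0.0] + [0.5 * x[n] - 0.3 * x[n - 1] for n in range(1, len(x))]
w_hat = nlms_identify(x, d, order=2)
```

The normalization by `||u||^2` is what makes the step size data-dependent, and it is exactly the term whose statistics complicate the analytical model.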

6.
This paper studies system transformation using generalized orthonormal basis functions, which include the Laguerre basis as a special case. The transformation of deterministic systems, known as the Hambo transform, has been studied in the literature; the aim of this paper is to develop a transformation theory for stochastic systems. The paper establishes the equivalence of continuous-time and transformed discrete-time stochastic systems in terms of their solutions. The method is applied to the continuous-time system identification problem. It is shown that, using the transformed signals, the PO-MOESP subspace identification algorithm yields consistent estimates of the system matrices. An example is included to illustrate the efficacy of the proposed identification method and to compare it with the method using the Laguerre filter.

7.
Software Defined Networking (SDN) is a new network design paradigm that aims at simplifying the implementation of complex networking infrastructures by separating the forwarding functionalities (data plane) from the network logical control (control plane). Network devices are used only for forwarding, while decisions about where data is sent are taken by a logically centralized yet physically distributed component, i.e., the SDN controller. From a quality of service (QoS) point of view, an SDN controller is a complex system whose operation can depend heavily on a variety of parameters, e.g., its degree of distribution, the corresponding topology, the number of network devices to control, and so on. Dependability aspects are particularly critical in this context. In this work, we present a new analytical modeling technique for representing an SDN controller whose components are organized in a hierarchical topology, focusing on reliability and availability aspects and overcoming issues and limitations of Markovian models. In particular, our approach captures changes in the operating conditions (e.g., in the number of managed devices) while still representing the underlying phenomena through generally distributed events. The dependability of a use case on a two-layer hierarchical SDN control plane is investigated with the proposed technique, and numerical results are provided to demonstrate the feasibility of the approach.

8.
Stochastic robustness metric and its use for static resource allocations (cited by 2)
This research investigates the problem of robust static resource allocation for distributed computing systems operating under imposed Quality of Service (QoS) constraints. Often, such systems are expected to function in a physical environment replete with uncertainty, which causes the amount of processing required to fluctuate substantially over time. Determining a resource allocation that accounts for this uncertainty in a way that can provide a probabilistic guarantee that a given level of QoS is achieved is an important research problem. The stochastic robustness metric proposed in this research is based on a mathematical model in which the relationship between uncertainty in system parameters and its impact on system performance is described stochastically. The utility of the established metric is then exploited in the design of optimization techniques based on greedy and iterative approaches that address the problem of resource allocation in a large class of distributed systems operating on periodically updated data sets. Performance results are presented for a simulated environment that replicates a heterogeneous cluster-based radar data processing center. A mathematical performance lower bound is presented for comparison analysis of the heuristic results. The lower bound is derived from a relaxation of the Integer Linear Programming formulation of a given resource allocation problem.
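At its core, such a metric is a probabilistic QoS guarantee P(performance ≤ bound), which can be estimated by Monte Carlo sampling. A minimal sketch, with an assumed toy two-task machine standing in for the radar workload:

```python
import random

def stochastic_robustness(completion_time_sampler, beta, n_samples=20000, seed=0):
    """Monte Carlo estimate of a stochastic robustness value:
    the probability that the sampled completion time meets the QoS bound beta."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_samples)
               if completion_time_sampler(rng) <= beta)
    return hits / n_samples

# Assumed toy machine: completion time is the sum of two uncertain task times.
def two_task_machine(rng):
    return rng.uniform(1.0, 3.0) + rng.uniform(2.0, 4.0)

p_meet = stochastic_robustness(two_task_machine, beta=6.0)
```

For this toy case the probability can also be computed analytically (0.875), so the estimate doubles as a sanity check; a resource allocator would compare such values across candidate machine assignments.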

9.
10.
Efficient randomized algorithms are developed for solving robust feasibility problems with multiple parameter-dependent convex constraints. Two complementary strategies are presented, both of which exploit the multiplicity to achieve fast convergence. One is the stochastic ellipsoid method with multiple updates. In each iteration of this algorithm, an ellipsoid which describes a candidate of the solution set is updated many times via the multiple constraints with one random sample, while at most one update is allowed in the original method. The other is the stochastic ellipsoid method with multiple cuts. Here, a new update rule is presented to construct a smaller ellipsoid directly via multiple subgradients given by the constraints. A quantitative analysis of the volume of the ellipsoid is also provided, which guarantees the advantage of the proposed algorithm over the original one. The above features lead to a reduction of the total number of random samples necessary for convergence, which is extensively demonstrated through numerical examples.
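For reference, here is the classical single central-cut ellipsoid update that the multi-update and multi-cut variants build on; the 2-D feasibility toy below (find x with x1 ≥ 1 and x2 ≥ 1) is an assumption for illustration, not an example from the paper:

```python
import math

def mat_vec(P, v):
    return [sum(P[i][j] * v[j] for j in range(len(v))) for i in range(len(P))]

def ellipsoid_cut(x, P, g):
    """One central-cut update of the ellipsoid {z : (z-x)^T P^-1 (z-x) <= 1}
    using a subgradient g of a violated constraint:
      b   = P g / sqrt(g^T P g)
      x+  = x - b / (n+1)
      P+  = n^2/(n^2-1) * (P - 2/(n+1) * b b^T)."""
    n = len(x)
    Pg = mat_vec(P, g)
    denom = math.sqrt(sum(gi * pgi for gi, pgi in zip(g, Pg)))
    b = [pgi / denom for pgi in Pg]
    x_new = [xi - bi / (n + 1) for xi, bi in zip(x, b)]
    scale = n * n / (n * n - 1.0)
    P_new = [[scale * (P[i][j] - 2.0 / (n + 1) * b[i] * b[j]) for j in range(n)]
             for i in range(n)]
    return x_new, P_new

# Toy feasibility run: shrink a large ball until the center satisfies x >= (1, 1).
x, P = [0.0, 0.0], [[100.0, 0.0], [0.0, 100.0]]
for _ in range(200):
    if x[0] < 1.0:
        g = [-1.0, 0.0]          # cut away the half-plane x1 < 1
    elif x[1] < 1.0:
        g = [0.0, -1.0]          # cut away the half-plane x2 < 1
    else:
        break                    # feasible point found
    x, P = ellipsoid_cut(x, P, g)
```

The paper's multi-cut rule replaces the single subgradient here with several at once, shrinking the volume faster per random sample.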

11.
A Gaussian operator basis provides a means to formulate phase-space simulations of the real- and imaginary-time evolution of quantum systems. Such simulations are guaranteed to be exact while the underlying distribution remains well-bounded, which defines a useful simulation time. We analyse the application of the Gaussian phase-space representation to the dynamics of the dissociation of an ultra-cold molecular gas. We show how the choice of mapping to stochastic differential equations can be used to tailor the stochastic behaviour, and thus the useful simulation time. In the phase-space approach, it is only averages of stochastic trajectories that have a direct physical meaning. Whether particular constants of the motion are satisfied by individual trajectories depends on the choice of mapping, as we show in examples.

12.
Automated error analysis for the agilization of feature modeling (cited by 1)
P. D. A. A. M. Journal of Systems and Software, 2008, 81(6): 883-896
Software Product Lines (SPL) and agile methods share the common goal of rapidly developing high-quality software. Although they follow different approaches to achieve it, some synergies can be found between them by (i) applying agile techniques to SPL activities so that SPL development becomes more agile; and (ii) tailoring agile methodologies to support the development of SPL. Both options require intensive use of feature models, which are usually strongly affected by changes in requirements. Changing large-scale feature models as a consequence of changed requirements is a well-known error-prone activity. Since one of the objectives of agile methods is a rapid response to changes in requirements, automated error-analysis support is essential to make SPL development more agile and to produce error-free feature models.

As a contribution to finding the intended synergies, this article lays the basis for automated support for feature model error analysis by means of a framework organized in three levels: a feature model level, where the problem of error treatment is described; a diagnosis level, where an abstract solution relying on Reiter's theory of diagnosis is proposed; and an implementation level, where the abstract solution is implemented using Constraint Satisfaction Problems (CSP).

To show an application of our proposal, a real case study is presented in which the Feature-Driven Development (FDD) methodology is adapted to develop an SPL. Current proposals on error analysis are also studied, and a comparison between them and our proposal is provided. Lastly, support for new kinds of errors and different implementation levels for the proposed framework are proposed as the focus of our future work.
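One classic feature-model error that such analysis detects is a dead feature: a feature that appears in no valid product. A brute-force enumeration can stand in for the CSP machinery of the implementation level; the toy model and lambda-based constraint encoding below are illustrative assumptions, and a real SPL tool would hand the constraints to a CSP solver instead:

```python
from itertools import product

def valid_products(features, constraints):
    """Enumerate all feature selections (as dicts) satisfying every constraint."""
    for bits in product([False, True], repeat=len(features)):
        sel = dict(zip(features, bits))
        if all(c(sel) for c in constraints):
            yield sel

def dead_features(features, constraints):
    """A dead feature is selected in no valid product of the feature model."""
    alive = set()
    for sel in valid_products(features, constraints):
        alive |= {f for f, on in sel.items() if on}
    return set(features) - alive

# Toy model: root is mandatory, A requires root, B excludes root (so B is dead).
features = ["root", "A", "B"]
constraints = [
    lambda s: s["root"],                   # root is always selected
    lambda s: not s["A"] or s["root"],     # A requires root
    lambda s: not s["B"] or not s["root"], # B excludes root
]
dead = dead_features(features, constraints)
```

Enumeration is exponential in the number of features, which is exactly why large-scale models need the CSP-based diagnosis the article proposes.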


13.
Forecasting, using historic time-series data, has become an important tool for fisheries management. ARIMA modeling, Modeling for Optimal Forecasting techniques and Decision Support Systems (DSS) based on fuzzy mathematics may be used to predict the general trend of a given fish-landings time series with increased reliability and accuracy. The present paper applies these three modeling methods to forecast anchovy catches landed in a given port (Thessaloniki, Greece) during 1979-2000 and total hake and bonito catches during 1982-2000. The paper assesses the models' accuracy by comparing model results to the actual monthly fish catches of the year 2000. According to the measures of forecasting accuracy established, the best forecasting performance for anchovy was shown by the DSS model (MAPE = 28.06%, RMSE = 76.56, U-statistic = 0.67 and R2 = 0.69). The optimal forecasting technique of genetic modeling significantly improved the forecast values obtained by the selected ARIMA model. Similarly, the DSS model showed noteworthy forecasting efficiency for the prediction of hake landings during the year 2000 (MAPE = 2.88%, RMSE = 13.75, U-statistic = 0.19 and R2 = 0.98), as compared to the other two modeling techniques. Optimal forecasting produced by combined modeling scored better than application of the simple ARIMA model. Overall, the DSS results showed that the Fuzzy Expected Intervals methodology can be used as a very reliable tool for short-term predictions of fishery landings.
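The accuracy measures quoted above can be computed as follows. The Theil's U convention used here (model RMSE over the RMSE of the naive no-change forecast) is one common choice and may differ from the paper's exact definition; the sample series is invented:

```python
import math

def forecast_accuracy(actual, predicted):
    """MAPE (%), RMSE, R^2 and Theil's U for a forecast against actual values."""
    n = len(actual)
    mape = 100.0 / n * sum(abs((a - p) / a) for a, p in zip(actual, predicted))
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    rmse = math.sqrt(ss_res / n)
    mean_a = sum(actual) / n
    ss_tot = sum((a - mean_a) ** 2 for a in actual)
    r2 = 1.0 - ss_res / ss_tot
    # Theil's U: model RMSE relative to the naive "no-change" forecast's RMSE.
    naive = math.sqrt(sum((actual[i] - actual[i - 1]) ** 2
                          for i in range(1, n)) / (n - 1))
    u = rmse / naive
    return {"MAPE": mape, "RMSE": rmse, "R2": r2, "U": u}

m = forecast_accuracy([100.0, 110.0, 105.0, 120.0], [98.0, 112.0, 104.0, 118.0])
```

Values of U below 1 mean the model beats the naive forecast, which is the sense in which the DSS values above (0.67 and 0.19) indicate useful predictive power.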

14.
Iterative analysis of Markov regenerative models (cited by 3)
Conventional algorithms for the steady-state analysis of Markov regenerative models suffer from high computational costs caused by densely populated matrices. In this paper, a new algorithm is suggested which avoids computing these matrices explicitly. Instead, a two-stage iteration scheme is used. An extended version of uniformization is applied as a subalgorithm to compute the required transient quantities "on the fly". The algorithm is formulated in terms of stochastic Petri nets. A detailed example illustrates the proposed concepts.
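Plain uniformization, the subalgorithm being extended, computes the transient distribution of a CTMC without forming the matrix exponential directly. A sketch for an explicit generator matrix Q follows; the paper applies the idea on the fly to a stochastic Petri net's state space, and the two-state chain below is an illustrative assumption:

```python
import math

def uniformize(Q, p0, t, eps=1e-10):
    """Transient distribution p(t) = p0 exp(Qt) by uniformization:
    p(t) = sum_k Poisson(L*t; k) * (p0 P^k)  with  P = I + Q/L."""
    n = len(Q)
    L = 1.001 * max(-Q[i][i] for i in range(n))    # uniformization rate
    P = [[(1.0 if i == j else 0.0) + Q[i][j] / L for j in range(n)]
         for i in range(n)]
    term = list(p0)                                # row vector p0 * P^k, k = 0
    weight = math.exp(-L * t)                      # Poisson pmf at k = 0
    p = [weight * v for v in term]
    k, acc = 0, weight
    while acc < 1.0 - eps:                         # stop once Poisson mass ~ 1
        term = [sum(term[i] * P[i][j] for i in range(n)) for j in range(n)]
        k += 1
        weight *= L * t / k
        acc += weight
        p = [pv + weight * tv for pv, tv in zip(p, term)]
    return p

# Two-state symmetric chain: p(t)[0] = 0.5 + 0.5 * exp(-2t) starting from state 0.
Q = [[-1.0, 1.0], [1.0, -1.0]]
p_t = uniformize(Q, [1.0, 0.0], t=1.0)
```

Only vector-matrix products with the sparse P are needed, which is what makes the "on the fly" variant attractive for large state spaces.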

15.
A postprocessing method based on information degree is developed to handle small-scale variation (noise) in reservoir stochastic modeling. Considering that different modeling results have different probabilities and credibilities, the new method uses the information degree calculated from the probabilities as weights to process the noise. Compared with the traditional postprocessing methods, this method is geologically more reasonable in that it considers both the information provided by the conditional data and the uncertainties associated with random sampling during simulation. The computation of information degree is objective, which avoids the subjective assignment of weight values in the traditional methods. Comparative studies using both conceptual and real reservoir models show that the new method effectively processes the noise in realizations. Thus, it is a promising approach within the postprocessing family in stochastic modeling.

16.
Data teletraffic is characterized by bursty arrival processes. Performance models are characterized by a desire to know under what circumstances the probability that an arrival finds a full input buffer is very small. In this paper I examine how four models proposed in the literature perform on two data sets of local area network traffic. Among my conclusions are that (1) the protocol governing the data transmission may have a substantial effect on the statistical properties of the packet stream, (2) the probability that a finite buffer of size b overflows may not be adequately approximated by the probability that an infinite buffer holds at least b packets, and (3) a data-based estimate of the large-deviation rate function does the best job of estimating packet loss on these data sets. This method may overestimate the loss rate by several orders of magnitude, so there is room for further refinement.

17.
18.
19.
A study of the traffic characteristics of the Internet is essential to designing Internet infrastructure. In this paper, we first characterize WWW (World Wide Web) traffic based on access log data obtained at four different servers. We find that the document size, the request inter-arrival time and the access frequency of WWW traffic follow heavy-tailed distributions. Namely, the document size and the request inter-arrival time follow log-normal distributions, and the access frequency follows the Pareto distribution. For the request inter-arrival time, however, an exponential distribution becomes adequate if we are concerned with the busiest hours. Based on our analytic results, we next build an M/G/1/PS queuing model to discuss a design methodology for the Internet access network. The accuracy of our model is validated by comparison with trace-driven simulation. We also investigate the effect of document caching at the proxy server on the WWW traffic characteristics. The results show that the traffic volume is actually reduced by the document replacement policies, but the traffic characteristics are not much affected. This suggests that our modeling approach can be applied to the case with document caching, which is demonstrated by simulation experiments.
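The M/G/1/PS model is attractive here because its mean sojourn time depends on the service-time (document-size) distribution only through its mean, so heavy-tailed sizes do not complicate the formula. A minimal sketch with assumed arrival-rate and service-time values:

```python
def mg1_ps_mean_response(lam, mean_service):
    """Mean sojourn time in an M/G/1 processor-sharing queue.

    E[T] = E[S] / (1 - rho) with rho = lam * E[S]: insensitive to the
    service-time distribution beyond its mean.
    """
    rho = lam * mean_service
    if rho >= 1.0:
        raise ValueError("unstable: offered load rho must be < 1")
    return mean_service / (1.0 - rho)

# Example: requests at an assumed 0.5/s with mean service time 1 s -> rho = 0.5.
t_mean = mg1_ps_mean_response(lam=0.5, mean_service=1.0)
```

This insensitivity property is the theoretical reason the paper's simple access-network model survives heavy-tailed document sizes.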
