Similar Documents
20 similar documents found.
1.
As software applications become highly interconnected in dynamically provisioned platforms, they form so-called systems-of-systems. A key issue that arises in such environments is whether specific requirements are violated when these applications interact in new, unforeseen ways as new resources or system components are dynamically provisioned. Such environments require the continuous use of frameworks for assessing compliance against specific mission-critical system requirements. These frameworks should be able to (a) handle large requirements models, (b) assess system compliance repeatedly and frequently using events from possibly high-velocity, high-frequency data streams, and (c) use models that can reflect the vagueness inherent in big-data event collection and in modeling dependencies between components of complex, dynamically re-configured systems. In this paper, we introduce a framework for run-time reasoning over medium- and large-scale fuzzy goal models, and we propose a process that allows for the parallel evaluation of such models. The approach has been evaluated for time and space performance on large goal models, showing that, in a simulation environment, the parallel reasoning process offers a significant performance improvement over a sequential one.
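The parallel evaluation idea can be illustrated with a toy sketch (not the paper's algorithm): fuzzy goal satisfaction in [0, 1] propagated with min/max semantics over an AND/OR goal tree, with independent top-level subtrees scored concurrently. The node encoding and function names are illustrative assumptions.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical goal-model node: ("AND"|"OR", children) or a leaf satisfaction in [0, 1].
def evaluate(node):
    if isinstance(node, (int, float)):
        return float(node)
    op, children = node
    values = [evaluate(c) for c in children]
    # Common fuzzy semantics: AND -> min, OR -> max of child satisfaction.
    return min(values) if op == "AND" else max(values)

def evaluate_parallel(root, workers=4):
    # Top-level subtrees are independent, so they can be scored concurrently.
    op, children = root
    with ThreadPoolExecutor(max_workers=workers) as pool:
        values = list(pool.map(evaluate, children))
    return min(values) if op == "AND" else max(values)

model = ("AND", [("OR", [0.4, 0.9]), ("AND", [0.8, 0.7]), 0.95])
assert evaluate(model) == evaluate_parallel(model) == 0.7
```

On real medium-to-large models the speed-up would come from distributing many such subtree evaluations, not from a single three-child tree as here.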

2.
Models@run.time
Blair G., Bencomo N., France R.B. 《Computer》2009, 42(10):22-27
Runtime adaptation mechanisms that leverage software models extend the applicability of model-driven engineering techniques to the runtime environment. Contemporary mission-critical software systems are often expected to safely adapt to changes in their execution environment. Given the critical roles these systems play, it is often inconvenient to take them offline to adapt their functionality. Consequently, these systems are required, when feasible, to adapt their behavior at runtime with little or no human intervention. A promising approach to managing complexity in runtime environments is to develop adaptation mechanisms that leverage software models, referred to as models@run.time. Work on models@run.time seeks to extend the applicability of models produced in model-driven engineering (MDE) approaches to the runtime environment. A model@run.time is a causally connected self-representation of the associated system that emphasizes the structure, behavior, or goals of the system from a problem-space perspective.

3.
Detecting trend and seasonal changes in satellite image time series
A wealth of remotely sensed image time series covering large areas is now available to the earth science community. Change detection methods are often not capable of detecting land cover changes within time series that are heavily influenced by seasonal climatic variations. Detecting change within the trend and seasonal components of time series enables the classification of different types of changes. Changes occurring in the trend component often indicate disturbances (e.g. fires, insect attacks), while changes occurring in the seasonal component indicate phenological changes (e.g. change in land cover type). A generic change detection approach is proposed for time series by detecting and characterizing Breaks For Additive Seasonal and Trend (BFAST). BFAST integrates the decomposition of time series into trend, seasonal, and remainder components with methods for detecting change within time series. BFAST iteratively estimates the time and number of changes, and characterizes change by its magnitude and direction. We tested BFAST by simulating 16-day Normalized Difference Vegetation Index (NDVI) time series with varying amounts of seasonality and noise, and by adding abrupt changes at different times and magnitudes. This revealed that BFAST can robustly detect change with different magnitudes (> 0.1 NDVI) within time series with different noise levels (0.01-0.07 σ) and seasonal amplitudes (0.1-0.5 NDVI). Additionally, BFAST was applied to 16-day NDVI Moderate Resolution Imaging Spectroradiometer (MODIS) composites for a forested study area in south-eastern Australia. This showed that BFAST is able to detect and characterize spatial and temporal changes in a forested landscape. BFAST is not specific to a particular data type and can be applied to time series without the need to normalize for land cover types, select a reference period, or change trajectory. The method can be integrated within monitoring frameworks and used as an alarm system to flag when and where changes occur.
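A minimal illustration of the break-detection idea (a much-simplified stand-in for BFAST, not the method itself): simulate a seasonal NDVI-like series with an abrupt drop, remove the seasonal component by differencing at the cycle length, and flag the largest residual change. Cycle length, break time, and magnitudes are all illustrative.

```python
import math

# Simulate a 16-day NDVI-like series: seasonality plus an abrupt drop at t = 60.
n = 120
series = [0.5 + 0.2 * math.sin(2 * math.pi * t / 23) - (0.15 if t >= 60 else 0.0)
          for t in range(n)]

# One seasonal cycle here is 23 observations; differencing at that lag removes
# the seasonal component, so a trend break shows up as a spike in the differences.
lag = 23
deseasoned = [series[t] - series[t - lag] for t in range(lag, n)]

# Flag the largest absolute change as the candidate break.
shift, idx = max((abs(d), t + lag) for t, d in enumerate(deseasoned))
assert 60 <= idx < 60 + lag   # the drop enters the differences over one cycle
assert abs(shift - 0.15) < 0.01
```

BFAST proper fits the trend and seasonal components jointly and estimates the number of breaks iteratively; this sketch only conveys why separating the two components makes an abrupt change visible.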

4.
Many approaches to minutiae extraction have already been proposed for automatic fingerprint matching, and most transform fingerprint images into binary images through state-of-the-art algorithms and then submit the binary image to a thinning process. However, this paper proposes an original technique for extracting minutiae based on representing the ridge structure of a fingerprint image as a run-length code (RLC). The essential idea is to detect minutiae by searching for the termination points or bifurcation points of ridges in the RLC, rather than in a fingerprint image. Experimental results and a comparative analysis show that the proposed method is fairly reliable and faster than a conventional thinning-based method.
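A toy sketch of the run-length-code idea (names and the termination criterion are illustrative simplifications, not the paper's detector): each binary row is stored as (start, end) runs, and a ridge ending can then be found by comparing runs between adjacent rows instead of scanning pixels.

```python
# Represent each row of a binary ridge image as a run-length code of
# (start, end) pairs, then look for ridge endings without re-scanning pixels.
def row_runs(row):
    runs, start = [], None
    for x, v in enumerate(row + [0]):          # sentinel closes a trailing run
        if v and start is None:
            start = x
        elif not v and start is not None:
            runs.append((start, x - 1))
            start = None
    return runs

def overlaps(a, b):
    return a[0] <= b[1] and b[0] <= a[1]

image = [
    [0, 1, 1, 0, 0],
    [0, 1, 1, 0, 0],
    [0, 0, 0, 0, 0],   # the ridge ends above this row
]
rlc = [row_runs(r) for r in image]

# A run with no overlapping run in the next row marks a candidate termination.
terminations = [(y, run) for y in range(len(rlc) - 1)
                for run in rlc[y] if not any(overlaps(run, r) for r in rlc[y + 1])]
assert terminations == [(1, (1, 2))]
```

A bifurcation would analogously be a run overlapping two runs in the next row; real detectors also apply validation rules to suppress spurious minutiae near noise and borders.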

5.
At present, in most context-aware systems, decisions on when and how to adapt an application are made a priori by developers at compile time. While such approaches empower developers with sufficient flexibility to specify what they want in terms of adaptation rules, they inevitably place an immense load on developers, especially in an extremely dynamic environment, to anticipate and formulate all potential run-time situations at development time. These challenges motivated us to explore an approach to automating context-aware adaptation decisions by a middleware layer at run time. The resulting middleware, CAMPUS, exploits technologies from semantic computing to dynamically derive adaptation decisions according to run-time contextual information. The CAMPUS implementation has been evaluated with a number of case applications to validate the operation of the system in a realistic environment and to provide us with an opportunity to obtain experimental results for further analysis. The results are significant in that they show that CAMPUS can effectively free developers from the need to predict, formulate, and maintain adaptation rules, thereby greatly reducing the effort required to develop context-aware applications.

6.
Thierry Lafaye 《Calcolo》1998,35(1):17-36
A stabilization result is proved for two discrete age-dependent SIS epidemic models (intercohort and intracohort transmission are both studied). Their dynamics depend on a threshold parameter and also on the fertility of the initial condition. Received: February 1995 / Revised version: August 1996
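The threshold behaviour can be illustrated with the simplest discrete-time SIS iteration (no age structure, so far simpler than the paper's models; the parameter names are generic):

```python
# Minimal discrete-time SIS iteration; beta = per-step transmission rate,
# gamma = per-step recovery rate, populations as fractions summing to 1.
def sis_step(s, i, beta, gamma):
    new_inf = beta * s * i          # mass-action incidence
    recovered = gamma * i
    return s - new_inf + recovered, i + new_inf - recovered

def run(beta, gamma, steps=2000, i0=0.01):
    s, i = 1.0 - i0, i0
    for _ in range(steps):
        s, i = sis_step(s, i, beta, gamma)
    return i

# Threshold behaviour: R0 = beta / gamma; below 1 the infection dies out,
# above 1 it settles at the endemic level 1 - gamma / beta.
assert run(beta=0.2, gamma=0.4) < 1e-6              # R0 = 0.5 -> extinction
assert abs(run(beta=0.6, gamma=0.3) - 0.5) < 1e-6   # R0 = 2 -> endemic 0.5
```

In the age-dependent models of the paper, the threshold parameter plays the same role as R0 here, but it also depends on the fertility of the initial condition.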

7.
We describe a method of syntax extension for programming languages which involves using an augmented BNF notation to add new syntactic constructs to the language, and where the meaning of the constructs is usually given by ordinary user subroutines to be executed at run time. This requires fewer special constructs in the language, and also makes it simpler to specify the extensions because the programmer does not have to worry about compile-time vs. run-time distinctions. The extended BNF may also be used to give specialized syntax to particular data structures. This research was supported under NSF Grant GJ 34342X.
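A loose sketch of the central idea, not the paper's system: a new construct is declared by a pattern (standing in for an augmented-BNF rule), and its meaning is an ordinary subroutine invoked at run time. All names here are illustrative.

```python
import re

extensions = {}

def define_syntax(pattern, subroutine):
    # Register a new construct; the pattern plays the role of a grammar rule.
    extensions[re.compile(pattern)] = subroutine

def execute(line, env):
    # The extension's meaning is the user subroutine, executed at run time.
    for pattern, subroutine in extensions.items():
        m = pattern.fullmatch(line)
        if m:
            return subroutine(env, *m.groups())
    raise SyntaxError(line)

# New construct "repeat N: name" adds N to a named counter at run time.
define_syntax(r"repeat (\d+): (\w+)",
              lambda env, n, var: env.__setitem__(var, env.get(var, 0) + int(n)))

env = {}
execute("repeat 3: hits", env)
execute("repeat 2: hits", env)
assert env["hits"] == 5
```

Because the subroutine runs at execution time, the extension writer never has to distinguish compile-time from run-time processing, which is the simplification the abstract emphasizes.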

8.
This paper presents an estimation approach for Time Event Graphs such as P-Time Event Graphs and Time Stream Event Graphs. It is assumed that the nominal behavior is known and that transitions are partitioned as observable and unobservable transitions. The technique is applied to the detection of changes which are (possibly small) finite variations of dynamic models compared to this nominal behavior. The detected changes provide indications that can be used in future maintenance operations. Using the algebra of dioids, the approach uses a receding-horizon estimation of the greatest state and analyzes the consistency of the data.
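The dioid algebra underlying such models can be sketched with the max-plus semiring, where "addition" is max and "multiplication" is +; the state recursion of a timed event graph then looks linear. The matrix and firing dates below are illustrative, not from the paper.

```python
# Max-plus (dioid) sketch: oplus = max, otimes = +, with -inf as the zero element.
NEG_INF = float("-inf")

def mp_matvec(A, x):
    # (A otimes x)_i = max_j (A[i][j] + x[j])
    return [max(a + b for a, b in zip(row, x)) for row in A]

# Timed event graph recursion x(k) = A otimes x(k-1): A[i][j] holds the delay
# from transition j to transition i (NEG_INF = no arc).
A = [[2.0, NEG_INF],
     [3.0, 1.0]]
x = [0.0, 0.0]           # firing dates of the first event
for _ in range(3):       # dates of events 1..3
    x = mp_matvec(A, x)
assert x == [6.0, 7.0]
```

An estimator in this setting would compare such predicted greatest firing dates against observed transition dates over a receding horizon and flag inconsistencies as changes.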

9.
The relationship between parameter passing mechanisms and run-time data structures in languages with statically-nested scopes is examined. It is shown that simpler data structures can be used in some cases, with increased efficiency in accessing non-local variables. In particular, this is true for the call-by-value-result mechanism, where the use of displays can be eliminated altogether; however, there is some additional cost associated with procedure calls. Under certain conditions the same implementation applies to call-by-reference.

10.
This paper investigates the optimal production run length in deteriorating production processes, where the elapsed time until the production process shifts is characterized as a fuzzy variable, as are the setup cost and the holding cost. A mathematical formula representing the expected average cost per unit time is derived, and some properties are obtained to establish an efficient solution procedure. Since there is no closed-form expression for the optimal production run length, an approximate solving approach is presented. Finally, two numerical examples are given to illustrate the procedure of searching for the optimal solutions.

11.
This paper proposes a method for detecting object classes that exhibit variable shape structure in heavily cluttered images. The term "variable shape structure" is used to characterize object classes in which some shape parts can be repeated an arbitrary number of times, some parts can be optional, and some parts can have several alternative appearances. Hidden State Shape Models (HSSMs), a generalization of Hidden Markov Models (HMMs), are introduced to model object classes of variable shape structure using a probabilistic framework. A polynomial inference algorithm automatically determines object location, orientation, scale and structure by finding the globally optimal registration of model states with the image features, even in the presence of clutter. Experiments with real images demonstrate that the proposed method can localize objects of variable shape structure with high accuracy. For the task of hand shape localization and structure identification, the proposed method is significantly more accurate than previously proposed methods based on chamfer-distance matching. Furthermore, by integrating simple temporal constraints, the proposed method gains speed-ups of more than an order of magnitude, and produces highly accurate results in experiments on non-rigid hand motion tracking.

12.
Time series of discrete random variables present unique statistical challenges due to serial correlation and uneven sampling intervals. While regression models for a series of counts are well developed, only a few methods are discussed for the analysis of moderate to long (e.g. from 20 to 152 observations) binary or binomial time series. This article suggests generalized linear mixed models with autocorrelated random effects for a parameter-driven approach to such series. We use a Monte Carlo EM algorithm to jointly obtain maximum likelihood estimates of regression parameters and variance components. The likelihood approach, although computationally intensive, allows estimation of marginal joint probabilities of two or more serial events. These are crucial for checking goodness of fit, for assessing whether the model adequately captures the serial correlation, and for predicting future responses. The model is flexible enough to allow for missing observations or unequally spaced time intervals. We illustrate our approach and model assessment tools with an analysis of the series of winners in the traditional boat race between the universities of Oxford and Cambridge, re-evaluating a long-held belief about the effect of the weight of the crew on the odds of winning. We also show how our methods are useful in modeling trends based on the General Social Survey database.

13.
Independent component analysis using Potts models
We explore extending the application of Potts encoding to the task of independent component analysis, which primarily deals with the problem of minimizing the Kullback-Leibler divergence between the joint distribution and the product of all marginal distributions of output components. The competitive mechanism of Potts neurons is used to encode the overlapping projections from observations to output components. Based on these projections, the marginal distributions and the entropy of output components are made tractable for computation, and the adaptation of the de-mixing matrix toward independent output components is obtained. The Potts model for ICA is well formulated by an objective function subject to a set of constraints, which leads to a novel energy function. A hybrid of mean field annealing and the gradient descent method is applied to the energy function. Our approach to independent component analysis presents a new criterion for ICA. The performance of the Potts model for ICA given by our numerical simulations is encouraging.

14.
This paper employs mathematical modeling for solving the manufacturing run time problem with random defective rate and stochastic machine breakdown. In real-life manufacturing systems, generation of nonconforming items and unexpected breakdown of production equipment are inevitable. For the purpose of addressing these practical issues, this paper studies a system that may produce defective items randomly and is also subject to random equipment failure. A no-resumption inventory control policy is adopted when breakdown occurs. Under such a policy, the interrupted lot is aborted and the malfunctioning machine is immediately repaired. A new lot is started only when all on-hand inventory is depleted. Modeling and numerical analyses are used to establish the solution procedure for such a problem. As a result, the optimal manufacturing run time that minimizes the long-run average production–inventory cost is derived. A numerical example is provided to show how the solution procedure works as well as the use of the research results.
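As a point of reference, the classic EPQ model (no defects, no breakdowns) gives the optimal run length in closed form; the paper's model generalizes this baseline. The formula is standard; the parameter values below are illustrative.

```python
import math

# Classic EPQ baseline: D = demand rate, P = production rate (P > D),
# K = setup cost per lot, h = holding cost per unit per unit time.
def epq_run_time(D, P, K, h):
    Q = math.sqrt(2 * K * D / (h * (1 - D / P)))   # optimal lot size
    return Q / P                                    # production run time per lot

t = epq_run_time(D=4000, P=10000, K=450, h=2)
print(round(t, 4))   # -> 0.1732 (years, with annual rates)
```

With random defectives and breakdowns, no such closed form exists, which is why the paper resorts to a numerically established solution procedure for the run time.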

15.
16.
In this work, gene expression time series models have been constructed by using principal component analysis (PCA) and a neural network (NN). The main contribution of this paper is to develop a methodology for modeling numerical gene expression time series. The PCA-NN prediction models are compared with other popular continuous prediction methods. The proposed model yields the features extracted from the gene expression time series and a ranking of the prediction accuracies. Therefore, the model can help practitioners to gain a better understanding of a cell cycle, and to find the dependency of genes, which is useful for drug discovery. Based on the results of two public real datasets, the PCA-NN method outperforms the other continuous prediction methods. In the time series model, we use Akaike's information criterion (AIC) and cross-validation to select a suitable NN model and avoid over-parameterization.
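A pure-Python sketch of the PCA-then-predict pipeline, with an AR(1) predictor standing in for the paper's neural network; the tiny synthetic "expression" matrix and all names are illustrative.

```python
# Rows = time points, columns = genes (synthetic, strongly one-dimensional data).
def first_component(X, iters=200):
    # Power iteration on the covariance structure to get the leading eigenvector.
    n, d = len(X), len(X[0])
    means = [sum(col) / n for col in zip(*X)]
    Xc = [[x - m for x, m in zip(row, means)] for row in X]
    v = [1.0] * d
    for _ in range(iters):
        s = [sum(a * b for a, b in zip(row, v)) for row in Xc]          # X v
        w = [sum(Xc[i][j] * s[i] for i in range(n)) for j in range(d)]  # X^T X v
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    scores = [sum(a * b for a, b in zip(row, v)) for row in Xc]
    return v, scores

def ar1_forecast(scores):
    # Least-squares AR(1) on the component scores, then a one-step prediction.
    x, y = scores[:-1], scores[1:]
    phi = sum(a * b for a, b in zip(x, y)) / sum(a * a for a in x)
    return phi * scores[-1]

X = [[t + 0.1, 2 * t - 0.1, -t] for t in range(8)]
v, scores = first_component(X)
pred = ar1_forecast(scores)
assert abs(sum(c * c for c in v) - 1) < 1e-9   # unit-norm component
```

The paper feeds the extracted components into a neural network instead of an AR(1); the dimensionality-reduction step shown here is what makes that downstream model tractable.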

17.
18.
For inventory problems with stock-outs under probabilistic demand, most published work assumes that average shortages are very small and can thus be neglected. However, in practice some stock-outs may be significant; these are back-ordered and filled as soon as an adequate replenishment arrives. Typically, a supplier will institute an emergency expediting order to obtain the item when a shortage occurs, and a price discount can always be offered on the stock-out item in order to secure more back orders. In this article, an inventory model with negotiable back orders is first proposed, and then another model in which lead time is also subject to change is discussed. Numerical examples are included to illustrate the solution procedures.

19.
Mnemonics is a Scala library for generating method bodies in JVM bytecode at run time. Mnemonics supports a large subset of the JVM instructions, for which the static typing of the generator guarantees the well-formedness of the generated bytecode.

20.
Two new forecasting methods for time series are introduced. Both are based on a factorial analysis method called spline principal component analysis with respect to instrumental variables (spline PCAIV). The first method is a straightforward application of spline PCAIV, while the second is an adaptation of it: in the modified version, the criteria are differentiated according to the unknown value that needs to be predicted. Both forecasting methods are shown to be well suited to time series.

