Found 20 similar documents; search took 15 ms
1.
A key aspect of resource management is the efficient and effective deployment of available resources whenever needed. The issue typically covers two areas: monitoring the resources used by software systems and managing resource consumption. A key property of any monitoring system is its reconfigurability: the ability to limit the set of resources monitored at a given time to those that are actually necessary at that moment. The authors of this article propose a fully dynamic and reconfigurable monitoring system based on the concept of Adaptable Aspect-Oriented Programming (AAOP), in which a set of AOP aspects is used to run an application in a manner specified by an adaptability strategy. The model can be used to implement systems that monitor an application and its execution environment and perform actions such as changing the current set of resource management constraints when application or environment conditions change. Any aspect that implements a predefined interface may be used by the AAOP-based monitoring system as a source of information. The system uses dynamic AOP, meaning that the aspects (the sources of information) may be enabled and disabled at runtime.
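The core idea of the abstract, a monitor whose information sources can be switched on and off at runtime, can be sketched as follows. This is a minimal illustration, not the authors' AAOP implementation; the class and method names are invented.

```python
class MonitoringSystem:
    """Sketch of a reconfigurable monitor: any probe implementing the
    common call interface can be registered, then enabled or disabled
    at runtime so that only the necessary resources are observed."""

    def __init__(self):
        self._aspects = {}

    def register(self, name, probe):
        # A probe is any zero-argument callable returning a measurement.
        self._aspects[name] = {"probe": probe, "enabled": False}

    def enable(self, name):
        self._aspects[name]["enabled"] = True

    def disable(self, name):
        self._aspects[name]["enabled"] = False

    def sample(self):
        # Only currently enabled aspects contribute to the snapshot.
        return {n: a["probe"]() for n, a in self._aspects.items() if a["enabled"]}
```

Disabling an aspect removes its cost from every subsequent `sample()` call, which is the point of limiting monitoring to what is needed at a given moment.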
2.
Uncertain data in databases were originally denoted as null values, which represent the meaning of "values unknown at present." Null values were later generalized into partial values, which correspond to a set of possible values, to provide a more powerful notion. In this paper, we derive properties to refine partial values into more informative ones. In some cases, they can even be refined into definite values. Such refinement is possible when there exist range constraints on attribute domains, referential integrity constraints, functional dependencies, or multivalued dependencies among attributes.
Our work essentially eliminates redundant elements from a partial value. This process not only provides a more concise and informative answer to users but also speeds up the computation of subsequent queries. In addition, it reduces communication cost when imprecise data must be transmitted from one site to another in a distributed environment.
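The refinement of a partial value by a range constraint can be sketched in a few lines. This is a hypothetical example of the idea, not the paper's formal treatment:

```python
def refine_partial_value(candidates, domain_constraint):
    """Refine a partial value (a set of possible values) by dropping
    candidates that violate a range constraint on the attribute domain."""
    refined = {v for v in candidates if domain_constraint(v)}
    if not refined:
        raise ValueError("constraint eliminates every candidate")
    return refined

# A partial value for an age attribute, refined by the constraint 0 <= age <= 120.
partial_age = {-5, 34, 37, 250}
refined = refine_partial_value(partial_age, lambda v: 0 <= v <= 120)

# When exactly one candidate survives, the partial value becomes a definite value.
definite = next(iter(refined)) if len(refined) == 1 else None
```

The same filtering step also shrinks the data that would have to be shipped between sites when an imprecise value is requested remotely.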
3.
Jan A. Spriet 《Computers in Industry》1981,2(2):115-121
In biotechnology, few online measurements of biological variables are available, which hampers effective monitoring and control in the fermentation field. A novel approach is to use a computer to combine the outputs of online sensors for physical and chemical parameters in order to estimate directly otherwise inaccessible biological quantities such as biomass. Here, a sensor configuration is presented for online computation of the oxygen uptake rate. The equations to be implemented on the digital machine are derived, and a case study on the fermentation of the antibiotic gramicidin S is performed and discussed. The experiments indicate that for a major part of the growth and fermentation period, the computer is suitable for evaluating biomass from oxygen uptake estimates. Consequently, the process can be monitored online.
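The oxygen uptake rate is typically computed from a gas balance over the fermenter. A simplified sketch under the assumptions that inlet and outlet gas flows are equal and the gas phase is at a fixed molar volume (neither of which is stated in the abstract; the function and constant names are ours):

```python
MOLAR_VOLUME = 24.0  # L/mol, rough value for air near room temperature

def oxygen_uptake_rate(flow_l_per_h, y_o2_in, y_o2_out, broth_volume_l):
    """Estimate OUR (mol O2 / L / h) from a simplified steady-state gas
    balance, assuming equal inlet and outlet gas flows."""
    o2_in = flow_l_per_h * y_o2_in / MOLAR_VOLUME    # mol O2/h entering
    o2_out = flow_l_per_h * y_o2_out / MOLAR_VOLUME  # mol O2/h leaving
    return (o2_in - o2_out) / broth_volume_l

# 60 L/h of air (21% O2) in, 19% O2 out, 10 L of broth:
our = oxygen_uptake_rate(60.0, 0.21, 0.19, 10.0)
```

A real implementation would also correct for humidity, temperature, and the inert-gas balance between inlet and outlet.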
4.
Luciano Sánchez José Otero Inés Couso 《Soft Computing - A Fusion of Foundations, Methodologies and Applications》2009,13(5):467-479
Backfitting of fuzzy rules is an Iterative Rule Learning technique for obtaining the knowledge base of a fuzzy rule-based system in regression problems. It consists of fitting one fuzzy rule to the data and replacing the whole training set by the residual of the approximation. The obtained rule is added to the knowledge base, and the process is repeated until the residual is zero or near zero. This design has been extended to imprecise data for which the observation error is small. Nevertheless, when this error is moderate or high, the learning can stop early. In this kind of algorithm, the specificity of the residual might decrease when a new rule is added. It may happen that the residual grows so wide that it covers the value zero for all points (so the algorithm stops) before all the information available in the dataset has been extracted. Focusing on this problem, this paper addresses datasets with medium to high discrepancies between the observed and actual values of the variables, such as those containing missing values and coarsely discretized data. We show that the quality of iterative learning degrades in this kind of problem because it does not make full use of all the available information. As an alternative to obtaining rules sequentially, we propose a new multiobjective Genetic Cooperative-Competitive Learning (GCCL) algorithm. In our approach, each individual in the population codifies one rule, which competes in the population in terms of maximum coverage and fitting, while the individuals in the population cooperate to form the knowledge base.
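The backfitting loop described above (fit one rule, replace the training targets by the residual, repeat) can be sketched on crisp data. The rule model here, a single Gaussian membership with a weighted-mean consequent, is our own toy stand-in, not the paper's rule representation:

```python
import numpy as np

def fit_one_rule(x, y, width=0.2):
    """Fit a single crude fuzzy rule: a Gaussian membership centred where
    the residual is largest, with the membership-weighted mean as output."""
    c = x[np.argmax(np.abs(y))]
    mu = np.exp(-((x - c) ** 2) / (2 * width ** 2))
    b = np.sum(mu * y) / np.sum(mu)
    return c, width, b

def backfit(x, y, n_rules=5, width=0.2):
    """Iterative rule learning: fit a rule, subtract its prediction so the
    residual becomes the new training target, and repeat until the
    residual is (near) zero or the rule budget is exhausted."""
    rules, residual = [], y.astype(float).copy()
    for _ in range(n_rules):
        c, s, b = fit_one_rule(x, residual, width)
        residual -= b * np.exp(-((x - c) ** 2) / (2 * s ** 2))
        rules.append((c, s, b))
        if np.max(np.abs(residual)) < 1e-3:  # stopping condition
            break
    return rules, residual
```

With interval-valued (imprecise) residuals, the stopping test becomes "the residual interval covers zero everywhere", which is exactly the premature-stop failure mode the paper analyses.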
5.
On today’s multiprocessor systems, simultaneously executing multi-threaded applications contend for cache space and CPU time. This contention can be managed by changing application thread counts. In this paper, we describe a technique for configuring thread counts using utility models. A utility model predicts an application's performance given its thread count and the thread counts of the other workloads. Built offline with linear regression, utility models are used online by a system policy to dynamically configure applications' thread counts. We present a policy that uses the models to maximize throughput while maintaining QoS. Our approach improves system throughput by 6% and meets QoS 22% more often than the best evaluated traditional policy.
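The two phases, offline regression and an online policy that queries the model, can be sketched as follows. The profiling numbers are invented for illustration; the paper's actual features and policy are richer than this:

```python
import numpy as np

# Offline: fit a linear utility model  u ~ w0 + w1*t_app + w2*t_other
# from (hypothetical) profiling samples of application throughput.
samples = np.array([
    # t_app, t_other, observed throughput
    [1, 1, 1.0], [2, 1, 1.8], [4, 1, 3.0],
    [1, 4, 0.7], [2, 4, 1.3], [4, 4, 2.0],
])
X = np.c_[np.ones(len(samples)), samples[:, 0], samples[:, 1]]
w, *_ = np.linalg.lstsq(X, samples[:, 2], rcond=None)

def predicted_utility(t_app, t_other):
    """Predict throughput for a candidate thread count, given the
    co-running workload's thread count."""
    return w @ np.array([1.0, t_app, t_other])

# Online: the policy picks the thread count with the best predicted utility.
best = max(range(1, 5), key=lambda t: predicted_utility(t, t_other=4))
```

A full policy would evaluate joint configurations for all co-running applications and reject those whose predicted performance violates QoS.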
6.
Online learning with hidden Markov models
We present an online version of the expectation-maximization (EM) algorithm for hidden Markov models (HMMs). The sufficient statistics required for parameter estimation are computed recursively in time, that is, online, instead of with the batch forward-backward procedure. This computational scheme is generalized to the case where the model parameters can change with time by introducing a discount factor into the recurrence relations. The resulting algorithm is equivalent to the batch EM algorithm for an appropriate discount factor and schedule of parameter updates. On the other hand, the online algorithm is able to deal with dynamic environments, i.e., when the statistics of the observed data change with time. The implications of the online algorithm for probabilistic modeling in neuroscience are briefly discussed.
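The discounted recurrence at the heart of such schemes can be shown in isolation. This sketch tracks a single emission mean rather than a full HMM, so it is a simplified stand-in for the paper's algorithm:

```python
def discounted_update(stat, new_value, gamma):
    """Core recurrence of the online scheme: discount the accumulated
    sufficient statistic and blend in the new observation's statistic."""
    return (1.0 - gamma) * stat + gamma * new_value

def online_mean(observations, gamma=0.1):
    """Discounted sufficient statistics for one emission mean:
    s1 accumulates weighted observations, s0 the weights; mean = s1 / s0."""
    s0, s1 = 1e-8, 0.0
    for y in observations:
        s0 = discounted_update(s0, 1.0, gamma)
        s1 = discounted_update(s1, y, gamma)
    return s1 / s0
```

With `gamma` fixed, old data are forgotten exponentially, which is what lets the estimator track a nonstationary stream; a decaying `gamma` schedule instead recovers the batch estimate.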
7.
The purpose of this paper is to report an approach to the development of an online inventory analysis and control system. Attention is focused on the model aspects of the development process. Inventory Analysis and Control System 1 (IACS 1) is a series of inventory analysis models designed to provide management and functional personnel with analytical tools for inventory decisions on purchasing, production, and material control. IACS 1 contains self-explanatory procedures, from the selection of a model and the definition of the data input to the execution of the program and the interpretation of the output.
8.
Frédéric Claux Loïc Barthe David Vanderhaeghe Jean‐Pierre Jessel Mathias Paulin 《Computer Graphics Forum》2014,33(2):263-272
We propose a versatile pipeline to render B‐Rep models interactively, precisely, and without rendering-related artifacts such as cracks. Our rendering method is based on dynamic surface evaluation using both tessellation and ray-casting, and on direct GPU surface trimming. An initial rendering of the scene is performed using dynamic tessellation. The algorithm we propose reliably detects and then fills cracks in the rendered image. Crack detection works in image space using depth information, while crack filling is either achieved in image space using a simple classification process or performed in object space through selective ray-casting. The crack-filling method can be changed dynamically at runtime. Our image-space crack-filling approach has a limited runtime cost and enables high-quality, real-time navigation. Our higher-quality, object-space approach produces a rendering of similar quality to full-scene ray-casting but is 2 to 6 times faster, can be used during navigation, and provides accurate, reliable rendering. Integration of our work with existing tessellation-based rendering engines is straightforward.
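Image-space crack filling can be illustrated on a depth buffer: pixels with no rendered depth between adjacent tessellated patches are filled from a rendered neighbour. This is a toy CPU sketch of the general idea, not the paper's GPU classification process:

```python
import numpy as np

def fill_cracks(depth, hole=np.inf):
    """Fill crack pixels (no rendered depth) from a neighbouring rendered
    pixel, taking the nearest surface when several neighbours exist."""
    filled = depth.copy()
    h, w = depth.shape
    for y in range(h):
        for x in range(w):
            if depth[y, x] == hole:
                neighbours = [depth[ny, nx]
                              for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                              if 0 <= ny < h and 0 <= nx < w and depth[ny, nx] != hole]
                if neighbours:
                    filled[y, x] = min(neighbours)  # nearest surface wins
    return filled
```

The object-space alternative would instead cast a ray through each crack pixel to evaluate the trimmed surface exactly.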
9.
Imagine your big brother habitually following you around "for your own good" and snooping into everything that you, a mature adult, choose to do. Now imagine discovering that he couldn't do so if you wore blue socks rather than brown. Wouldn't you switch to blue socks as a matter of principle? Online monitoring today presents a similar situation. You can indeed modify your online conduct to preserve your privacy almost as easily as you can change socks.
10.
Multi-label text classification is an increasingly important field, as large amounts of text data are available and extracting relevant information is important in many application contexts. Probabilistic generative models are the basis of a number of popular text mining methods such as Naive Bayes and Latent Dirichlet Allocation. However, Bayesian models for multi-label text classification are often overly complicated in order to account for label dependencies and skewed label frequencies while preventing overfitting. To solve this problem, we employ the same technique that contributed to the success of deep learning in recent years: greedy layer-wise training. Applying this technique in the supervised setting prevents overfitting and leads to better classification accuracy. The intuition behind this approach is to learn the labels first and subsequently add a more abstract layer to represent dependencies among the labels. This allows using a relatively simple hierarchical topic model that can easily be adapted to the online setting. We show that our method successfully models dependencies online for large-scale multi-label datasets with many labels and improves over a baseline that does not model dependencies. The same strategy, layer-wise greedy training, also makes the batch variant competitive with existing, more complex multi-label topic models.
11.
Hardware task scheduling and placement at runtime play a crucial role in achieving better system performance by exploiting dynamically reconfigurable Field-Programmable Gate Arrays (FPGAs). Although a number of online algorithms have been proposed in the literature, no existing strategy makes efficient use of reconfigurable resources by orchestrating multiple hardware versions of tasks. By exploiting this flexibility, algorithms can potentially achieve stronger performance; on the other hand, they can incur much more runtime overhead when dynamically selecting the most suitable variant on the fly under runtime constraints. In this work, we propose a fast and efficient online task scheduling and placement algorithm that incorporates multiple selectable hardware implementations for each hardware request; the alternatives reflect trade-offs between the required reconfigurable resources and task runtime performance. Experimental studies conclusively reveal the superiority of the proposed algorithm not only in scheduling and placement quality but also in making faster runtime decisions compared with rigid approaches.
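The variant-selection step at the core of such a scheduler can be sketched simply: among a task's implementations, pick the one that fits the currently free reconfigurable area and finishes earliest. This greedy rule is our illustration, not the paper's algorithm:

```python
def pick_variant(variants, free_area, current_time):
    """For one hardware task, choose among its implementation variants,
    given as (area, exec_time) pairs, the one that fits the free area
    and finishes first. Returns None if no variant currently fits."""
    feasible = [(a, t) for a, t in variants if a <= free_area]
    if not feasible:
        return None
    return min(feasible, key=lambda v: current_time + v[1])

# A task offered as three area/latency trade-offs,
# from big-and-fast to small-and-slow (numbers are hypothetical):
variants = [(120, 4), (60, 7), (30, 12)]
```

A complete scheduler would also weigh reconfiguration time and future placement fragmentation, which is where the runtime-overhead trade-off discussed in the abstract arises.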
12.
Sez Atamturktur Zhifeng Liu Scott Cogan Hsein Juang 《Structural and Multidisciplinary Optimization》2015,51(3):659-671
Traditionally, model calibration is formulated as a single-objective problem in which fidelity to measurements is maximized by adjusting model parameters. In such a formulation, however, the model with the best fidelity merely represents an optimal compromise between various forms of errors and uncertainties; thus, multiple calibrated models can demonstrate comparable fidelity, producing non-unique solutions. To alleviate this problem, the authors formulate model calibration as a multi-objective problem with two distinct objectives: fidelity and robustness. Here, robustness is defined as the maximum allowable uncertainty in the calibrated model parameters with which the model continues to yield acceptable agreement with measurements. The proposed approach is demonstrated through the calibration of a finite element model of a steel moment-resisting frame.
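With two objectives, the calibrated candidates form a Pareto front rather than a single optimum. A minimal sketch of the dominance filter (the candidate values are invented; lower fidelity error and higher robustness are better):

```python
def pareto_front(candidates):
    """Keep candidates not dominated in the two calibration objectives:
    fidelity (lower error is better) and robustness (higher is better)."""
    front = []
    for c in candidates:
        dominated = any(
            o["error"] <= c["error"] and o["robustness"] >= c["robustness"]
            and (o["error"] < c["error"] or o["robustness"] > c["robustness"])
            for o in candidates
        )
        if not dominated:
            front.append(c)
    return front

models = [
    {"error": 0.10, "robustness": 0.2},   # best fidelity, fragile
    {"error": 0.20, "robustness": 0.5},   # robust compromise
    {"error": 0.30, "robustness": 0.4},   # dominated by the second
]
```

The analyst then chooses a point on the front according to how much fidelity they are willing to trade for tolerance to parameter uncertainty.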
13.
14.
15.
16.
《Engineering Applications of Artificial Intelligence》2007,20(1):25-36
The quality of thermomechanical pulp (TMP) is influenced by a large number of variables. To control the pulp and paper process, the operator has to choose the influencing variables manually, and these can change significantly depending on the quality of the raw material (wood chips). Very little knowledge exists about the relationships between the quality of the pulp obtained by the TMP process and wood chip properties. The research proposed in this paper uses genetically generated knowledge bases to model these relationships using measurements of wood chip quality, process parameter data, and properties of raw material such as bleaching agents. The rule bases of these knowledge bases provide a better understanding of the relationships between the different influencing variables (inputs and outputs).
17.
Mangion AZ Yuan K Kadirkamanathan V Niranjan M Sanguinetti G 《Neural computation》2011,23(8):1967-1999
We present a variational Bayesian (VB) approach for the state and parameter inference of a state-space model with point-process observations, a physiologically plausible model for signal processing of spike data. We also give the derivation of a variational smoother, as well as an efficient online filtering algorithm, which can also be used to track changes in physiological parameters. The methods are assessed on simulated data, and results are compared to expectation-maximization, as well as Monte Carlo estimation techniques, in order to evaluate the accuracy of the proposed approach. The VB filter is further assessed on a data set of taste-response neural cells, showing that the proposed approach can effectively capture dynamical changes in neural responses in real time.
18.
Yang Sijia Xiong Haoyi Zhang Yunchao Ling Yi Wang Licheng Xu Kaibo Sun Zeyi 《Applied Intelligence》2022,52(3):3103-3117
Gaussian Graphical Models are widely used to understand the dependencies between variables from high-dimensional data and can enable a wide range of applications such as...
19.
Artières T Marukatat S Gallinari P 《IEEE transactions on pattern analysis and machine intelligence》2007,29(2):205-217
We investigate a new approach to online handwritten shape recognition. Interesting features of this approach include learning without manual tuning, learning from very few training samples, incremental learning of characters, and adaptation to user-specific needs. The proposed system can deal with two-dimensional graphical shapes such as Latin and Asian characters, command gestures, symbols, small drawings, and geometric shapes. It can be used as a building block for a series of recognition tasks with many applications.
20.
Doron Drusinsky 《Innovations in Systems and Software Engineering》2017,13(1):67-79
Runtime Monitoring (RM), also known as Runtime Verification (RV), is the process of monitoring and verifying the sequencing and temporal behavior of an underlying application and comparing it to the correct behavior specified by a formal specification pattern. Hidden Markov Model (HMM)-based RM enables the monitoring of systems with both visible and hidden data, using the same formal specifications used by deterministic RM. This paper describes an online library of formal specification oracles and an accompanying toolset for the runtime monitoring of log files that contain hidden and visible data.
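A deterministic RM check over a log can be sketched with a common specification pattern, "every trigger is followed by a response within a bounded number of events". This toy monitor illustrates the idea only; it is not the library or toolset the paper describes:

```python
def monitor_response(log, trigger="req", response="ack", within=3):
    """Check that every occurrence of `trigger` in the event log is
    followed by `response` within `within` subsequent events.
    Returns the index of the first violating trigger, or None."""
    for i, event in enumerate(log):
        if event == trigger:
            window = log[i + 1 : i + 1 + within]
            if response not in window:
                return i
    return None
```

HMM-based RM replaces the exact event comparison with a probabilistic one, so the same pattern can be checked even when part of the state is hidden and must be inferred.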