Service consumer satisfaction is today considered one of the main concerns to be addressed by service providers, especially with the spread of competition and the increase in functionally equivalent services. This satisfaction is closely related to the quality of service (QoS) perceived by service consumers. In this context, we propose an approach to determine the satisfaction degree corresponding to the QoS of service-based applications, with regard to service consumers’ QoS expectations. Our approach is based on a preference model built solely from information provided by the service consumer. This preference model relies on the 2-additive Choquet operator, which takes preferential dependencies into account. In this paper, we target both design-time and runtime aggregation of the QoS of service-based applications.
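As a concrete illustration of the aggregation step, the 2-additive Choquet integral can be written in terms of Shapley importance values and pairwise interaction indices. The sketch below is a minimal Python rendering of that standard formula; the scores, weights, and interaction values are illustrative, not taken from the paper's preference model.

```python
def choquet_2additive(x, shapley, interaction):
    """2-additive Choquet aggregation.

    x: criterion scores in [0, 1]
    shapley: Shapley importance values (summing to 1)
    interaction: dict {(i, j): I_ij} with i < j, I_ij in [-1, 1]
    """
    n = len(x)
    total = 0.0
    # linear part: each criterion weighted by its Shapley value,
    # discounted by half the total interaction it is involved in
    for i in range(n):
        penalty = sum(abs(interaction.get((min(i, j), max(i, j)), 0.0))
                      for j in range(n) if j != i)
        total += x[i] * (shapley[i] - 0.5 * penalty)
    # interaction part: min for complementary criteria (I > 0),
    # max for redundant criteria (I < 0)
    for (i, j), I in interaction.items():
        if I > 0:
            total += min(x[i], x[j]) * I
        else:
            total += max(x[i], x[j]) * (-I)
    return total
```

With an empty interaction dictionary the operator reduces to a weighted arithmetic mean; a positive interaction between two criteria rewards their simultaneous satisfaction through the min term.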
One has a large workload that is “divisible”—its constituent work’s granularity can be adjusted arbitrarily—and one has access to p remote worker computers that can assist in computing the workload. How can one best utilize the workers? Complicating this question is the fact that each worker is subject to interruptions (of known likelihood) that kill all work in progress on it. One wishes to orchestrate sharing the workload with the workers in a way that maximizes the expected amount of work completed. Strategies are presented for achieving this goal, by balancing the desire to checkpoint often—thereby decreasing the amount of vulnerable work at any point—vs. the desire to avoid the context-switching required to checkpoint. Schedules must also temper the desire to replicate work, because such replication diminishes the effective remote workforce. The current study demonstrates the accessibility of strategies that provably maximize the expected amount of work when there is only one worker (the case p=1) and, at least in an asymptotic sense, when there are two workers (the case p=2); but the study strongly suggests the intractability of exact maximization for p≥2 computers, as work replication on multiple workers joins checkpointing as a vehicle for decreasing the impact of work-killing interruptions. We respond to that challenge by developing efficient heuristics that employ both checkpointing and work replication as mechanisms for decreasing the impact of work-killing interruptions. The quality of these heuristics, in expected amount of work completed, is assessed through exhaustive simulations that use both idealized models and actual trace data.
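To make the checkpointing trade-off concrete, the following toy Monte Carlo sketch estimates the expected completed work for a single worker (the case p=1) when a unit workload is split into k checkpointed chunks. The uniform interruption model and the fixed per-chunk checkpoint cost are simplifying assumptions for illustration, not the paper's risk model or its provably optimal schedules.

```python
import random

def expected_completed_work(k, checkpoint_cost, trials=100_000, seed=0):
    """Monte Carlo estimate of expected completed work for one worker.

    A unit workload is split into k equal chunks; a checkpoint of
    fixed cost is taken after each chunk. The interruption time is
    drawn uniformly on [0, 1 + k * checkpoint_cost] (a toy model).
    Work in a chunk that has not reached its checkpoint is lost.
    """
    rng = random.Random(seed)
    chunk_time = 1.0 / k + checkpoint_cost   # compute + checkpoint
    horizon = 1.0 + k * checkpoint_cost      # total schedule length
    total = 0.0
    for _ in range(trials):
        t = rng.uniform(0.0, horizon)        # interruption instant
        finished = min(int(t // chunk_time), k)
        total += finished / k                # only checkpointed work counts
    return total / trials
```

Sweeping k under this model exhibits the tension in the abstract: more chunks shrink the vulnerable window before each checkpoint, but each chunk pays the checkpoint overhead, so the expected completed work peaks at an intermediate k.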
Sustainable water management is a global challenge for the 21st century. One key aspect remains protection against urban flooding. The main objective is to ensure or maintain an adequate level of service for all inhabitants. However, the level of service is still difficult to assess and the high-risk locations difficult to identify. In this article, we propose a methodology that (i) allows water managers to measure the service provided by the urban drainage system with regard to protection against urban flooding; and (ii) helps stakeholders to determine effective strategies for improving the service provided. One key aspect of this work is the use of a database of sewer flood event records to assess flood risk. Our methodology helps urban water managers to assess the risk of sewer flooding; this approach does not seek to predict flooding but rather to inform decision makers on the current level of risk and on the actions that need to be taken to reduce it. This work is based on a comprehensive definition of risk, including territorial vulnerability and the perceptions of urban water stakeholders. This paper presents the results and methodological contributions from implementing the methodology on two case studies: the cities of Lyon and Mulhouse.
Carboxymethylcellulose (CMC) and beta-cyclodextrin (beta-CD)-based polymers functionalized with two types of quaternary ammonium compounds (QACs), the alkaquat DMB-451 (N-alkyl (50% C14, 40% C12, 10% C10) dimethylbenzylammonium chloride), named polymer DMB-451, and FMB 1210-8 (a blend of 32 w% N-alkyl (50% C14, 40% C12, 10% C10) dimethylbenzylammonium chloride and 48 w% didecyldimethylammonium chloride), named polymer FMB 1210-8, were synthesized and characterized by Fourier transform infrared spectroscopy. The antimicrobial activities of these polymers against Escherichia coli were also evaluated at 25 °C in wastewater. The results indicate that the polymer FMB 1210-8 possesses high-affinity binding with bacterial cells that induces a rapid disinfection process. Moreover, under the same experimental disinfection conditions (a mixture of 1.0 g of polymer and 100 mL of wastewater), the polymer FMB 1210-8 has a higher antimicrobial efficiency (99.90%) than polymer DMB-451 (92.8%). This phenomenon might be associated with a stronger interaction with bacterial cells, due to a stronger binding affinity for E. coli cells and the greater killing efficiency of the C10 alkyl chain QACs of polymer FMB 1210-8 in disrupting the bacterial cell membrane, as compared to N-alkyl (50% C14, 40% C12, 10% C10) dimethylbenzylammonium chloride. Together, these results suggest that the polymer FMB 1210-8 could constitute a good disinfectant against Escherichia coli and could be advantageously used in wastewater treatment, due to the low toxicity of beta-CD and CMC and the moderate toxicity of FMB 1210-8 to humans and the environment.
In this paper, we present a method for binary image comparison. For binary images, intensity information is poor and shape extraction is often difficult. Therefore, binary images have to be compared without using feature extraction. Because different scene patterns can be present in the images, we propose a modified Hausdorff distance (HD) measured locally in an adaptive way. The resulting set of measures is richer than a single global measure. The local HD measures yield a local-dissimilarity map (LDMap) that captures the spatial layout of the dissimilarity. A classification of the images according to their similarity is carried out on the LDMaps using a support vector machine. The proposed method is tested on a medieval illustration database and compared with other methods to demonstrate its effectiveness.
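A minimal sketch of the local-dissimilarity idea: combining the two images' distance transforms yields a per-pixel map that is zero wherever the images agree and, elsewhere, grows with the distance to the nearest matching pixel in the other image. The pure-Python implementation below uses a 4-connected BFS distance transform and one common LDMap formula; it is an illustrative reconstruction, not the paper's exact adaptive measure, and it assumes both images contain at least one foreground pixel.

```python
from collections import deque

def distance_map(img):
    """4-connected BFS distance transform: distance of each pixel
    to the nearest foreground (1) pixel of a binary image."""
    h, w = len(img), len(img[0])
    INF = float("inf")
    dist = [[INF] * w for _ in range(h)]
    q = deque()
    for y in range(h):
        for x in range(w):
            if img[y][x]:
                dist[y][x] = 0
                q.append((y, x))
    while q:
        y, x = q.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and dist[ny][nx] == INF:
                dist[ny][nx] = dist[y][x] + 1
                q.append((ny, nx))
    return dist

def local_dissimilarity_map(a, b):
    """LDMap(p) = |1_A(p) - 1_B(p)| * max(dist_A(p), dist_B(p)):
    zero where the images agree, otherwise the distance from the
    disagreeing pixel to the nearest foreground pixel of the other
    image."""
    da, db = distance_map(a), distance_map(b)
    h, w = len(a), len(a[0])
    return [[abs(a[y][x] - b[y][x]) * max(da[y][x], db[y][x])
             for x in range(w)] for y in range(h)]
```

Feeding the flattened LDMap values to a classifier (an SVM in the paper) then uses the spatial layout of the dissimilarity rather than a single global distance.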
eb3 is a trace-based formal language created for the specification of information systems. In eb3, each entity and association attribute is independently defined by a recursive function on the valid traces of external events. This paper describes an algorithm that generates, for each external event, a transaction that updates the value of affected attributes in their relational database representation. The benefits are twofold: eb3 attribute specifications are automatically translated into executable programs, eliminating system design and implementation steps; the construction of information systems is streamlined, because eb3 specifications are simpler and shorter to write than corresponding traditional specifications, design and implementations. In particular, the paper shows that simple eb3 constructs can replace complex SQL queries which are typically difficult to write.
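To illustrate the flavor of the generated code, the hypothetical sketch below maps external events of a small library-style example to the SQL transactions that update the affected attribute columns. The event names, table, and columns are invented for illustration; they are not the paper's actual generated output.

```python
def transaction_for_event(event, args):
    """Toy generator: map an external event to the SQL transaction
    that updates the affected attribute columns. The schema (a
    library example with a 'borrower' attribute on 'book') is
    hypothetical."""
    if event == "Lend":
        member_id, book_id = args
        return [
            "BEGIN;",
            f"UPDATE book SET borrower = '{member_id}' "
            f"WHERE bookId = '{book_id}';",
            "COMMIT;",
        ]
    if event == "Return":
        (book_id,) = args
        return [
            "BEGIN;",
            f"UPDATE book SET borrower = NULL "
            f"WHERE bookId = '{book_id}';",
            "COMMIT;",
        ]
    raise ValueError(f"unknown event: {event}")
```

The point of the approach is that the specifier only writes the recursive attribute definitions over traces; per-event transactions of this shape are derived mechanically.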
An original inversion method, specifically adapted to the estimation of the Poisson coefficient of balls from their resonance spectra, is described. From the study of their elastic vibrations, it is possible to characterize the balls accurately. The proposed methodology can both create spheroidal modes in the balls and detect such vibrations over a large frequency range. Experimentally, by using an ultrasonic probe for emission (a piezoelectric transducer) and a heterodyne optic probe for reception (an interferometer), it was possible to take spectroscopic measurements of spheroidal vibrations over a large frequency range (100 kHz to 45 MHz) in a continuous regime. This method, which uses ratios between wave resonance frequencies, allows the Poisson coefficient to be determined independently of Young's modulus and of the ball's radius and density. This has the advantage of providing highly accurate estimations of the Poisson coefficient (±4.3 × 10⁻⁴) over a wide frequency range.
The joint estimation of the location vector and the shape matrix of a set of independent and identically Complex Elliptically Symmetric (CES) distributed observations is investigated from both the theoretical and computational viewpoints. This joint estimation problem is framed in the original context of semiparametric models, allowing us to handle the (generally unknown) density generator as an infinite-dimensional nuisance parameter. In the first part of the paper, a computationally efficient and memory-saving implementation of the robust and semiparametric efficient R-estimator for shape matrices is derived. Building upon this result, in the second part, a joint estimator, relying on Tyler's M-estimator of location and on the R-estimator of shape matrix, is proposed, and its Mean Squared Error (MSE) performance is compared with the Semiparametric Cramér-Rao Bound (SCRB).
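As an illustration of one building block, Tyler's fixed-point M-estimator of the shape matrix can be sketched as follows for real-valued, centered data. This is a simplification: the paper's setting is complex-valued and estimates the location jointly, whereas the sketch assumes the location is known and already subtracted.

```python
import numpy as np

def tyler_shape(X, tol=1e-8, max_iter=200):
    """Tyler's fixed-point M-estimator of the shape matrix for
    centered data X (n samples x p features). The output is
    normalized to trace p, so only the 'shape' is identified,
    consistent with the scale invariance of elliptical models."""
    n, p = X.shape
    sigma = np.eye(p)
    for _ in range(max_iter):
        inv = np.linalg.inv(sigma)
        # per-sample quadratic forms x_i^T Sigma^{-1} x_i
        w = np.einsum("ij,jk,ik->i", X, inv, X)
        # weighted scatter: (p/n) * sum_i x_i x_i^T / w_i
        new = (p / n) * (X.T * (1.0 / w)) @ X
        new *= p / np.trace(new)  # fix the scale ambiguity
        if np.linalg.norm(new - sigma, "fro") < tol:
            return new
        sigma = new
    return sigma
```

Down-weighting each sample by its own quadratic form is what makes the estimator distribution-free over the elliptical family: the density generator cancels out of the fixed-point equation, which is precisely the nuisance-parameter structure the semiparametric framing exploits.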