This paper evaluates the promises of service reusability through an analysis of services implemented in a manufacturing enterprise. A total of 103 services implemented in the case enterprise are analyzed to understand the enablers and obstacles that have led to a reuse ratio of 13 %. The main identified enabler for reusable services was the capability to define the services as part of reusable business concepts, which is aligned with some of the earlier studies on the adoption of service-oriented architecture in enterprises. The main reason for having overlapping services in the case enterprise was the lagging migration of legacy services to newer, reusable services. The results can be used to develop service engineering methodologies for better reusability, and the paper provides practical guidelines to help direct integration development efforts toward reusable services.
Uncertainty in poker stems from two key sources: the shuffled deck and an adversary whose strategy is unknown. One approach to playing poker is to find a pessimistic game-theoretic solution (i.e., a Nash equilibrium), but human players have idiosyncratic weaknesses that can be exploited if some model or counter-strategy can be learned by observing their play. However, games against humans last for at most a few hundred hands, so learning must be very fast to be useful. We explore two approaches to opponent modelling in the context of Kuhn poker, a small game for which game-theoretic solutions are known. Parameter estimation and expert algorithms are both studied. Experiments demonstrate that, even in this small game, convergence to maximally exploitive solutions in a small number of hands is impractical, but that good (e.g., better than Nash) performance can be achieved in as few as 50 hands. Finally, we show that amongst a set of strategies with equal game-theoretic value, in particular the set of Nash equilibrium strategies, some are preferable because they speed learning of the opponent's strategy by exploring it more effectively.
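The parameter-estimation approach can be made concrete with a minimal sketch (an illustration only, not the paper's implementation: the uniform prior and the single modelled parameter, the opponent's bluffing frequency when holding the Jack, are assumptions):

```python
# Minimal Bayesian parameter estimation for one hypothetical Kuhn-poker
# opponent parameter: the probability that the opponent bets (bluffs)
# when holding the Jack. Illustrative sketch only; the paper models the
# opponent's full strategy space.

class BetaEstimator:
    """Beta(alpha, beta) posterior over a Bernoulli parameter."""

    def __init__(self, alpha=1.0, beta=1.0):  # Beta(1, 1) = uniform prior
        self.alpha = alpha
        self.beta = beta

    def observe(self, did_bet):
        """Update after a showdown revealed the opponent held the Jack."""
        if did_bet:
            self.alpha += 1
        else:
            self.beta += 1

    def mean(self):
        """Posterior-mean estimate of the bluffing frequency."""
        return self.alpha / (self.alpha + self.beta)

est = BetaEstimator()
for did_bet in [True, False, True, True, False]:  # hypothetical observations
    est.observe(did_bet)
# Posterior mean after 3 bets and 2 checks: (1 + 3) / (2 + 5) = 4/7
```

Such an estimator converges only as fast as informative observations arrive, which is why the abstract's point about the scarcity of hands against human opponents matters.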
Ni–Mn–Ga is a magnetic shape memory (MSM) alloy that can strain up to 6 % when a magnetic field is applied to it. By applying a localized magnetic field to the MSM element, the strain can be precisely controlled and manipulated. Using Ni–Mn–Ga and a local magnetic field, an MSM micropump capable of controlling the flow within a microfluidic system has been developed. A computational fluid dynamics analysis illustrates the flow of the liquid at the outlet of the micropump and will be used to optimize future models of the pump. The performance of the MSM micropump, such as its flow rate and pumping pressure, is measured and presented in this study. Beyond its performance, there are also several advantages intrinsic to the MSM micropump. It is controlled by a magnetic field and is therefore contact-free. Depending upon the magnetic field, the MSM micropump can act as either a valve or a reversible pump. It is self-priming and capable of pumping gases as well as viscous liquids, and it has a simple design which consists primarily of the MSM alloy itself. Coupled with its scalability, it is clear that the MSM micropump is a strong candidate for a readily integrable flow control solution.
This introductory paper to the special issue on Data Mining Lessons Learned presents lessons from data mining applications, including experience from science, business, and knowledge management in a collaborative data mining setting.
The ‘Epistemic Uncertainty Workshop’ sponsored by Sandia National Laboratories was held in Albuquerque, New Mexico, on 6–7 August 2002. The workshop was organized around a set of Challenge Problems involving both epistemic and aleatory uncertainty that the workshop participants were invited to solve and discuss. This concluding article in a special issue of Reliability Engineering and System Safety based on the workshop discusses the intent of the Challenge Problems, summarizes some discussions from the workshop, and provides a technical comparison among the papers in this special issue. The Challenge Problems were computationally simple models that were intended as vehicles for the illustration and comparison of conceptual and numerical techniques for use in analyses that involve: (i) epistemic uncertainty, (ii) aggregation of multiple characterizations of epistemic uncertainty, (iii) combination of epistemic and aleatory uncertainty, and (iv) models with repeated parameters. There was considerable diversity of opinion at the workshop about both methods and fundamental issues, and yet substantial consensus about what the answers to the problems were, and even about how each of the four issues should be addressed. Among the technical approaches advanced were probability theory, Dempster–Shafer evidence theory, random sets, sets of probability measures, imprecise coherent probabilities, coherent lower previsions, probability boxes, possibility theory, fuzzy sets, joint distribution tableaux, polynomial chaos expansions, and info-gap models. Although some participants maintained that a purely probabilistic approach is fully capable of accounting for all forms of uncertainty, most agreed that the treatment of epistemic uncertainty introduces important considerations and that the issues underlying the Challenge Problems are legitimate and significant. 
Topics identified as meriting additional research include elicitation of uncertainty representations, aggregation of multiple uncertainty representations, dependence and independence, model uncertainty, solution of black-box problems, efficient sampling strategies for computation, and communication of analysis results.
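One of the simplest numerical treatments in this area, double-loop sampling that carries an epistemic interval through an aleatory Monte Carlo loop, can be sketched as follows (the toy model, parameter values, and function names are assumptions for illustration, not any workshop Challenge Problem):

```python
# Toy "double-loop" propagation of mixed uncertainty through a model
# y = a * x: `a` is epistemically uncertain (known only as an interval
# [a_lo, a_hi]), while `x` is aleatory (sampled from a distribution).
# The result is a pair of bounding sample sets, a crude probability box.

import random

def model(a, x):
    return a * x

def pbox_bounds(a_lo, a_hi, n=10000, seed=0):
    rng = random.Random(seed)
    lo_samples, hi_samples = [], []
    for _ in range(n):
        x = rng.gauss(10.0, 2.0)                # aleatory inner sample
        ys = [model(a_lo, x), model(a_hi, x)]   # epistemic outer bounds
        lo_samples.append(min(ys))
        hi_samples.append(max(ys))
    return sorted(lo_samples), sorted(hi_samples)

lo, hi = pbox_bounds(0.8, 1.2)
# Each quantile of the lower envelope lies at or below the same
# quantile of the upper envelope, so the pair brackets the unknown CDF.
assert all(l <= h for l, h in zip(lo, hi))
```

The gap between the two envelopes reflects the epistemic interval and does not shrink with more aleatory samples, which is one way the distinction between the two kinds of uncertainty shows up numerically.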
A design method is described to realize narrow-band stripline or microstrip bandpass filters having one transmission zero near the upper and one near the lower band edge. The filters utilize capacitively coupled open half-wave TEM-line resonators. Two nonadjacent resonators are also inductively coupled. No short circuits are needed. The design is based on the approximate or exact synthesis of the lowpass prototype presented by Levy. The transmission line parameters are obtained easily from the design formulae, which have been derived using the susceptance (or reactance) slope parameter technique. The filter performance is good up to 12 per cent bandwidth. Measured results show good agreement with the theoretical ones.
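The slope-parameter technique mentioned above can be illustrated with the standard textbook relations for coupled-resonator bandpass design (these are the generic formulas, not the exact design equations derived in the paper; the prototype values in the example are the usual 0.1 dB-ripple Chebyshev table entries):

```python
# Standard narrow-band coupled-resonator relations: inter-resonator
# coupling coefficients and external quality factors computed from the
# lowpass prototype values g0..g(n+1) and the fractional bandwidth FBW.

import math

def couplings(g, fbw):
    """k[i] couples resonators i+1 and i+2; g = [g0, g1, ..., g(n+1)]."""
    n = len(g) - 2
    k = [fbw / math.sqrt(g[i] * g[i + 1]) for i in range(1, n)]
    qe_in = g[0] * g[1] / fbw       # external Q at the input
    qe_out = g[n] * g[n + 1] / fbw  # external Q at the output
    return k, qe_in, qe_out

def slope_parameter(y0):
    """Susceptance slope parameter b = pi * Y0 / 2 of an open-circuited
    half-wave line resonator with characteristic admittance Y0."""
    return math.pi * y0 / 2

# Example: 3rd-order, 0.1 dB-ripple Chebyshev prototype, 5% bandwidth.
g = [1.0, 1.0316, 1.1474, 1.0316, 1.0]
k, qe1, qe2 = couplings(g, 0.05)
```

In a slope-parameter design, the coupling susceptances are then chosen so that each resonator presents the required b at the passband center; the paper's contribution lies in doing this for the specific zero-producing topology.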
Virtual reality (VR) is considered one of the technological megatrends of the 2020s, and today, VR systems are used in various settings, digital gaming being among the most popular. However, there has been a dearth of understanding regarding the central factors behind VR gaming acceptance and use. The present study therefore aimed to explain the factors that drive the use and acceptance of VR games. We extended the hedonic-motivation system acceptance model with utilitarian and inconvenience factors to capture the pertinent features of VR systems more holistically. We proposed a theoretical model and analyzed it through covariance-based structural equation modeling using an online survey sample of 473 VR gamers. Our findings help explain the role of different antecedents behind VR gaming acceptance and demonstrate that VR gaming is driven more by the hedonic gaming aspects than by the utilitarian health and well-being aspects of VR games, enjoyment being the strongest driver behind VR gaming intention and immersion. Moreover, the findings suggested that use intentions and immersion levels are not significantly diminished by physical discomfort and VR sickness. The findings, which potentially extend to other VR systems as well, also pose important implications for the providers of VR games. As the main contribution, based on our empirical findings, we provide a greater theoretical understanding of VR gaming acceptance and use.
Use continuance is crucial in terms of information systems (IS) success. Previous research has shown that situational context can be central for IS use continuance but has paid limited attention to its specific characteristics. Furthermore, the link between situational context and use continuance has remained unexplored in the novel area of "exergames," which are defined as digital games requiring physical effort from the player that determines the outcome of the game. Studying exergames is deemed important due to their potential in providing health benefits for users, revenues for providers, and well-being for societies. However, this potential remains largely unrealized because users tend to discontinue usage after their initial experiences. To address these gaps, we investigate the relationships between specific situational characteristics and use continuance after critical exergaming incidents, in which the user has an exceptionally positive or negative experience. To do this, we quantitatively and qualitatively examined 461 actual critical exergaming incidents. Our findings provide a greater understanding of IS use continuance by revealing new knowledge about the relationships between specific situational characteristics (i.e., purpose of use, type of gaming platform, social setting, place, and exertion level) and use continuance. We also offer explanations for these relationships, thus providing a first understanding of the previously unmapped area of how users behave situation-dependently after critical exergaming incidents. Thus, we contribute both to the general IS continuance literature and to the specific area of exergaming. The context specificity of our study matches the calls for heavily contextualised IS research.
We introduce the sticker systems, a computability model that abstracts the computations using Watson-Crick complementarity as in Adleman's DNA computing experiment [1]. Several types of sticker systems are shown to characterize (modulo a weak coding) the regular languages, hence the power of finite automata. One variant is proven to be equivalent to Turing machines. Another is found to have a strictly intermediate power.
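The complementarity operation underlying sticker systems can be sketched in a few lines (a toy illustration of Watson-Crick annealing, not the formal sticker-system definition):

```python
# Toy Watson-Crick annealing check, the core operation abstracted by
# sticker systems: a single-stranded "sticker" may attach to a strand
# at positions where every base pair is complementary.

COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def anneals(strand, sticker, pos):
    """True if `sticker` is Watson-Crick complementary to `strand`
    over the window starting at offset `pos`."""
    if pos < 0 or pos + len(sticker) > len(strand):
        return False
    window = strand[pos:pos + len(sticker)]
    return all(COMPLEMENT[b] == s for b, s in zip(window, sticker))

assert anneals("ACGTACGT", "TGCA", 0)      # TGCA pairs with ACGT
assert not anneals("ACGTACGT", "TGCA", 1)  # CGTA does not pair with TGCA
```

A sticker system builds complete double strands by repeatedly applying such attachments, and the language characterizations in the abstract classify what those derivations can compute.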
Received: 10 October 1996 / 16 April 1997
In the present work, pyrolysis of pure pine wood and of softwood carbohydrates, namely cellulose and galactoglucomannan (GGM, the major hemicellulose in coniferous wood), was conducted in a batch-mode fluidized bed reactor. Temperature ramping (5 °C/min) was applied until a reactor temperature of 460 °C was reached. Thereafter the temperature was held until the release of non-condensable gases stopped. The different raw materials gave significantly different bio-oils. Levoglucosan was the dominant product in the cellulose pyrolysis oil. Acetic acid was found in the highest concentrations in both the galactoglucomannan and the pine wood pyrolysis oils. Acetic acid is most likely formed by removal of O-acetyl groups from mannose units present in the GGM structure.