Similar Literature
20 similar documents found (search time: 31 ms)
1.
In an experimental design, we tested whether written warnings can reduce the amount of identity information exposed online. We launched a psychological attack on information privacy that had been shown to be effective in previous research; the attack exploits the fact that people respond to certain types of requests in a relatively automatic, or mindless, fashion. The experiment manipulated the word used in the alert header: “warning”, “caution”, or “hazard”. All three headers proved effective in reducing disclosure, but “hazard” was the most effective. Warnings were also more effective in reducing disclosure of driver's license numbers than of email addresses. The discussion (a) offers tentative explanations of why these patterns were obtained, (b) suggests how to design warnings in cyber-environments, and (c) addresses future possibilities for research on this topic.

2.
Standard ML of New Jersey (SML-NJ) uses “weak type variables” to restrict the polymorphic use of functions that may allocate reference cells, manipulate continuations, or use exceptions. However, the type system used in the SML-NJ compiler has not previously been presented in any form other than source code, nor has it been proved correct. We present a set of typing rules, based on an analysis of the concepts underlying “weak polymorphism”, that appears to subsume the implemented algorithm and uses type variables of only a slightly more general nature than the compiler's. One insight from the analysis is that allowing a variable to occur both “ordinarily” and “weakly” in a type permits a simpler and more flexible formulation of the typing rules. In particular, we are able to treat applications of polymorphic functions to imperative arguments with greater flexibility than SML-NJ. The soundness of the type system is proved for imperative code using operational semantics, by showing that evaluation preserves typability. By incorporating assumptions about memory addresses into the type system, we avoid proofs by co-induction.
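For orientation, weak type variables are commonly presented with a numeric “strength” attached to the quantified variable. In that common notation (ours, not necessarily the paper's formal system), the type SML-NJ assigns to ref can be sketched as

$$\mathtt{ref} : \forall\,\alpha^{1}.\ \alpha^{1} \rightarrow \alpha^{1}\ \mathtt{ref}$$

where the superscript 1 indicates, roughly, that at most one more application may occur before a reference cell containing a value of type α is allocated, after which α must be instantiated to a monomorphic type rather than generalized.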

3.
Previous studies have sought insights into how websites can effectively draw sustained attention from internet users. Do different types of information presentation on webpages influence users' perceptions of the information differently? More precisely, can combining an ever greater number of advertising elements on individual websites increase consumers' purchase intentions? The aim of this study is to explore how web advertising's verbal and visual elements stimulate surfers' cognitive processes, and to provide valuable information for successfully matching advertising elements to one another. We examine optimal website design according to personality-trait theory and resource-matching theory. Study 1 addresses the effects that combinations of various types of online advertising can have on web design factors; to this end, we use a 2 (visual complexity: 3D advertising with an avatar vs. 2D advertising) × 2 (verbal complexity: with or without self-referencing, an advertising practice that expresses product claims in words) factorial design. Study 2 treats personality traits (i.e., need for cognition and sensation seeking) as moderating variables to build the optimal portfolio for the “online-advertising effects” hypothesis. Our results suggest that subjects prefer medium-complexity advertising comprising “3D advertising elements with an avatar” or “2D advertising elements with self-referencing”: high-sensation seekers and low-need-for-cognition viewers prefer the former, whereas low-sensation seekers and high-need-for-cognition viewers prefer the latter.

4.
In this paper, we develop an EPQ (economic production quantity) inventory model to determine the optimal buffer inventory for stochastic market demand during preventive maintenance or repair of a manufacturing facility in an imperfect production system. Preventive maintenance, an essential element of the just-in-time structure, may cause shortages, which are reduced by buffer inventory. The products are sold with a free minimal repair warranty (FRW) policy. The production system may shift from an “in-control” state to an “out-of-control” state after a random time that follows a given probability density function. Defective (non-conforming) items produced in either state are reworked, at a cost, immediately after the regular production time. Finally, an expected cost function comprising the inventory cost, unit production cost, preventive maintenance cost, and shortage cost is minimized analytically. We also develop a case in which both the buffer inventory and the production rate are decision variables and the expected unit cost under the above cost components is optimized. Numerical examples are provided to illustrate the behaviour and application of the model, and a sensitivity analysis with respect to key parameters of the system is carried out.
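To make the buffer-stock trade-off concrete, the sketch below balances holding cost against expected shortage cost for normally distributed demand during the maintenance window, using the classical newsvendor critical ratio. The demand distribution, cost values, and closed-form solution are illustrative assumptions, not the paper's exact expected-cost function.

```python
# Minimal newsvendor-style sketch: choose buffer B to minimize
#   h * E[(B - D)+] + p * E[(D - B)+]   for D ~ N(mu, sigma).
# All constants are invented for illustration.
from scipy.stats import norm

mu, sigma = 500.0, 80.0   # assumed demand during the maintenance window
h, p = 2.0, 15.0          # assumed unit holding and shortage costs

cr = p / (p + h)          # critical ratio: F(B*) = p / (p + h)
z = norm.ppf(cr)
buffer_qty = mu + sigma * z

# Standard normal loss function gives the expected shortage at the optimum
shortage = sigma * (norm.pdf(z) - z * (1 - cr))    # E[(D - B)+]
leftover = buffer_qty - mu + shortage              # E[(B - D)+]
expected_cost = h * leftover + p * shortage
print(f"optimal buffer ~ {buffer_qty:.1f} units, expected cost ~ {expected_cost:.1f}")
```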

5.
The construction of a new generation of MEMS whose microfabrication process includes micro-assembly steps is a major challenge. New production means, called micromanufacturing systems, must be developed to perform these assembly steps. The classical “top-down” approach, which consists of a functional analysis and a definition of task sequences, is insufficient for micromanufacturing systems: the technical and physical constraints of the microworld (e.g., the adhesion phenomenon) must be taken into account in order to design reliable systems. A new method for designing micromanufacturing systems is presented in this paper. Our approach combines the general “top-down” approach with a “bottom-up” approach that takes these technical constraints into account. The method enables a modular architecture for micromanufacturing systems to be built; to obtain this architecture, we have devised an original technique for identifying modules and another for associating them. This work has been used to design the controller of an experimental robotic micro-assembly station.

6.
Evolving choice structures is difficult but essential for Genetic Programming (GP). Traditional approaches usually sidestep this issue: they define “if-structure” functions tailored to their problems by combining “if-else” statements, conditional criteria, and elemental functions. These if-structure functions obviously depend on the specific problem and thus have very low reusability. To address this limitation of GP, in this paper we propose a termination criterion for the GP process named the “Combination Termination Criterion” (CTC). By testing CTC, choice structures composed of basic, problem-independent functions can be evolved successfully. Theoretical analysis and experimental results show that our method can evolve programs with choice structures effectively, within an acceptable additional time.
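The sketch below illustrates the kind of problem-specific “if-structure” primitive the paper argues against: the comparison and the branching are fused into one function supplied to GP, so the choice logic cannot be reused across problems. The tiny expression-tree evaluator and all names are invented for illustration.

```python
# Toy GP setting with a fused "if-structure" primitive. Note that this
# eager evaluator computes both branches before choosing, a simplification.
def if_gt(a, b, then_val, else_val):
    """Problem-specific primitive: fuses comparison and branching."""
    return then_val if a > b else else_val

# A tree is (function_name, children...) or a terminal (variable/constant)
PRIMITIVES = {"if_gt": if_gt, "add": lambda a, b: a + b, "mul": lambda a, b: a * b}

def evaluate(tree, env):
    if isinstance(tree, str):              # terminal: variable lookup
        return env[tree]
    if isinstance(tree, (int, float)):     # terminal: constant
        return tree
    fn, *children = tree
    return PRIMITIVES[fn](*(evaluate(c, env) for c in children))

# Evolved individual: if x > y then x*y else x+y
individual = ("if_gt", "x", "y", ("mul", "x", "y"), ("add", "x", "y"))
print(evaluate(individual, {"x": 3, "y": 2}))   # -> 6
```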

7.
Data derived from gene expression microarrays are often used for purposes of classification and discovery. Many methods have been proposed for accomplishing these and related aims; however, the statistical properties of such methods are generally not well established. It is therefore desirable to develop realistic mathematical and statistical models that can be used in a simulation context, so that the impact of data analysis methods and testing approaches can be established. We develop a method in which variation among arrays can be characterized simultaneously for a large number of genes, resulting in a multivariate model of gene expression. The method is based on selecting mathematical transformations of the underlying expression measures such that the transformed variables approximately follow a Gaussian distribution, and then estimating the associated parameters, including correlations. The result is a multivariate normal distribution that models transformed gene expression values within a subject population while accounting for covariances among genes and/or probes. This model is then used to simulate microarray expression and probe intensity data, employing a modified Cholesky matrix factorization technique that addresses the singularity problem of the “small n, big p” situation. An example is given using prostate cancer data and, as an illustration, it is shown how data normalization can be investigated using this approach.
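A hedged sketch of this simulation recipe: log-transform expression values toward normality, estimate a gene-gene covariance, repair its singularity (n subjects is much smaller than p genes), and sample synthetic arrays from the fitted multivariate normal. The ridge repair below stands in for the paper's modified Cholesky factorization; the data and constants are made up.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 20, 200                               # "small n, big p"
raw = rng.lognormal(mean=6.0, sigma=0.5, size=(n, p))

z = np.log(raw)                              # transformation toward Gaussian
mu, S = z.mean(axis=0), np.cov(z, rowvar=False)

# S is rank-deficient (rank <= n-1 < p); a small ridge makes it positive
# definite so the Cholesky factor exists.
L = np.linalg.cholesky(S + 1e-6 * np.eye(p))

def simulate_arrays(n_sim):
    """Draw synthetic (log-scale) arrays preserving gene-gene covariance."""
    return mu + rng.standard_normal((n_sim, p)) @ L.T

synthetic = np.exp(simulate_arrays(1000))    # back to the expression scale
print(synthetic.shape)                       # (1000, 200)
```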

8.
Model predictive control (MPC) is a popular controller design technique in the process industry. Recently, MPC has been extended to a class of discrete event systems that can be described by a model that is “linear” in the max-plus algebra. In this context, both the perturbation-free case and the case with noise and/or modeling errors in a bounded or stochastic setting have been considered. In each of these cases an optimization problem has to be solved on-line at each event step in order to determine the MPC input. This paper considers a method, based on variability expansion, to reduce the computational complexity of this optimization problem. In particular, it is shown that the computational load is reduced if one decreases the level of “randomness” in the system.
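A minimal sketch of the system class involved: “linear in the max-plus algebra” means addition is replaced by max (⊕) and multiplication by + (⊗), so a recursion of the form x(k+1) = A ⊗ x(k) ⊕ B ⊗ u(k) propagates event times. The matrices and inputs below are illustrative; the on-line MPC optimization and the variability-expansion reduction are not reproduced here.

```python
import numpy as np

EPS = -np.inf                                 # max-plus "zero" element

def mp_matvec(A, x):
    """Max-plus matrix-vector product: result_i = max_j (A_ij + x_j)."""
    return np.max(A + x[None, :], axis=1)

A = np.array([[2.0, EPS], [3.0, 1.0]])        # assumed processing/transport times
B = np.array([[0.0], [EPS]])
x = np.array([0.0, 0.0])                      # event times at step k = 0

for k, u in enumerate([1.0, 4.0, 7.0]):       # one candidate input sequence
    # x(k+1) = A (tensor) x(k)  max-plus-plus  B (tensor) u(k)
    x = np.maximum(mp_matvec(A, x), mp_matvec(B, np.array([u])))
    print(f"event step {k + 1}: x = {x}")
```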

9.
Using Wang–Landau sampling with suitable Monte Carlo trial moves (pull moves combined with bond-rebridging moves), we have determined the density of states and thermodynamic properties for a short sequence of the HP protein model. As free chains, these proteins are known to first undergo a collapse “transition” to a globule state, followed by a second “transition” into the native state. When placed in the proximity of an attractive surface, competition between surface adsorption and folding leads to an intriguing sequence of “transitions”. These transitions depend on the relative interaction strengths and are largely inaccessible to “standard” Monte Carlo methods.
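To make the density-of-states estimation concrete, here is a bare-bones Wang-Landau iteration on a toy system. The HP-model trial moves (pull moves, bond rebridging) are replaced by a uniform proposal over ten energy levels, each holding exactly one state; all constants are invented.

```python
import math
import random

random.seed(1)
LEVELS = list(range(10))                 # toy energy levels
log_g = {E: 0.0 for E in LEVELS}         # running estimate of ln g(E)
hist = {E: 0 for E in LEVELS}
ln_f = 1.0                               # modification factor ln(f)
E = 0

while ln_f > 1e-4:
    E_new = random.choice(LEVELS)        # stand-in for a real trial move
    # Accept with min(1, g(E)/g(E_new)): rarely visited energies are favored
    if math.log(random.random()) < log_g[E] - log_g[E_new]:
        E = E_new
    log_g[E] += ln_f
    hist[E] += 1
    if min(hist.values()) > 0.8 * (sum(hist.values()) / len(hist)):
        ln_f /= 2                        # histogram is "flat": refine ln(f)
        hist = {k: 0 for k in hist}

# Each toy level has one state, so the relative ln g(E) should be near zero.
print({E: round(v - log_g[0], 2) for E, v in log_g.items()})
```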

10.
Finding the product of two polynomials is an essential and basic problem in computer algebra. While most previous results have focused on worst-case complexity, we instead employ the technique of adaptive analysis to obtain improvements in many “easy” cases. We present two adaptive measures and corresponding methods for polynomial multiplication, and also show how to combine them effectively to gain both advantages. One useful feature of these algorithms is that they essentially provide a gradient between existing “sparse” and “dense” methods. We prove that these approaches yield significant improvements in many cases while remaining comparable to the fastest existing algorithms in the worst case.
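A small sketch of the sparse/dense gradient idea: represent a polynomial as an exponent-to-coefficient dict, multiply term by term when the supports are small, and fall back to a dense schoolbook product otherwise. The crude crossover rule is an invented stand-in for the paper's adaptive measures.

```python
def mul_sparse(f, g):
    """f, g: {exponent: coeff} dicts. O(#f * #g) term-by-term product."""
    h = {}
    for ef, cf in f.items():
        for eg, cg in g.items():
            h[ef + eg] = h.get(ef + eg, 0) + cf * cg
    return h

def mul_dense(f, g):
    """Classical schoolbook product over dense coefficient arrays."""
    df, dg = max(f), max(g)
    a = [f.get(i, 0) for i in range(df + 1)]
    b = [g.get(i, 0) for i in range(dg + 1)]
    c = [0] * (df + dg + 1)
    for i, ai in enumerate(a):
        if ai:
            for j, bj in enumerate(b):
                c[i + j] += ai * bj
    return {e: v for e, v in enumerate(c) if v}

def mul_adaptive(f, g):
    # Crude density test: go dense once the term-pair count beats the degree
    if len(f) * len(g) > max(f) + max(g) + 1:
        return mul_dense(f, g)
    return mul_sparse(f, g)

print(mul_adaptive({0: 1, 1000: 2}, {5: 3}))   # sparse path: {5: 3, 1005: 6}
```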

11.
For FOE (focus of expansion) estimation, there are basically three kinds of methods in the literature: algebraic, geometric, and maximum-likelihood-based. In this paper, our attention is focused on the geometric method. The computational complexity of the classical geometric method is usually very high because it must solve a non-linear minimization problem in many variables. In this work, that problem is converted into an equivalent one in only two variables, and a simplified geometric method is proposed accordingly. Based on the equivalence of the classical geometric method and the proposed simplified one, we show that geometric methods can “correct” the measurement errors in at most one of the two images; in other words, it is impossible to correct the measurement errors in both images. In addition, we show that the “corrected” corresponding pairs produced by geometric methods cannot, in general, satisfy some of the inherent constraints on corresponding pairs under pure camera translation. Hence, it is not proper to treat the “corrected” corresponding pairs as “faithful” corresponding pairs in geometric methods, and the FOE estimated from such pairs is not necessarily more trustworthy. Finally, a new geometric algorithm that automatically enforces the inherent constraints is proposed in this work, yielding a better FOE estimate and more faithful corresponding pairs.
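For orientation, the classical geometric formulation that the paper simplifies can be written as follows (a standard presentation in our notation, assuming homogeneous image coordinates; the paper's exact formulation may differ). Under pure translation the fundamental matrix reduces to the skew-symmetric form $[\mathbf{e}]_{\times}$ built from the FOE $\mathbf{e}$, and one solves

$$\min_{\mathbf{e},\,\{\hat{\mathbf{x}}_i,\,\hat{\mathbf{x}}'_i\}}\ \sum_i \left(\lVert \mathbf{x}_i-\hat{\mathbf{x}}_i\rVert^2+\lVert \mathbf{x}'_i-\hat{\mathbf{x}}'_i\rVert^2\right)\quad\text{subject to}\quad \hat{\mathbf{x}}'^{\top}_i[\mathbf{e}]_{\times}\hat{\mathbf{x}}_i=0\ \ \forall i,$$

i.e., the measured points are perturbed as little as possible so that each corrected pair becomes collinear with the FOE in the image.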

12.
Practical training is what brings imagination and creativity to fruition, and it relies heavily on the relevant technical skills. This study therefore emphasizes strengthening the learning of technical skills with emerging technologies, while simultaneously studying the effects of employing those technologies. For the participating students, technical skills were cultivated along the five dimensions of knowledge, comprehension, simulation, application, and creativity, in accordance with the set teaching objectives and the taxonomy of student learning outcomes; a virtual reality learning environment (VRLE) was developed to meet the different goals as the various technical skills were examined. For the technical-skill dimensions of nature of technology, operation of machines, selection of process parameters, and process planning, the VRLE provides six modules (“learning resource”, “digital content”, “collaborative learning”, “formative evaluation”, “simulation of manufacturing process”, and “practical exercise”) designed to help students develop their technical skills in a specific, gradual manner. After one semester of skill development, students reported finding the VRLE a significantly effective method for the three dimensions of “operation of machines”, “selection of process parameters”, and “process planning”, though less so for “nature of technology”. Among the six modules, “simulation of manufacturing process” and “practical exercise” were the two most preferred by students for the three dimensions considered.

13.
This research builds on prior work on developing near-optimal solutions to product line design problems within the conjoint analysis framework. We investigate and compare different genetic algorithm operators; in particular, we systematically examine the impact of alternative population maintenance strategies and mutation techniques within our problem context. Two alternative population maintenance methods, which we term the “Emigration” and “Malthusian” strategies, govern how individual product lines in one generation are carried over to the next. We also allow for two types of reproduction methods: “Equal Opportunity”, in which the parents to be paired for mating are selected with equal probability, and “Queen bee”, in which the best string in the current generation is always chosen as one parent while the other is selected at random from the set of parent strings. We also examine the impact of integrating the artificial intelligence approach with traditional optimization by seeding the GA with solutions obtained from a dynamic programming heuristic proposed by others. A detailed statistical analysis, by means of a Monte Carlo simulation study, is carried out to determine the impact of various problem and technique aspects on multiple measures of performance. Our results indicate that the proposed procedures provide multiple “good” solutions, which gives decision makers more flexibility, as they can now select from a number of very good product lines. The results obtained using our approaches are encouraging, with statistically significant improvements averaging 5% or more over the traditional benchmark of the heuristic dynamic programming technique.
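The two reproduction schemes are easy to see in a toy sketch: “Equal Opportunity” draws both parents uniformly, while “Queen bee” always mates the current best string with a random partner. The bit-string encoding and fitness are placeholders, not the conjoint product-line objective, and mutation is omitted for brevity.

```python
import random

random.seed(42)
L = 16
fitness = lambda s: sum(s)                       # stand-in objective

def crossover(p1, p2):
    cut = random.randrange(1, L)                 # one-point crossover
    return p1[:cut] + p2[cut:]

def next_generation(pop, scheme):
    pop = sorted(pop, key=fitness, reverse=True)
    children = []
    for _ in range(len(pop)):
        if scheme == "queen_bee":
            p1 = pop[0]                          # best string is always a parent
            p2 = random.choice(pop)
        else:                                    # "equal_opportunity"
            p1, p2 = random.sample(pop, 2)
        children.append(crossover(p1, p2))
    return children

pop = [[random.randint(0, 1) for _ in range(L)] for _ in range(30)]
for _ in range(20):
    pop = next_generation(pop, "queen_bee")
print(max(map(fitness, pop)))
```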

14.
In this paper, two significant weaknesses of locally linear embedding (LLE) as applied to computer vision are addressed: “intrinsic dimension” and “eigenvector meanings”. Based on a mathematical analysis of topological manifolds and LLE, we introduce “topological embedding” and “multi-resolution nonlinearity capture”. The manifold topological analysis (MTA) method, based on “topological embedding”, is a more robust way to determine the “intrinsic dimension” of a manifold with typical topology, which is important for tracking and perception understanding. The manifold multi-resolution analysis (MMA) method, based on “multi-resolution nonlinearity capture”, defines LLE eigenvectors as features for pattern recognition and dimension reduction. Both MTA and MMA are proved mathematically, and several examples are provided. Applications to 3D object recognition and 3D object viewpoint space partitioning are also described.
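As a reference point for the discussion, here is standard LLE as shipped in scikit-learn, applied to the classic swiss-roll manifold. The paper's MTA/MMA extensions (topology-aware intrinsic-dimension estimation, eigenvector features) are not part of this sketch.

```python
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import LocallyLinearEmbedding

X, _ = make_swiss_roll(n_samples=1500, random_state=0)

# n_components is exactly the "intrinsic dimension" choice the paper argues
# needs topological analysis; here we simply assert that it is 2.
lle = LocallyLinearEmbedding(n_neighbors=12, n_components=2, random_state=0)
Y = lle.fit_transform(X)
print(Y.shape, f"reconstruction error: {lle.reconstruction_error_:.2e}")
```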

15.
Content-based image suggestion (CBIS) addresses the satisfaction of users' long-term needs for “relevant” and “novel” images. In this paper, we present VCC-FMM, a flexible mixture model that clusters both images and users into separate groups. We then propose long-term relevance feedback to keep the model accurate as image collections grow and users' long-term needs change over time. Experiments on a real data set show the merits of our approach in terms of image suggestion accuracy and efficiency.

16.
This study presents a data-driven, semiautomatic classification system based on object-based image analysis and fuzzy logic, applied to a landslide-prone area in the Western Black Sea region of Turkey. In the first stage, a multiresolution segmentation process was performed on Landsat ETM+ satellite images of the study area. The model was established on the 5235 image objects obtained by the segmentation. A total of 70 landslide locations and 10 input parameters (normalized difference vegetation index, slope angle, curvature, brightness, mean band blue, asymmetry, shape index, length/width ratio, gray level co-occurrence matrix, and mean difference to infrared band) were considered in the analyses. Membership functions were used to classify the study area with five fuzzy operators: “and”, “or”, “mean arithmetic”, “mean geometric”, and “algebraic product”. To assess the performance of the resulting maps, 700 image objects that were not used in the model were taken into consideration. Based on the results, the map produced by the “fuzzy and” operator performed better than those produced by the other fuzzy operators. The proposed methodology may be useful to decision makers, local administrations, and scientists interested in landslides, as well as for planning, management, and regional development purposes in landslide-prone areas.
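The five fuzzy aggregation operators named above have simple closed forms, written out below for a single image object whose membership values are assumed to come from the input parameters; the membership vector itself is invented for illustration.

```python
import math

def fuzzy_and(m):         return min(m)                       # "and"
def fuzzy_or(m):          return max(m)                       # "or"
def mean_arithmetic(m):   return sum(m) / len(m)
def mean_geometric(m):    return math.prod(m) ** (1 / len(m))
def algebraic_product(m): return math.prod(m)

# Assumed memberships of one object, e.g. from NDVI, slope angle, curvature...
memberships = [0.9, 0.7, 0.85, 0.6, 0.95]
for op in (fuzzy_and, fuzzy_or, mean_arithmetic, mean_geometric, algebraic_product):
    print(f"{op.__name__:18s} {op(memberships):.3f}")
```

Note how “fuzzy and” (the minimum) is the strictest aggregator: an object is classified as landslide-prone only if every parameter supports it, which is consistent with it producing the best-performing map.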

17.
Use cases constitute a popular technique for problem analysis, partly due to their focus on thinking in terms of user needs. However, this does not guarantee discovery of all the subproblems that compose the structure of a given software problem. Moreover, a rigorous application of the technique requires prior consensus on the meaning of I. Jacobson's statement that “a use case must give a measurable value to a particular actor” (The Rational Edge, March 2003). This paper proposes a particular characterisation of the concept of “value” for the purpose of problem structuring. To this end, we build on the catalogue of frames for real software problems proposed by M. Jackson (Problem Frames, 2001) and reason about what could be valuable to the user in each problem class. We illustrate our technique with the analysis of a web auction problem.

18.
Many companies have adopted Process-aware Information Systems (PAIS) to support their business processes in some form. On the one hand, these systems typically log events (e.g., in transaction logs or audit trails) related to actual business process executions. On the other hand, explicit process models describing how the business process should (or is expected to) be executed are frequently available. Together with the data recorded in the log, this raises the interesting question: “Do the model and the log conform to each other?” Conformance checking, also referred to as conformance analysis, aims at detecting inconsistencies between a process model and its corresponding execution log, and at quantifying them through metrics. This paper proposes an incremental approach to checking the conformance of a process model and an event log. First, the fitness between the log and the model is measured (i.e., “Does the observed process comply with the control flow specified by the process model?”). Second, the appropriateness of the model is analyzed with respect to the log (i.e., “Does the model describe the observed process in a suitable way?”); appropriateness can be evaluated from both a structural and a behavioral perspective. To operationalize the ideas presented in this paper, a Conformance Checker has been implemented within the ProM framework and evaluated using artificial and real-life event logs.
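A toy flavor of the fitness dimension: replay each trace against a model and count the events the model can reproduce. Real conformance checking, as in the ProM Conformance Checker, replays traces against Petri nets; the trivially sequential model and the log below are our own simplification.

```python
MODEL = ["register", "check", "decide", "notify"]     # assumed process model

def trace_fitness(trace, model):
    """Fraction of the trace the sequential model can replay in order."""
    pos, replayed = 0, 0
    for event in trace:
        if pos < len(model) and event == model[pos]:
            replayed += 1
            pos += 1
    return replayed / len(trace)

log = [
    ["register", "check", "decide", "notify"],        # fits perfectly (1.00)
    ["register", "decide", "check", "notify"],        # swapped activities
    ["register", "check", "check", "decide"],         # repeated activity
]
for trace in log:
    print(trace, f"fitness = {trace_fitness(trace, MODEL):.2f}")
```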

19.
This article seeks to extend the concept of “post-critical composition” through an analysis of two MEmorials, the post-critical genre that Gregory Ulmer [Ulmer, Gregory L. (2005). Electronic monuments. Minneapolis: University of Minnesota Press] has been exploring for 15 years. I resist the tendency of some post-critical theorists to reject the role of genres and models in favor of perpetual re-invention of genres and pedagogies with each composition. Through a combination of product analysis and process reflection, this article documents the necessary but flexible role the MEmorial genre played in a student composition, “MEmorial for Afghanistan,” and in my own composition, “Strangers in Strange Lands: A MEmorial for the Lost Boys of Sudan*.” This essay extends not only post-critical composition but also online memorialization, described by National Public Radio [National Public Radio. (2007, May 28). Online memorials to the war dead. Day to day. Retrieved June 5, 2007, from http://www.npr.org] as a “modern phenomenon” and identified by Joyce Walker [Walker, Joyce. (2007). Narratives in the database: Memorializing September 11th online. Computers and Composition 24(2), 121-153] as a potentially powerful means of encouraging “cyborg citizens.”

20.
This paper addresses a fundamental question: is there a standard way of creating standards? Based on our first-hand experience of creating a technical ICT standard called H.350, we examined the process and reflected on what actually happened. H.350 is a Directory Services for Multimedia standard ratified by the International Telecommunication Union in September 2003. Emerging from an Internet2 Video Middleware working group, the H.350 standard provides a uniform way to store and locate information related to video and voice over IP (VoIP) in directories that link seamlessly to enterprise directories. Many socio-economic-technical factors led to the creation of H.350, and we were able to organize the process into a framework, which we present here: it combines the “public policy good” model and the “stakeholder analysis” model of standards creation into a comprehensive framework that can help the research community better understand what goes on in standards creation. We conducted in-depth interviews with the core H.350 team to learn more about the entire process and their experience, and the findings from these interviews further validate our framework. Finally, we apply the case of H.350 to the framework to help understand the forces that affect the development of ICT standards.
