Similar Documents
20 similar documents found (search time: 734 ms)
1.
This field study investigated the application of cooperative, competitive, and individualistic goal structures in classroom use of computer math games and their impact on students’ math performance and math learning attitudes. One hundred and sixty 5th-grade students were recruited and randomly assigned to Teams–Games–Tournament cooperative gaming, interpersonal competitive gaming, individualistic gaming, and the control group. A state-standards-based math exam and an inventory on attitudes toward mathematics were used in the pretest and posttest. Students’ gender and socioeconomic status were examined as moderating variables. Results indicated that although classroom goal structure had no significant effect on math test performance when reinforcing computer gaming, game-based learning within a cooperative goal structure was the most effective in promoting positive math attitudes. Students with different socioeconomic statuses were also influenced differently by gaming within alternative goal structures.
Fengfeng Ke

2.
Accuracy in estimating the processing times of different manufacturing operations is fundamental to obtaining more competitive prices and higher profits in an industry. The manufacturing times of a machine depend on several input variables and, for each class or type of product, a regression function for that machine can be defined. Time estimations are used for implementing production plans. These plans are usually supervised and modified by an expert, so information about how processing time depends on the input variables is also very important. Taking into account both premises (accuracy and simplicity of information extraction), a model based on TSK (Takagi–Sugeno–Kang) fuzzy rules has been used. TSK rules fulfill both requisites: the system has high accuracy, and the knowledge structure makes explicit the dependencies between time estimations and the input variables. We propose a TSK fuzzy rule model in which the rules have a variable structure in the consequent, as the regression functions can be completely distinct for different machines, or even for different classes of inputs to the same machine. The methodology to learn the TSK knowledge base is based on genetic programming together with a context-free grammar that restricts the valid structures of the regression functions. The system has been tested with real data coming from five different machines in a wood furniture industry.
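To make the TSK idea concrete, a first-order TSK rule base pairs a fuzzy antecedent with a linear consequent and outputs the membership-weighted average of the consequents. The sketch below is illustrative only: the Gaussian membership functions and the two hypothetical rules are assumptions, not the rule base learned by the paper's genetic-programming method.

```python
import math

def gaussian(x, center, width):
    """Membership degree of x in a Gaussian fuzzy set."""
    return math.exp(-((x - center) / width) ** 2)

def tsk_predict(x, rules):
    """First-order TSK inference: weighted average of linear consequents.

    Each rule is ((center, width), (a, b)), read as:
    IF x is about `center` THEN y = a * x + b.
    """
    weights = [gaussian(x, c, w) for (c, w), _ in rules]
    outputs = [a * x + b for _, (a, b) in rules]
    total = sum(weights)
    if total == 0:
        return 0.0  # no rule fires at all
    return sum(w * y for w, y in zip(weights, outputs)) / total

# Two hypothetical rules for one machine: small parts vs. large parts.
rules = [((10.0, 5.0), (0.5, 2.0)),    # small inputs: y = 0.5x + 2
         ((50.0, 15.0), (1.2, -4.0))]  # large inputs: y = 1.2x - 4
```

Because each consequent is a separate regression function, different machines (or input classes) can use structurally different consequents, which is the property the variable-structure model exploits.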
Manuel Mucientes

3.
Perception of compliant objects through a human system interface with visual–haptic feedback was investigated. Participants had to explore virtual cubes at different compliances by squeezing them with their fingers and observing them visually and haptically. The cubes were rendered by admittance control. Perception of compliance was analyzed using an adaptive staircase method. Results showed that visual–haptic perception of compliant environments is less accurate than perception of position and force stimuli. Furthermore, due to the important role of the visual feedback, cross-modal comparisons are more difficult than bimodal comparisons.
Martin Buss

4.
Virtual reality environments (VRs) offer unique training opportunities, particularly for training astronauts and preadapting them to microgravity. The purpose of the current research was to compare disturbances in eye–head–hand (EHH) and eye–head (GAZE) sensorimotor coordination produced by repeated exposures to VR systems. In general, we observed significant increases in position errors in manual target acquisition for both horizontal and vertical targets. We also observed a significant decrement in the ability of subjects to maintain gaze on horizontal eccentric targets immediately after exposure to VR. These preliminary findings provide some direction for developing training schedules for VR users that facilitate adaptation and support the idea that VRs may serve as an analog for sensorimotor effects of spaceflight.
Deborah L. Harm

5.
We couple pseudo-particle modeling (PPM, Ge and Li in Chem Eng Sci 58(8):1565–1585, 2003), a variant of hard-particle molecular dynamics, with standard soft-particle molecular dynamics (MD) to study an idealized gas–liquid flow in nano-channels. The coupling helps to keep sharp contrast between gas and liquid behaviors and the simulations conducted provide a reference frame for exploring more complex and realistic gas–liquid nano-flows. The qualitative nature and general flow patterns of the flow under such extreme conditions are found to be consistent with its macro-scale counterpart.
Wei Ge

6.
Both common coupling and pointer variables can exert a deleterious effect on the quality of software. The situation is exacerbated when global variables are assigned to pointer variables, that is, when an alias to a global variable is created. When this occurs, the number of global variables increases, and it becomes considerably harder to compute quality metrics correctly. However, unless aliasing is taken into account, variables may incorrectly appear to be unreferenced (neither defined nor used), or to be used without being defined. These ideas are illustrated by means of a case study of common coupling in the Linux kernel.
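The aliasing problem described above can be illustrated even without explicit pointers; the hypothetical Python sketch below (the paper's actual case study is the C-based Linux kernel) shows how assignment creates an alias to a global object, so a naive textual reference count undercounts the global's real uses. The `GLOBAL_TABLE` name and the metric are invented for illustration.

```python
# A module-level (global) mutable object.
GLOBAL_TABLE = {"entries": 0}

def make_alias():
    # Assignment does not copy: `alias` refers to the same dict,
    # so a write through `alias` is a write to GLOBAL_TABLE.
    alias = GLOBAL_TABLE
    alias["entries"] += 1
    return alias

def naive_reference_count(source, name):
    """A naive coupling metric: count textual occurrences of `name`.

    It misses every use made through an alias, so the global appears
    less coupled than it really is.
    """
    return source.count(name)

# Only the aliasing line mentions the global; the write does not.
code = 'alias = GLOBAL_TABLE\nalias["entries"] += 1\n'
```

The metric sees one reference, yet the global is both read and written, which mirrors the paper's point that quality metrics computed without alias analysis are unreliable.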
Stephen R. Schach

7.
To determine the maximum separation between events for nonrepetitive systems with max and linear constraints, there are the “iterative tightening from above” (ITA) approach and the “iterative tightening from below” (ITB) approach. Since such systems can be formulated as systems constrained by min–max inequalities, this paper gives an algorithm named MMIMaxSep for solving min–max inequalities. The algorithm is a generalization and a mathematically elegant reformulation of Yen et al.’s MaxSeparation algorithm which uses the ITB approach. Our numerical experiments indicate that MMIMaxSep is very efficient. Moreover, MMIMaxSep has a unique advantage of being able to directly handle tree-represented min–max functions, and its complexity is closely related to the complexity of computing cycle time of min–max functions.
Yiping Cheng

8.
A novel neural network architecture suitable for image processing applications, comprising three interconnected fuzzy layers of neurons and devoid of any back-propagation algorithm for weight adjustment, is proposed in this article. The fuzzy layers of neurons represent the fuzzy membership information of the image scene to be processed. One of the fuzzy layers of neurons acts as the input layer of the network. The two remaining layers, viz. the intermediate layer and the output layer, are counter-propagating fuzzy layers of neurons. These layers process the input image information available from the input layer. The constituent neurons within each layer of the network architecture are fully connected to each other. The intermediate layer neurons are also connected to the corresponding neurons and to a set of neighbors in the input layer. The neurons at the intermediate layer and the output layer are likewise connected to each other and to the respective neighbors of the corresponding other layer following a neighborhood-based connectivity. The proposed architecture uses fuzzy membership based weight assignment and a subsequent updating procedure. Fuzzy cardinality based, image context sensitive information is used for deciding the thresholding capabilities of the network. The network self-organizes the input image information by counter-propagation of the fuzzy network states between the intermediate and the output layers of the network. The attainment of stability of the fuzzy neighborhood hostility measures at the output layer of the network, or of the corresponding fuzzy entropy measures, determines the convergence of the network operation. An application of the proposed architecture to the extraction of binary objects from various degrees of noisy backgrounds is demonstrated using a synthetic and a real-life image.
Ujjwal Maulik

9.
The development of requirement specifications is done by accumulating knowledge about the desired systems in a progressive manner. This process can be supported by an analysis–revision cycle, in which the analysis phase checks the correctness of a given specification and the revision phase modifies it in case some problems are detected. To date, the analysis and revision activities have typically been considered in isolation, resulting in ineffective support for the stakeholders’ work. In response to that, this article introduces methodologies to conduct an interactive and integrated approach, grounded in the formalization of two basic types of evolutions (refinements and retrenchments) over multi-valued specification and modeling formalisms. Evaluation results are included to show that this approach can indeed help the stakeholders identify and clarify requirements through different stages of development.
Alberto Gil-Solla

10.
We present a survey of traffic models for communication networks whose key performance indicators like blocking probability and mean delay are independent of all traffic characteristics beyond the traffic intensity. This insensitivity property, which follows from that of the underlying queuing networks, is key to the derivation of simple and robust engineering rules like the Erlang formula in telephone networks.
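The Erlang formula mentioned above is the classic instance of such an insensitive engineering rule: the blocking probability depends on the traffic only through its intensity. A minimal sketch of the standard Erlang B recurrence (the function name is our own):

```python
def erlang_b(intensity, servers):
    """Blocking probability via the Erlang B recurrence.

    B(0) = 1;  B(n) = A*B(n-1) / (n + A*B(n-1)),
    where A is the offered traffic intensity in Erlangs.
    The result depends on the traffic only through A, which is
    exactly the insensitivity property the survey discusses.
    """
    b = 1.0
    for n in range(1, servers + 1):
        b = intensity * b / (n + intensity * b)
    return b
```

For example, 2 Erlangs offered to 2 circuits gives a blocking probability of 0.4, and adding circuits drives the blocking down, regardless of the fine-grained traffic characteristics.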
T. Bonald

11.
Inverse multi-objective robust evolutionary design
In this paper, we present an Inverse Multi-Objective Robust Evolutionary (IMORE) design methodology that handles the presence of uncertainty without making assumptions about the uncertainty structure. We model the clustering of uncertain events in families of nested sets using a multi-level optimization search. To reduce the high computational costs of the proposed methodology, we propose schemes for (1) adapting the step size in estimating the uncertainty and (2) trimming down the number of calls to the objective function in the nested search. Both offline and online adaptation strategies are considered in conjunction with the IMORE design algorithm. Design of Experiments (DOE) approaches further reduce the number of objective function calls in the online adaptive IMORE algorithm. Empirical studies conducted on a series of test functions of diverse complexities show that the proposed algorithms converge efficiently to a set of Pareto-optimal design solutions with non-dominated nominal and robustness performances.
Dudy Lim (Corresponding author), Yew-Soon Ong, Yaochu Jin, Bernhard Sendhoff, Bu Sung Lee

12.
Application of a model to the evaluation of flood damage
This paper presents the initial results of a common methodology for the evaluation of damage produced by a flood. A model has been developed for flood damage estimation based on a geographic information system (GIS). It could be used by land administration bodies and insurance companies to manage flood-related damage data. The model simulates flood scenarios and evaluates expected economic losses from the impact of floodwaters on exposed elements, through the application of a computational model elaborated by GIS. During the development of the model, the Boesio Stream, a small watercourse flowing into Lake Maggiore (Lombardy, northern Italy) which was recently affected by a flash flood, was used as a case study to test and calibrate the methodology. The method could be used either as a forecasting tool to define event scenarios, utilizing data from events simulated with a hydraulic model, or for real-time damage assessment after a disaster. The approach is suitable for large-area damage assessment and could be appropriate for land use planning, civil protection and risk mitigation.
F. Luino

13.
All training in interventional radiology builds on the development of the core skills in manipulating the instruments. Computer simulators are emerging to help in this task. This paper extends our previous framework with more realistic instrument behaviour and more complex vascular models. The instrument is modelled as a hybrid mass–spring particle system, while the vasculature is a triangulated surface mesh segmented from patient data sets. A specially designed commercial haptic device allows the trainee to use real instruments to guide the simulation through the vasculature, selected from a database of 23 different patients. A new collision detection algorithm allows an efficient computation of the contacts, leaving more time to deal with the collision response for a realistic simulation in real time. The behaviour of our simulated instruments has been visually compared with that of the real ones and assessed by experienced interventional radiologists. Preliminary results show close correlations and a realistic behaviour.
Fernando Bello

14.
Guoray Cai, GeoInformatica, 2007, 11(2): 217–237
Human interactions with geographical information are contextualized by problem-solving activities which endow meaning to geospatial data and processing. However, existing spatial data models have not taken this aspect of semantics into account. This paper extends spatial data semantics to include not only the contents and schemas, but also the contexts of their use. We specify such a semantic model in terms of three related components: activity-centric context representation, contextualized ontology space, and context mediated semantic exchange. Contextualization of spatial data semantics allows the same underlying data to take multiple semantic forms, and disambiguate spatial concepts based on localized contexts. We demonstrate how such a semantic model supports contextualized interpretation of vague spatial concepts during human–GIS interactions. We employ conversational dialogue as the mechanism to perform collaborative diagnosis of context and to coordinate sharing of meaning across agents and data sources.
Guoray Cai

15.
This paper describes the simulated car racing competition that was arranged as part of the 2007 IEEE Congress on Evolutionary Computation. The game that was used as the domain for the competition, the controllers submitted as entries, and the results of the competition are presented. With this paper, we hope to provide some insight into the efficacy of various computational intelligence methods on a well-defined game task, as well as an example of one way of running a competition. In the process, we provide a set of reference results for those who wish to use the simplerace game to benchmark their own algorithms. The paper is co-authored by the organizers and participants of the competition.
Julian Togelius (Corresponding author), Simon Lucas, Ho Duc Thang, Jonathan M. Garibaldi, Tomoharu Nakashima, Chin Hiong Tan, Itamar Elhanany, Shay Berant, Philip Hingston, Robert M. MacCallum, Thomas Haferlach, Aravind Gowrisankar, Pete Burrow

16.
Backfitting of fuzzy rules is an Iterative Rule Learning technique for obtaining the knowledge base of a fuzzy rule-based system in regression problems. It consists of fitting one fuzzy rule to the data, and replacing the whole training set by the residual of the approximation. The obtained rule is added to the knowledge base, and the process is repeated until the residual is zero, or near zero. Such a design has been extended to imprecise data for which the observation error is small. Nevertheless, when this error is moderate or high, the learning can stop early. In this kind of algorithm, the specificity of the residual might decrease when a new rule is added. It may happen that the residual grows so wide that it covers the value zero for all points (thus the algorithm stops) before all the information available in the dataset has been extracted. Focusing on this problem, this paper is about datasets with medium to high discrepancies between the observed and the actual values of the variables, such as those containing missing values and coarsely discretized data. We will show that the quality of the iterative learning degrades in this kind of problem, because it does not make full use of all the available information. As an alternative to sequentially obtaining rules, we propose a new multiobjective Genetic Cooperative Competitive Learning (GCCL) algorithm. In our approach, each individual in the population codifies one rule, which competes in the population in terms of maximum coverage and fitting, while the individuals in the population cooperate to form the knowledge base.
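The fit-and-subtract loop at the heart of backfitting can be sketched in a few lines. The constant "rule" below is a deliberately trivial stand-in for fitting one fuzzy rule (it just predicts the residual mean), used only to show the structure of the iteration, not the paper's actual rule-fitting step.

```python
def fit_constant_rule(xs, ys):
    """Toy stand-in for fitting one fuzzy rule: predict the mean target."""
    m = sum(ys) / len(ys)
    return lambda x: m

def backfit(xs, ys, n_rules, fit_rule=fit_constant_rule):
    """Iterative rule learning: fit a rule, subtract its predictions,
    and repeat on the residual. Returns the ensemble predictor and the
    final residual."""
    rules = []
    residual = list(ys)
    for _ in range(n_rules):
        rule = fit_rule(xs, residual)
        rules.append(rule)
        # The training targets are replaced by what the rule failed to explain.
        residual = [r - rule(x) for x, r in zip(xs, residual)]
    predict = lambda x: sum(rule(x) for rule in rules)
    return predict, residual
```

With imprecise (interval-valued) targets, the residual in this loop is an interval too, and the stopping test "residual covers zero everywhere" is exactly where the early-stopping problem described above arises.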
Luciano Sánchez (Corresponding author), José Otero, Inés Couso

17.
Recognizing context for annotating a live life recording
In the near future, it will be possible to continuously record and store the entire audio–visual lifetime of a person together with all digital information that the person perceives or creates. While the storage of this data will be possible soon, retrieval and indexing into such large data sets are unsolved challenges. Since today’s retrieval cues seem insufficient, we argue that additional cues, obtained from body-worn sensors, make associative retrieval by humans possible. We present three approaches to create such cues, each along with an experimental evaluation: the user’s physical activity from acceleration sensors, his social environment from audio sensors, and his interruptibility from multiple sensors.
Albrecht Schmidt

18.
Experiments in haptic-based authentication of humans
With the rapid advancement of the technological revolution, computer technology such as faster processors, advanced graphics cards, and multi-media systems is becoming more affordable. Haptics is a force/tactile feedback technology growing in disciplines linked to human–computer interaction. Like silicon-based components, haptics technology is becoming increasingly advanced. On the other hand, currently available commercial haptic interfaces are expensive, and their application is mostly confined to large research projects or systems. However, the trend of the market is forcing haptic developers to release products for use in conjunction with current keyboard and mouse technologies. Haptics allows a user to touch, feel, manipulate, create, and/or alter simulated three-dimensional objects in a virtual environment. Most existing applications of haptics are dedicated to honing human physical skills, such as sensitive hardware repair, medical procedures, and handling hazardous substances. These skills can be trained in a realistic virtual world, and they reveal human behavioural patterns in human–computer interaction environments. The measurement of such psychomotor patterns can be used to verify a person’s identity by assessing unique-to-the-individual behavioural attributes. This paper explores the unique behaviour exhibited by different users interacting with haptic systems. Through several haptic-based applications, users’ physical attributes are captured as output data from the haptic interface for use in the construction of a biometric system.
Abdulmotaleb El Saddik

19.
The classical view of computing positions computation as a closed-box transformation of inputs (rational numbers or finite strings) to outputs. According to the interactive view of computing, computation is an ongoing interactive process rather than a function-based transformation of an input to an output. Specifically, communication with the outside world happens during the computation, not before or after it. This approach radically changes our understanding of what computation is and how it is modeled. The acceptance of interaction as a new paradigm is hindered by the Strong Church–Turing Thesis (SCT), the widespread belief that Turing Machines (TMs) capture all computation, so models of computation more expressive than TMs are impossible. In this paper, we show that SCT reinterprets the original Church–Turing Thesis (CTT) in a way that Turing never intended; its commonly assumed equivalence to the original is a myth. We identify and analyze the historical reasons for the widespread belief in SCT. Only by accepting that it is false can we begin to adopt interaction as an alternative paradigm of computation. We present Persistent Turing Machines (PTMs), which extend TMs to capture sequential interaction. PTMs allow us to formulate the Sequential Interaction Thesis, going beyond the expressiveness of TMs and of the CTT. The paradigm shift to interaction provides an alternative understanding of the nature of computing that better reflects the services provided by today’s computing technology.
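The persistence idea can be made concrete with a toy sketch (our own illustration, not Goldin's formal PTM definition): state carried across interactions means the same input can yield different outputs at different points of the dialogue, behavior that no single closed-box input-to-output function reproduces.

```python
class PersistentMachine:
    """Toy model of sequential interaction: the 'worktape' (here, a
    list of past queries) persists between input/output exchanges, so
    each answer can depend on the whole interaction history."""

    def __init__(self):
        self.worktape = []  # survives across interactions

    def interact(self, query):
        self.worktape.append(query)
        # Output depends on the history, not just the current input.
        return len(self.worktape)

m = PersistentMachine()
# Feeding the same query twice produces two different outputs,
# which no history-free function of the input could do.
```

The contrast with a classical TM is that here the machine is never "reset" between inputs; the computation is the ongoing stream of exchanges.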
Dina Goldin

20.
Quantitative usability requirements are a critical but challenging, and hence often neglected, aspect of a usability engineering process. A case study is described in which quantitative usability requirements played a key role in the development of a new user interface for a mobile phone. Within the practical constraints of the project, existing methods for determining usability requirements, and for evaluating the extent to which they are met, could not be applied as such; tailored methods therefore had to be developed. These methods and their applications are discussed.
Timo Jokela (Corresponding author), Jussi Koivumaa, Jani Pirkola, Petri Salminen, Niina Kantola


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号