Found 20 similar documents.
1.
Many students who participate in online courses experience frustration and failure because they are not prepared for the demanding and isolated learning experience. A traditional learning theory known as self-directed learning (SDL) provides a foundation for a personalized system that helps students improve their abilities to manage their overall learning activities and monitor their own performance. Additionally, the system enables collaboration, interaction, feedback, and much-needed support from the instructor and students' peers. A Web 2.0 social-technology application, MediaWiki, was adopted as the platform on which incremental features were developed to apply the fundamental concepts of SDL. Students were able to customize content by setting specific learning goals, reflecting on their learning experiences, self-monitoring their activities and performance, and collaborating with others in the class. Although SDL skills exist to some degree in all learners, this study finds that students' SDL abilities can improve when a course adopts a personalized and collaborative learning system that enables students to be more proactive in planning, organizing, and monitoring their course activities.
2.
Information and Software Technology, 2014, 56(7): 689-717
Context: Critical systems in domains such as aviation, railway, and automotive are often subject to a formal process of safety certification. The goal of this process is to ensure that these systems will operate safely without posing undue risks to the user, the public, or the environment. Safety is typically ensured via complying with safety standards. Demonstrating compliance to these standards involves providing evidence to show that the safety criteria of the standards are met. Objective: In order to cope with the complexity of large critical systems and, subsequently, the plethora of evidence required for achieving compliance, safety professionals need in-depth knowledge to assist them in classifying different types of evidence, and in structuring and assessing the evidence. This paper is a step towards developing such a body of knowledge, derived from a large-scale, empirically rigorous literature review. Method: We use a Systematic Literature Review (SLR) as the basis for our work. The SLR builds on 218 peer-reviewed studies, selected through a multi-stage process, from 4963 studies published between 1990 and 2012. Results: We develop a taxonomy that classifies the information and artefacts considered as evidence for safety. We review the existing techniques for safety evidence structuring and assessment, and further study the relevant challenges that have been the target of investigation in the academic literature. We analyse commonalities in the results among different application domains and discuss implications of the results for both research and practice. Conclusion: The paper is, to our knowledge, the largest existing study on the topic of safety evidence. The results are particularly relevant to practitioners seeking a better grasp on evidence requirements as well as to researchers in the area of system safety. As a major finding of the review, the results strongly suggest the need for more practitioner-oriented and industry-driven empirical studies in the area of safety certification.
3.
We apply activity theory (AT) to the design of adaptive e-learning systems (AeLS). AT is a framework for studying human behavior during learning, whereas AeLS enhance students' learning by personalizing teaching–learning experiences: they model users' traits and predict learning outcomes. The approach was tested successfully: the experimental group took lectures chosen according to the AT anticipation principle, while the control group received randomly selected lectures. The learning achieved by the experimental group shows a highly positive, statistically significant correlation; for the control group the correlation is moderately positive and not significant. We conclude that AT is a useful framework for designing AeLS and providing student-centered education.
4.
Expert Systems with Applications, 2014, 41(8): 3922-3934
Robotic devices are becoming a popular alternative to traditional physical therapy as a means to enhance functional recovery after stroke; they offer more intensive practice opportunities without increasing the time spent on supervision by the treating therapist. An ideal behavior for these systems would consist in emulating real therapists by providing anticipatory force feedback to patients in order to encourage and modulate neural plasticity. However, no current systems are able to work in an anticipatory fashion. For this reason, the authors propose an anticipatory assistance-as-needed control algorithm for a multijoint robotic orthosis to be used in physical neurorehabilitation after acquired brain injury (ABI). This control algorithm, based on a dysfunctional-adapted biomechanical prediction subsystem, is able to avoid patient trajectory deviations by providing anticipatory force feedback. The system has been validated by means of a robotic simulator. The results demonstrate through simulations that the proposed assistance-as-needed control algorithm provides anticipatory actuation to patients, avoiding trajectory deviations while tending to minimize the degree of actuation. Thus, the main novelty and contribution of this work is the anticipatory nature of the proposed assistance-as-needed control algorithm, which breaks with current robotic control strategies by not waiting for trajectory deviations to take place. This new actuation paradigm avoids patient slacking and increases both participation and muscle activity in such a way that neural plasticity is encouraged and modulated to reinforce motor recovery.
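To make the actuation paradigm concrete, here is a minimal toy sketch (not the paper's controller): a trivial constant-velocity predictor stands in for the dysfunctional-adapted biomechanical prediction subsystem, and corrective torque is applied only when the predicted error leaves a tolerance band around the reference trajectory. All gains, horizons, and the 1-DOF plant are illustrative assumptions.

```python
import numpy as np

dt, horizon, tol, k = 0.01, 0.2, 0.05, 40.0   # s, s, rad, N·m/rad (assumptions)
t = np.arange(0, 3, dt)

pos, vel = 0.0, 0.0
forces = []
for ti in t:
    # Trivial constant-velocity predictor standing in for the paper's
    # dysfunctional-adapted biomechanical prediction subsystem.
    pos_pred = pos + vel * horizon
    ref_pred = 0.5 * np.sin(ti + horizon)     # reference trajectory, looked ahead

    # Assist-as-needed: act only if the *predicted* error leaves the band,
    # so actuation is anticipatory rather than reactive.
    err_pred = ref_pred - pos_pred
    force = k * err_pred if abs(err_pred) > tol else 0.0
    forces.append(force)

    # Toy 1-DOF plant with damping and a disturbance emulating the patient.
    acc = force - 2.0 * vel + 0.3 * np.sin(3 * ti)
    vel += acc * dt
    pos += vel * dt

print("fraction of time actuated:", np.mean(np.array(forces) != 0.0))
```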
5.
Arm and wrist manipulanda are commonly used as input devices in teleoperation and gaming applications, establish the physical interface to patients in several rehabilitation robots, and are applied as advanced research tools in biomechanics and neuroscience. Despite the fact that the physical interface, i.e. the handle through which the wrist/hand is attached to the manipulator, may influence interaction and movement behavior, the effects of handle design on these parameters have received little attention. Yet a poor handle design might lead to overexertion and altered movement dynamics, or result in misinterpretation of results in research studies. In this study, twelve healthy subjects performed repetitions of a wrist flexion task against a dynamic load generated by a 1-DOF robotic wrist manipulandum. Three different handle designs were qualitatively and quantitatively evaluated based on wrist movement kinematics and dynamics, patterns of finger and wrist muscle activity, and ergonomics criteria such as perceived comfort and fatigue. The three proposed designs were further compared to a conventional joystick-like handle. Task performance as well as kinematic and kinetic parameters were found to be unaffected by handle design. Nevertheless, differences were found in perceived task difficulty, comfort, and levels of muscle activation of wrist and finger muscles, with significantly higher muscle activation for the joystick-like design, where the handle is completely enclosed by the hand. Comfort was rated highest for the flat handle, which is adapted to the natural curvature of the hand with the fingers extended. These results may inform the design of handles serving as the physical interface in teleoperation applications, robot-assisted rehabilitation, and biomechanics/neuroscience research.
6.
This work studies the relation between computer use for reading activities and academic literacy in 15-year-old students in Chile, Uruguay, Spain, and Portugal. The data used are from the PISA 2009 test. Special attention is given to potential bias problems when computer use is an endogenous variable. Few studies in this area address this issue: the existing literature has shown that different types of computer use have different implications for performance, and the limitations of observational data for establishing cause–effect relations between computer use and academic performance have also been emphasized. It is important, however, to consider the computer-use endogeneity hypothesis (above all at home), since students decide the frequency of computer use at home. The results show that, once endogeneity is controlled for, computer use for reading is not related to reading performance in either digital or printed format, with the exception of Chile, which shows a negative relation for reading in printed format. The results obtained when considering endogeneity differ considerably from those obtained when endogeneity is not taken into account. The work shows the relevance of experimental-type studies for making sound statements about the relation between computer use and academic performance. In turn, school reading activities in a digital environment are suggested that could have an impact on reading performance.
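The abstract does not name the estimator used to control for endogeneity; a common choice for an endogenous regressor is two-stage least squares (2SLS). A minimal numpy sketch, with hypothetical variables (an instrument `z` and an unobserved confounder), illustrating how naive OLS can show a spurious effect that 2SLS removes:

```python
import numpy as np

# Hypothetical variables: `use` is home computer use for reading, `score` is
# reading performance, `z` is an instrument (e.g. home ICT access).
rng = np.random.default_rng(0)
n = 1000
z = rng.normal(size=n)                      # instrument
ability = rng.normal(size=n)                # unobserved confounder
use = 0.8 * z + 0.5 * ability + rng.normal(size=n)
score = 0.0 * use + 1.0 * ability + rng.normal(size=n)  # true effect is zero

X = np.column_stack([np.ones(n), use])
Z = np.column_stack([np.ones(n), z])

# Stage 1: project the endogenous regressor onto the instrument.
use_hat = Z @ np.linalg.lstsq(Z, use, rcond=None)[0]
# Stage 2: regress the outcome on the fitted values.
X_hat = np.column_stack([np.ones(n), use_hat])
beta_2sls = np.linalg.lstsq(X_hat, score, rcond=None)[0]

# Naive OLS for comparison: biased upward by the confounder.
beta_ols = np.linalg.lstsq(X, score, rcond=None)[0]
print("OLS effect:", beta_ols[1], "2SLS effect:", beta_2sls[1])
```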
7.
The synthesis and destruction of proteins are imperative for maintaining cellular homeostasis. In the 1970s, Aaron Ciechanover, Avram Hershko, and Irwin Rose discovered that certain proteins are tagged by ubiquitin before degradation, a discovery for which they were awarded the 2004 Nobel Prize in Chemistry. Compelling data gathered during the last several decades show that ubiquitin plays a vital role not only in protein degradation but also in many cellular functions, including DNA repair processes, cell cycle regulation, cell growth, immune system functionality, hormone-mediated signaling in plants, vesicular trafficking pathways, regulation of histone modification, and viral budding. Due to the involvement of ubiquitin in such a large number of diverse cellular processes, flaws and impairments in the ubiquitin system have been found to be linked to cancer, neurodegenerative diseases, genetic disorders, and immunological disorders. Hence, deciphering the dynamics and complexity of the ubiquitin system is of significant importance. In addition to experimental techniques, computational methodologies have been gaining increasing influence in protein research and are used to uncover the structure, stability, folding, mechanism of action, and interactions of proteins. Notably, molecular modeling and molecular dynamics simulations have become powerful tools that bridge the gap between structure and function while providing dynamic insights and illustrating essential mechanistic characteristics. In this study, we present an overview of molecular modeling and simulations of ubiquitin and the ubiquitin system, evaluate the status of the field, and offer our perspective on future progress in this area of research.
8.
Stream flow prediction is studied in this paper using Artificial Intelligence (AI), with two Artificial Neural Network (ANN) models: (i) a Multi-Layer Perceptron (MLP) trained with the Levenberg–Marquardt (LM) backpropagation learning algorithm (MLP-LM) and (ii) an MLP integrated with the Firefly Algorithm (MLP-FFA). The monthly stream flow records used in this prediction problem comprise two stations on the Bear River, USA, for the period 1961–2012. Six different model structures are investigated for both MLP-LM and MLP-FFA, and their results are analysed using a number of performance measures, including Correlation Coefficients (CC) and the Taylor diagram. The results indicate a significant improvement in predicting downstream flows by MLP-FFA over MLP-LM, attributed to MLP-FFA identifying the global minimum. In addition, an emerging multiple-model (ensemble) strategy is employed to treat the outputs of the MLP-LM and MLP-FFA models as inputs to a further ANN model. The results show yet another possible improvement. These two avenues for improvement identify possible directions for next-generation research activities.
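A minimal sketch of the ensemble strategy described above, using scikit-learn's MLPRegressor as a stand-in for both base models (the Levenberg–Marquardt and firefly training schemes are not reproduced) and synthetic monthly flows in place of the Bear River records:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic stand-in for monthly stream flow: predict flow at t from lags.
rng = np.random.default_rng(1)
flow = np.sin(np.arange(600) * 2 * np.pi / 12) + rng.normal(scale=0.2, size=600)
X = np.column_stack([flow[i:-3 + i] for i in range(3)])  # lags t-3..t-1
y = flow[3:]
X_tr, X_te, y_tr, y_te = X[:450], X[450:], y[:450], y[450:]

# Two base MLPs as stand-ins for MLP-LM and MLP-FFA.
m1 = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0).fit(X_tr, y_tr)
m2 = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=1).fit(X_tr, y_tr)

# Ensemble: feed the two base-model outputs into a third ANN.
Z_tr = np.column_stack([m1.predict(X_tr), m2.predict(X_tr)])
Z_te = np.column_stack([m1.predict(X_te), m2.predict(X_te)])
ens = MLPRegressor(hidden_layer_sizes=(5,), max_iter=2000, random_state=2).fit(Z_tr, y_tr)

for name, pred in [("MLP-1", m1.predict(X_te)), ("MLP-2", m2.predict(X_te)),
                   ("ensemble", ens.predict(Z_te))]:
    print(name, "CC:", np.corrcoef(pred, y_te)[0, 1])
```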
9.
The paper presents an event-driven model predictive control (MPC) framework for managing the charging operations of electric vehicles (EVs) in a smart grid. The objective is to minimize the cost of energy consumption while respecting EV drivers' preferences, technical bounds on the control action (in compliance with the IEC 61851 standard), and both market and grid constraints (by seeking to track a reference load profile defined by the grid operator). The proposed control approach allows "flexible" EV users to participate in demand side management (DSM) programs, which will play a crucial role in improving the stability and efficiency of future smart grids. Further, the natural MPC formulation of the problem can be recast into a mixed integer linear programming problem, suitable for implementation on a computer. Simulation results are provided and discussed in detail.
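A minimal sketch of one receding-horizon step recast as a mixed integer linear program, here with the open-source PuLP modeler; the horizon length, prices, charger power, and on/off charging model are illustrative assumptions rather than the paper's exact formulation:

```python
import pulp

# One MPC step for a single EV: choose on/off charging over a horizon
# to minimise energy cost while delivering the energy the driver requires.
T = 12                      # hourly slots in the horizon (assumption)
price = [0.30, 0.28, 0.25, 0.20, 0.18, 0.15, 0.15, 0.18, 0.22, 0.26, 0.30, 0.32]
p_max = 7.0                 # charger power in kW (assumption)
e_need = 35.0               # energy required by departure, kWh (assumption)

prob = pulp.LpProblem("ev_charging_step", pulp.LpMinimize)
u = [pulp.LpVariable(f"u_{t}", cat="Binary") for t in range(T)]  # charge on/off

# Objective: total energy cost over the horizon.
prob += pulp.lpSum(price[t] * p_max * u[t] for t in range(T))
# Driver preference: deliver at least the requested energy.
prob += pulp.lpSum(p_max * u[t] for t in range(T)) >= e_need
# Grid constraint (illustrative): at most 8 charging slots overall.
prob += pulp.lpSum(u) <= 8

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("schedule:", [int(pulp.value(v)) for v in u])
print("cost:", pulp.value(prob.objective))
```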
10.
Currently, fully automated as-built modeling of building interiors using point-cloud data remains an open challenge, due to several recurring problems: (1) complex indoor environments containing multiple rooms; (2) time-consuming and labor-intensive noise filtering; and (3) the difficulty of representing volumetric, detail-rich objects such as windows and doors. This study aimed to overcome these limitations while improving the amount of detail reproduced within the model for further utilization in BIM. First, we input just the registered three-dimensional (3D) point-cloud data and segmented the point cloud into separate rooms so that the later modeling phases could be performed more effectively for each room. For noise filtering, an offset space from the ceiling height was used to determine whether scan points belonged to clutter or to architectural components. The filtered points were projected onto a binary map in order to trace the floor-wall boundary, which was further refined through subsequent segmentation and regularization procedures. Then, the wall volumes were estimated in two ways: inside- and outside-wall-component modeling. Finally, the wall points were segmented and projected onto an inverse binary map, thereby enabling detection and modeling of the hollow areas as windows or doors. Experimental results on two real-world data sets demonstrated, through comparison with manually generated models, the effectiveness of our approach: the calculated RMSEs of the two resulting models were 0.089 m and 0.074 m, respectively.
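A minimal numpy sketch of the ceiling-offset noise filter and binary-map projection steps on a synthetic room; the grid resolution, offset value, and point cloud are illustrative assumptions:

```python
import numpy as np

# Synthetic room: wall points plus indoor "clutter" below the ceiling offset.
rng = np.random.default_rng(2)
wall = np.array([[x, 0.0, z] for x in np.linspace(0, 4, 400)
                 for z in np.linspace(0, 2.5, 10)])
clutter = rng.uniform([0.5, 0.5, 0.0], [3.5, 3.0, 1.2], size=(500, 3))
points = np.vstack([wall, clutter])

ceiling_h = 2.5
offset = 0.5  # keep only points within `offset` of the ceiling (assumption)
keep = points[:, 2] >= ceiling_h - offset   # furniture rarely reaches this band
filtered = points[keep]

# Project the filtered points onto a binary occupancy map (5 cm cells).
cell = 0.05
ij = np.floor(filtered[:, :2] / cell).astype(int)
grid = np.zeros((int(4 / cell) + 1, int(3.5 / cell) + 1), dtype=bool)
grid[ij[:, 0], ij[:, 1]] = True
print("occupied cells:", grid.sum())  # boundary tracing would start from here
```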
11.
Enhancing evacuee safety is a key factor in reducing the number of injuries and deaths that result from earthquakes. One way this can be achieved is by training occupants. Virtual Reality (VR) and Serious Games (SGs) represent novel techniques that may overcome the limitations of traditional training approaches. VR and SGs have been examined in the fire emergency context; however, their application to earthquake preparedness has not yet been extensively examined. We provide a theoretical discussion of the advantages and limitations of using VR SGs to investigate how building occupants behave during earthquake evacuations and to train building occupants to cope with such emergencies. We explore key design components for developing a VR SG framework: (a) what features constitute an earthquake event; (b) which building types can be selected and represented within the VR environment; (c) how damage to the building can be determined and represented; (d) how non-player characters (NPCs) can be designed; and (e) what level of interaction there can be between NPCs and the human participants. We illustrate the above by presenting Auckland City Hospital, New Zealand, as a case study, and propose a possible VR SG training tool to enhance earthquake preparedness in public buildings.
12.
The range and quality of freely available geo-referenced datasets is increasing. We evaluate the usefulness of free datasets for deforestation prediction by comparing generalised linear models and generalised linear mixed models (GLMMs) with a variety of machine learning models (Bayesian networks, artificial neural networks, and Gaussian processes) across two study regions. Freely available datasets were able to generate plausible risk maps of deforestation using all techniques for study zones in both Mexico and Madagascar. Artificial neural networks outperformed GLMMs in the Madagascan study zone (average AUC 0.83 vs 0.80), but not in the Mexican one (average AUC 0.81 vs 0.89). In both Mexico and Madagascar, Gaussian processes (average AUC 0.89, 0.85) and structured Bayesian networks (average AUC 0.88, 0.82) performed at least as well as GLMMs (average AUC 0.89, 0.80). Bayesian networks produced more stable results across different sampling methods, and Gaussian processes performed well (average AUC 0.85) with fewer predictor variables.
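A minimal scikit-learn sketch of this kind of AUC-based model comparison on synthetic data; GLMMs and Bayesian networks require dedicated libraries, so a plain logistic GLM stands in for the mixed model here:

```python
from sklearn.datasets import make_classification
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for pixel-level deforestation data (1 = deforested).
X, y = make_classification(n_samples=600, n_features=6, n_informative=4,
                           random_state=3)

models = {
    "GLM (logistic)": LogisticRegression(max_iter=1000),
    "ANN": MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
    "Gaussian process": GaussianProcessClassifier(random_state=0),
}
for name, model in models.items():
    # Cross-validated area under the ROC curve, as in the paper's comparison.
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name}: mean AUC = {auc:.2f}")
```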
13.
14.
Expert Systems with Applications, 2014, 41(6): 3041-3046
With the great development of e-commerce, users can create and publish a wealth of product information through electronic communities. It is difficult, however, for manufacturers to discover the best reviews and to determine the true underlying quality of a product, due to the sheer volume of reviews available for a single product. The goal of this paper is to develop models for predicting the helpfulness of reviews, providing a tool that finds the most helpful reviews of a given product. The study proposes HPNN, a helpfulness prediction model that uses a back-propagation multilayer perceptron neural network (BPN) to predict the level of review helpfulness from product data, review characteristics, and the textual characteristics of reviews. The prediction accuracy of HPNN was better than that of a linear regression analysis in terms of mean-squared error. HPNN can also suggest which determinants have a greater effect on the degree of helpfulness. The results of this study will help identify helpful online reviews and will effectively assist in the design of review sites.
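A minimal sketch of the comparison the abstract reports, pitting a back-propagation MLP against linear regression on synthetic review features; the feature names and the helpfulness signal are illustrative assumptions, not the paper's determinants:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# Synthetic review features: [review length, rating extremity, product age].
rng = np.random.default_rng(4)
X = rng.uniform(size=(800, 3))
# Hypothetical nonlinear "helpfulness" signal plus noise.
y = np.tanh(3 * X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(scale=0.1, size=800)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
mlp = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000, random_state=0).fit(X_tr, y_tr)
lin = LinearRegression().fit(X_tr, y_tr)

# The MLP should achieve the lower MSE on this nonlinear signal.
print("MLP MSE:", mean_squared_error(y_te, mlp.predict(X_te)))
print("Linear MSE:", mean_squared_error(y_te, lin.predict(X_te)))
```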
15.
Trust in the cloud environment is not written into an agreement; it is earned. In any trust evaluation mechanism, opinion leaders are the entities that influence the behaviors or attitudes of others, which makes them trustworthy and valid, among other characteristics. Trolls, on the other hand, are entities that post incorrect and unreal comments, so their effect must be removed. This paper evaluates trust in the cloud environment by considering the influence of opinion leaders on other entities and removing the effect of troll entities. The trust value is evaluated using five parameters: availability, reliability, data integrity, identity, and capability. We also propose a method for identifying opinion leaders and troll entities using three topological metrics: input-degree, output-degree, and reputation. The method is evaluated in various situations and shows accurate results once the effect of troll entities is removed and the advice of opinion leaders is taken into account.
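A minimal networkx sketch of the three topological metrics (input-degree, output-degree, reputation) on a toy trust graph; the flagging thresholds for opinion leaders and trolls are illustrative assumptions:

```python
import networkx as nx

# Directed trust graph: an edge u -> v means "u rates/trusts v",
# with a weight in [0, 1] acting as a simple reputation score.
G = nx.DiGraph()
ratings = [("a", "b", 0.9), ("c", "b", 0.8), ("d", "b", 0.9),
           ("b", "e", 0.7), ("e", "t", 0.1), ("a", "t", 0.2)]
G.add_weighted_edges_from(ratings)

for node in G.nodes:
    in_deg = G.in_degree(node)                      # how many entities rate it
    out_deg = G.out_degree(node)                    # how many entities it rates
    rep_in = [G[u][node]["weight"] for u in G.predecessors(node)]
    reputation = sum(rep_in) / len(rep_in) if rep_in else 0.0

    # Illustrative rules: widely- and well-rated nodes are opinion leaders;
    # widely- but poorly-rated nodes are treated as trolls.
    if in_deg >= 2 and reputation >= 0.7:
        role = "opinion leader"
    elif in_deg >= 2 and reputation <= 0.3:
        role = "troll"
    else:
        role = "ordinary"
    print(node, role, "in:", in_deg, "out:", out_deg, "rep:", round(reputation, 2))
```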
16.
This paper presents the formulation and performance analysis of four techniques for detection of a narrowband acoustic source in a shallow range-independent ocean using an acoustic vector sensor (AVS) array. The array signal vector is not known due to the unknown location of the source. Hence all detectors are based on a generalized likelihood ratio test (GLRT), which involves estimation of the array signal vector. One non-parametric and three parametric (model-based) signal estimators are presented. It is shown that there is a strong correlation between detector performance and the mean-square signal estimation error. Theoretical expressions for the probability of false alarm and the probability of detection are derived for all the detectors, and the theoretical predictions are compared with simulation results. It is shown that the detection performance of an AVS array with a certain number of sensors is equal to or slightly better than that of a conventional acoustic pressure sensor array with three times as many sensors.
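For reference, the generic form of the GLRT underlying all four detectors, with the unknown array signal vector replaced by its maximum-likelihood estimate (a textbook formulation; the paper's specific signal models are not reproduced here):

```latex
\Lambda(\mathbf{x}) \;=\;
\frac{\max_{\mathbf{s}}\, p\!\left(\mathbf{x} \mid \mathbf{s}, H_1\right)}
     {p\!\left(\mathbf{x} \mid H_0\right)}
\;\underset{H_0}{\overset{H_1}{\gtrless}}\; \gamma
```

where x is the array observation, s is the unknown array signal vector, and the threshold γ is set from the desired probability of false alarm.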
17.
Military helicopter pilots are expected to wear a variety of items of body-borne equipment during flight so as to be prepared for any situation that may arise in combat. Helicopter seats are designed to a specified weight range for an occupant with equipment. This paper investigates how distributing the equipment on the body affects injury potential during a helicopter crash. A finite element model representing a helicopter seat with a fully deformable 50th percentile Hybrid III dummy carrying equipment was developed. The model was subjected to a standard military certification crash test. Various equipment configurations were investigated and analysed to determine their influence on the risk of injury. It was found that placing the equipment low on the torso, i.e. near the thighs, not only reduces the likelihood of injury in the lumbar spinal region but also yields favourable head and neck injury risk when compared to the other configurations investigated. In contrast, placing equipment high on the torso, i.e. close to the chin, increases the lumbar load and, implicitly, the risk of head injury. A statistical analysis is carried out using the Wilcoxon signed-rank test to deliver the probability of loads being experienced within a certain interval. This study recommends an equipment configuration that improves survivability for an occupant seated on a fixed-load energy-absorbing seat subjected to Military Standard 58095A Test 4.
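A minimal scipy sketch of a Wilcoxon signed-rank comparison between paired lumbar loads from two equipment configurations; the load values are synthetic and purely illustrative:

```python
import numpy as np
from scipy.stats import wilcoxon

# Paired lumbar loads (kN) from the same simulated crash under two
# hypothetical equipment placements (synthetic values for illustration).
low_torso = np.array([5.1, 4.8, 5.3, 4.9, 5.0, 5.2, 4.7, 5.1])
high_torso = np.array([6.0, 5.7, 6.2, 5.9, 5.8, 6.1, 5.6, 6.3])

# Non-parametric paired test: are the two load distributions different?
stat, p = wilcoxon(low_torso, high_torso)
print(f"Wilcoxon statistic = {stat}, p-value = {p:.4f}")
```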
18.
Land exchange through rental transactions is a central process in agricultural systems. Land tenure regimes emerge from land transactions, and structural and land-use changes are tied to the dynamics of the land market. We introduce LARMA, a LAnd Rental MArket model embedded within the Pampas Model (PM), an agent-based model of Argentinean agricultural systems. LARMA produces endogenous formation of land rental prices (LRPs). LARMA relies on traditional economic concepts for LRP formation but addresses some drawbacks of this approach by being integrated into an agent-based model that considers heterogeneous agents interacting with one another. PM-LARMA successfully reproduced the agricultural land tenure regimes and land rental prices observed in the Pampas. Including adaptive, heterogeneous, and interacting agents was critical to this success. We conclude that agent-based and traditional economic models can be successfully combined to capture complex emergent land tenure and market price patterns while simplifying the overall model design.
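A minimal agent-based sketch of endogenous rental-price formation, in the spirit of (but much simpler than) LARMA: heterogeneous tenants bid a share of their expected profit per hectare and each plot goes to the highest bidder. All behavioral rules and numbers are illustrative assumptions:

```python
import random

random.seed(5)

# Heterogeneous tenants: each expects a different profit per hectare
# and bids a fixed share of it as rent (illustrative rule).
tenants = [{"id": i,
            "expected_profit": random.uniform(200.0, 400.0),  # $/ha
            "bid_share": random.uniform(0.3, 0.6)} for i in range(20)]
plots = 8  # landowners offering one plot each

rental_prices = []
for _ in range(plots):
    bids = [(t["expected_profit"] * t["bid_share"], t) for t in tenants]
    price, winner = max(bids, key=lambda b: b[0])   # highest bid takes the plot
    rental_prices.append(price)
    tenants.remove(winner)                          # one plot per tenant

print("endogenous rental prices ($/ha):",
      [round(p, 1) for p in rental_prices])
print("mean price:", round(sum(rental_prices) / len(rental_prices), 1))
```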
19.
The timetabling problem of local Elderly Day Care Centers (EDCCs) is formulated as a weighted maximum constraint satisfaction problem (Max-CSP) in this study. The EDCC timetabling problem is a multi-dimensional assignment problem in which users (the elderly) are required to perform activities that require different venues and timeslots, depending on operational constraints. These constraints fall into two categories: hard constraints, which must be fulfilled strictly, and soft constraints, which may be violated but at a penalty. Numerous methods have been successfully applied to the weighted Max-CSP; these include exact algorithms based on branch-and-bound techniques and approximation methods based on repair heuristics, such as the min-conflict heuristic. This study explores the potential of evolutionary algorithms by proposing a genetic-based discrete particle swarm optimization (GDPSO) method to solve the EDCC timetabling problem. The proposed method is compared with the min-conflict random-walk algorithm (MCRW), Tabu search (TS), standard particle swarm optimization (SPSO), and a guided genetic algorithm (GGA). Computational evidence shows that GDPSO significantly outperforms the other algorithms in terms of both solution quality and efficiency.
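A minimal sketch of the weighted Max-CSP objective that an evolutionary method such as GDPSO would optimize: candidate timetables are scored by penalizing hard-constraint violations heavily and soft ones lightly (a random-restart search stands in for the PSO itself). The constraints and weights are illustrative assumptions:

```python
import random

random.seed(6)
ACTIVITIES = ["exercise", "crafts", "singing"]
VENUES, TIMESLOTS = ["hall", "room1"], [0, 1, 2]

def cost(timetable):
    """Weighted Max-CSP cost: hard violations weigh 100, soft ones 1."""
    penalty = 0
    seen = set()
    for activity, (venue, slot) in timetable.items():
        # Hard constraint: no two activities share a venue and timeslot.
        if (venue, slot) in seen:
            penalty += 100
        seen.add((venue, slot))
        # Soft constraint (illustrative): exercise should be early in the day.
        if activity == "exercise" and slot > 0:
            penalty += 1
    return penalty

# Random-restart baseline (GDPSO would evolve a swarm of candidates instead).
best = None
for _ in range(200):
    candidate = {a: (random.choice(VENUES), random.choice(TIMESLOTS))
                 for a in ACTIVITIES}
    if best is None or cost(candidate) < cost(best):
        best = candidate
print("best timetable:", best, "cost:", cost(best))
```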
20.
Recent European Directives promoted the development of biofuels, setting mandatory limits on their greenhouse gas (GHG) emissions. Second-generation biofuels based on lignocellulosic biomass are prime candidates, but their GHG emissions are variable and uncertain. Agro-ecosystem modeling can capture these emissions along with the performance of biofuel feedstocks. This study aimed at optimizing feedstock supply for a bioethanol unit in France from agricultural residues, annual crops, and perennial crops. Their productivity and environmental impacts were modeled on a regional scale using geo-referenced data on soil properties, crop management, land use, and future weather. Several supply scenarios were tested. Cereal straw was the most efficient feedstock but had low availability, and only miscanthus could meet the bioethanol plant's demand. Sorghum combined poor yields and high GHG emissions compared with miscanthus and triticale. A mix of three biomass sources used less than 3% of the regional agricultural land while abating GHG emissions by 60%.