Similar Documents
20 similar documents found (search time: 78 ms)
1.
Context: Critical systems in domains such as aviation, railway, and automotive are often subject to a formal process of safety certification. The goal of this process is to ensure that these systems will operate safely without posing undue risks to the user, the public, or the environment. Safety is typically ensured via compliance with safety standards. Demonstrating compliance with these standards involves providing evidence to show that the safety criteria of the standards are met.
Objective: In order to cope with the complexity of large critical systems and, subsequently, the plethora of evidence information required for achieving compliance, safety professionals need in-depth knowledge to assist them in classifying different types of evidence, and in structuring and assessing the evidence. This paper is a step towards developing such a body of knowledge, derived from a large-scale, empirically rigorous literature review.
Method: We use a Systematic Literature Review (SLR) as the basis for our work. The SLR builds on 218 peer-reviewed studies, selected through a multi-stage process from 4963 studies published between 1990 and 2012.
Results: We develop a taxonomy that classifies the information and artefacts considered as evidence for safety. We review the existing techniques for safety evidence structuring and assessment, and further study the relevant challenges that have been the target of investigation in the academic literature. We analyse commonalities in the results among different application domains and discuss implications of the results for both research and practice.
Conclusion: The paper is, to our knowledge, the largest existing study on the topic of safety evidence. The results are particularly relevant to practitioners seeking a better grasp of evidence requirements, as well as to researchers in the area of system safety. As a major finding of the review, the results strongly suggest the need for more practitioner-oriented and industry-driven empirical studies in the area of safety certification.

2.
We apply activity theory (AT) to the design of adaptive e-learning systems (AeLS). AT is a framework for studying human behavior during learning, whereas AeLS enhance students' apprenticeship by personalizing teaching–learning experiences. AeLS depict users' traits and predict learning outcomes. The approach was successfully tested: the experimental group took lectures chosen according to the anticipation principle of AT, while the control group received randomly selected lectures. The learning achieved by the experimental group shows a highly positive and statistically significant correlation, whereas for the control group the correlation is only moderately positive and not significant. We conclude that AT is a useful framework for designing AeLS and providing student-centered education.

3.
Arm and wrist manipulanda are commonly used as input devices in teleoperation and gaming applications, establish a physical interface to patients in several rehabilitation robots, and are applied as advanced research tools in biomechanics and neuroscience. Despite the fact that the physical interface, i.e. the handle through which the wrist/hand is attached to the manipulator, may influence interaction and movement behavior, the effects of handle design on these parameters have received little attention. Yet, a poor handle design might lead to overexertion and altered movement dynamics, or result in misinterpretation of results in research studies. In this study, twelve healthy subjects performed repetitions of a wrist flexion task against a dynamic load generated by a 1-DOF robotic wrist manipulandum. Three different handle designs were qualitatively and quantitatively evaluated based on wrist movement kinematics and dynamics, patterns of finger and wrist muscle activity, and ergonomics criteria such as perceived comfort and fatigue. The three proposed designs were further compared to a conventional joystick-like handle. Task performance as well as kinematic and kinetic parameters were found to be unaffected by handle design. Nevertheless, differences were found in perceived task difficulty, comfort, and levels of muscle activation of wrist and finger muscles, with significantly higher muscle activation when using a joystick-like design, where the handle is completely enclosed by the hand. Comfort was rated high for the flat handle, adapted to the natural curvature of the hand with the fingers extended. These results may inform the design of handles serving as physical interfaces in teleoperation applications, robot-assisted rehabilitation, and biomechanics/neuroscience research.

4.
The range and quality of freely available geo-referenced datasets is increasing. We evaluate the usefulness of free datasets for deforestation prediction by comparing generalised linear models and generalised linear mixed models (GLMMs) with a variety of machine learning models (Bayesian networks, artificial neural networks and Gaussian processes) across two study regions. Freely available datasets were able to generate plausible risk maps of deforestation using all techniques for study zones in both Mexico and Madagascar. Artificial neural networks outperformed GLMMs in the Madagascan study zone (average AUC 0.83 vs 0.80), but not in the Mexican study zone (average AUC 0.81 vs 0.89). In Mexico and Madagascar, Gaussian processes (average AUC 0.89, 0.85) and structured Bayesian networks (average AUC 0.88, 0.82) performed at least as well as GLMMs (average AUC 0.89, 0.80). Bayesian networks produced more stable results across different sampling methods. Gaussian processes performed well (average AUC 0.85) with fewer predictor variables.
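To make the kind of comparison reported above concrete, the sketch below contrasts a plain logistic GLM (standing in for the GLMM, without random effects) and a small neural network on synthetic deforestation-style data, scoring both by AUC. The predictors, data, and model settings are assumptions for illustration, not the study's datasets or configurations.

```python
# Minimal sketch: compare a logistic GLM with a small neural network by AUC.
# Synthetic data only; this is not the study's dataset or model configuration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 5))   # hypothetical predictors (slope, distance to road, ...)
p = 1 / (1 + np.exp(-(1.5 * X[:, 0] - X[:, 1] + 0.5 * X[:, 2])))
y = rng.binomial(1, p)           # 1 = deforested, 0 = intact

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

glm = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X_tr, y_tr)

for name, model in [("GLM", glm), ("ANN", ann)]:
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {auc:.2f}")
```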

5.
Stream flow prediction is studied in this paper using Artificial Intelligence (AI), namely two Artificial Neural Network (ANN) variants: (i) a Multi-Layer Perceptron trained with the Levenberg–Marquardt backpropagation learning algorithm (MLP-LM), and (ii) an MLP integrated with the Firefly Algorithm (MLP-FFA). The monthly stream flow records used in this prediction problem come from two stations on the Bear River, U.S.A., for the period 1961–2012. Six different model structures are investigated for both the MLP-LM and MLP-FFA models, and their results are analysed using a number of performance measures, including the Correlation Coefficient (CC) and the Taylor diagram. The results indicate that a significant improvement in predicting downstream flows is likely with MLP-FFA over MLP-LM, attributed to its ability to identify the global minimum. In addition, an emerging multiple-model (ensemble) strategy is employed, treating the outputs of the MLP-LM and MLP-FFA models as inputs to a further ANN model. The results show yet another possible improvement. These two avenues for improvement identify possible directions for next-generation research activities.
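The ensemble idea described above, feeding the outputs of two trained models into a further ANN, can be sketched as follows. The sketch uses synthetic data and scikit-learn's standard MLP solvers in place of Levenberg–Marquardt and Firefly training, so it illustrates only the stacking structure, not the paper's models.

```python
# Sketch of the ensemble idea: outputs of two trained models become inputs to a third ANN.
# Synthetic monthly-flow-like data; sklearn's solvers stand in for LM and FFA training.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
X = rng.uniform(size=(500, 4))                  # hypothetical lagged-flow predictors
y = 2 * X[:, 0] + np.sin(6 * X[:, 1]) + 0.1 * rng.normal(size=500)

X_tr, X_te, y_tr, y_te = X[:400], X[400:], y[:400], y[400:]

m1 = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0).fit(X_tr, y_tr)
m2 = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=1).fit(X_tr, y_tr)

# Stack the two models' predictions and train a third ANN on them.
Z_tr = np.column_stack([m1.predict(X_tr), m2.predict(X_tr)])
Z_te = np.column_stack([m1.predict(X_te), m2.predict(X_te)])
ens = MLPRegressor(hidden_layer_sizes=(4,), max_iter=5000, random_state=2).fit(Z_tr, y_tr)

for name, pred in [("model 1", m1.predict(X_te)),
                   ("model 2", m2.predict(X_te)),
                   ("ensemble", ens.predict(Z_te))]:
    print(name, "R^2 =", round(r2_score(y_te, pred), 3))
```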

6.
The presented work is part of a larger research program dealing with developing tools for coupling biogeochemical models in contaminated landscapes. The specific objective of this article is to provide researchers with a data-porting tool to build a hexagonal raster using information from rectangular raster data (e.g. a GIS format). This tool involves a computational algorithm and open-source software (written in C). The method of extending the reticulated functions defined on 2D networks is an essential key of this algorithm and can also be used for purposes other than data porting. The algorithm allows one to build the hexagonal raster with a cell size independent of the geometry of the rectangular raster. The extended function is a bi-cubic spline which can exactly reconstruct polynomials up to degree three in each variable. We validate the method by analyzing errors in some theoretical case studies, followed by other studies with real terrain elevation data. We also introduce and briefly present an iterative water-routing method and use it for validation on a case with real terrain data.
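A minimal sketch of the data-porting idea, assuming a regular rectangular raster: fit a bicubic spline to the raster values and evaluate it at the centres of hexagonal cells whose size is chosen independently of the raster spacing. scipy's RectBivariateSpline is used here only as a generic bicubic interpolator; the hexagon layout and all names are illustrative, and this is not the article's C implementation.

```python
# Sketch: resample a rectangular raster onto a hexagonal grid by evaluating a
# bicubic spline (kx=ky=3) at hexagon centres. Not the article's C tool.
import numpy as np
from scipy.interpolate import RectBivariateSpline

# Hypothetical rectangular raster (e.g. terrain elevation) with regular x/y spacing.
x = np.arange(0, 100, 1.0)
y = np.arange(0, 80, 1.0)
Z = np.sin(x[None, :] / 10) + np.cos(y[:, None] / 8)   # shape (len(y), len(x))

spline = RectBivariateSpline(y, x, Z, kx=3, ky=3)       # bicubic interpolant

def hex_centres(xmin, xmax, ymin, ymax, size):
    """Centres of a pointy-top hexagonal grid with the given cell size (assumed layout)."""
    dx, dy = np.sqrt(3) * size, 1.5 * size
    centres, row, yc = [], 0, ymin
    while yc <= ymax:
        xc = xmin + (0.5 * dx if row % 2 else 0.0)
        while xc <= xmax:
            centres.append((xc, yc))
            xc += dx
        yc += dy
        row += 1
    return np.array(centres)

centres = hex_centres(0, 99, 0, 79, size=2.5)      # cell size independent of raster spacing
values = spline.ev(centres[:, 1], centres[:, 0])   # note (y, x) argument order
print(len(centres), "hexagonal cells sampled; first values:", values[:3].round(3))
```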

7.
This study focuses on designing optimisation-based control for a sewer system in a methodological way and linking it to regulatory control. The optimisation-based design is found to depend on the proper choice of a model, the formulation of the objective function, and the tuning of optimisation parameters. Accordingly, two novel optimisation configurations are developed, in which the optimisation either acts directly on the actuators or acts on the regulatory control layer. These two optimisation designs are evaluated on a sub-catchment of the sewer system in Copenhagen and found to perform better than the existing control, a rule-based expert system. On the other hand, compared with a regulatory control technique designed earlier in Mollerup et al. (2015), the optimisation showed similar performance with respect to minimising overflow volume. Hence, for the operation of small sewer systems, regulatory control strategies can offer promising potential and should be considered alongside more advanced strategies when identifying novel solutions.

8.
The paper presents an event driven model predictive control (MPC) framework for managing charging operations of electric vehicles (EV) in a smart grid. The objective is to minimize the cost of energy consumption, while respecting EV drivers' preferences, technical bounds on the control action (in compliance with the IEC 61851 standard) and both market and grid constraints (by seeking the tracking of a reference load profile defined by the grid operator). The proposed control approach allows “flexible” EV users to participate in demand side management (DSM) programs, which will play a crucial role in improving stability and efficiency of future smart grids. Further, the natural MPC formulation of the problem can be recast into a mixed integer linear programming problem, suitable for implementation on a calculator. Simulation results are provided and discussed in detail.
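As a much-simplified stand-in for the MPC/MILP formulation described above, the sketch below schedules a single EV's charging power over a short horizon as a linear program: minimise energy cost subject to an energy requirement and a power bound. Prices, limits, and the horizon are assumed values; the grid-tracking term, driver-preference constraints, and integer variables of the paper are omitted.

```python
# Simplified sketch (LP, not the paper's MILP/MPC): schedule one EV's charging power
# over a horizon to minimise energy cost, subject to an energy target and power bounds.
import numpy as np
from scipy.optimize import linprog

T = 12                                   # hypothetical 12 one-hour slots
price = np.array([0.30, 0.28, 0.25, 0.20, 0.15, 0.12,
                  0.12, 0.15, 0.22, 0.27, 0.30, 0.32])   # EUR/kWh (assumed)
p_max = 7.4                              # kW charging limit (assumed)
energy_needed = 30.0                     # kWh requested by the driver (assumed)

# Decision variables: p[t] >= 0, charging power in each slot (dt = 1 h).
c = price                                # cost = sum_t price[t] * p[t] * dt
A_eq = np.ones((1, T))                   # sum_t p[t] * dt == energy_needed
b_eq = [energy_needed]
bounds = [(0.0, p_max)] * T

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
print("feasible:", res.success, "cost:", round(res.fun, 2), "EUR")
print("schedule (kW):", np.round(res.x, 2))
```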

9.
The timetabling problem of local Elderly Day Care Centers (EDCCs) is formulated as a weighted maximum constraint satisfaction problem (Max-CSP) in this study. The EDCC timetabling problem is a multi-dimensional assignment problem in which users (the elderly) are required to perform activities that require different venues and timeslots, depending on operational constraints. These constraints are categorized into two types: hard constraints, which must be fulfilled strictly, and soft constraints, which may be violated but with a penalty. Numerous methods have been successfully applied to the weighted Max-CSP; these methods include exact algorithms based on branch and bound techniques, and approximation methods based on repair heuristics, such as the min-conflict heuristic. This study aims to explore the potential of evolutionary algorithms by proposing a genetic-based discrete particle swarm optimization (GDPSO) to solve the EDCC timetabling problem. The proposed method is compared with the min-conflict random-walk algorithm (MCRW), Tabu search (TS), standard particle swarm optimization (SPSO), and a guided genetic algorithm (GGA). Computational evidence shows that GDPSO significantly outperforms the other algorithms in terms of solution quality and efficiency.
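A weighted Max-CSP objective of the kind described above can be sketched as a cost function in which hard-constraint violations carry a prohibitive weight and soft-constraint violations a small penalty. The constraints, weights, and data layout below are invented for illustration and are not the paper's formulation; a heuristic such as the proposed GDPSO would then minimise this cost over candidate timetables.

```python
# Illustrative weighted Max-CSP cost for a toy timetable: an assignment maps each
# (user, activity) to a (venue, timeslot). Constraints and weights are assumptions.
from collections import defaultdict

HARD_WEIGHT = 1_000          # violating a hard constraint dominates the score
SOFT_WEIGHT = 1

def timetable_cost(assignment, venue_capacity, preferred_slot):
    cost = 0
    load = defaultdict(int)
    for (user, activity), (venue, slot) in assignment.items():
        load[(venue, slot)] += 1
        # Soft constraint: users prefer their favourite timeslot.
        if preferred_slot.get(user) is not None and slot != preferred_slot[user]:
            cost += SOFT_WEIGHT
    # Hard constraint: a venue cannot exceed its capacity in any timeslot.
    for (venue, slot), n in load.items():
        if n > venue_capacity[venue]:
            cost += HARD_WEIGHT * (n - venue_capacity[venue])
    return cost

# Tiny usage example.
assignment = {("u1", "tai-chi"): ("hall", 9), ("u2", "tai-chi"): ("hall", 9),
              ("u3", "craft"): ("room", 10)}
print(timetable_cost(assignment, venue_capacity={"hall": 2, "room": 1},
                     preferred_slot={"u1": 9, "u2": 10, "u3": 10}))
```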

10.
The synthesis and destruction of proteins are imperative for maintaining their cellular homeostasis. In the 1970s, Aaron Ciechanover, Avram Hershko, and Irwin Rose discovered that certain proteins are tagged by ubiquitin before degradation, a discovery that earned them the 2004 Nobel Prize in Chemistry. Compelling data gathered during the last several decades show that ubiquitin plays a vital role not only in protein degradation but also in many cellular functions including DNA repair processes, cell cycle regulation, cell growth, immune system functionality, hormone-mediated signaling in plants, vesicular trafficking pathways, regulation of histone modification and viral budding. Due to the involvement of ubiquitin in such a large number of diverse cellular processes, flaws and impairments in the ubiquitin system were found to be linked to cancer, neurodegenerative diseases, genetic disorders, and immunological disorders. Hence, deciphering the dynamics and complexity of the ubiquitin system is of significant importance. In addition to experimental techniques, computational methodologies have been gaining increasing influence in protein research and are used to uncover the structure, stability, folding, mechanism of action and interactions of proteins. Notably, molecular modeling and molecular dynamics simulations have become powerful tools that bridge the gap between structure and function while providing dynamic insights and illustrating essential mechanistic characteristics. In this study, we present an overview of molecular modeling and simulations of ubiquitin and the ubiquitin system, evaluate the status of the field, and offer our perspective on future progress in this area of research.

11.
With the great development of e-commerce, users can create and publish a wealth of product information through electronic communities. It is difficult, however, for manufacturers to discover the best reviews and to determine the true underlying quality of a product due to the sheer volume of reviews available for a single product. The goal of this paper is to develop models for predicting the helpfulness of reviews, providing a tool that finds the most helpful reviews of a given product. This study proposes HPNN (a helpfulness prediction model using a neural network), which uses a back-propagation multilayer perceptron neural network (BPN) to predict the level of review helpfulness using determinants drawn from product data, review characteristics, and the textual characteristics of reviews. The prediction accuracy of HPNN was better than that of a linear regression analysis in terms of the mean-squared error. HPNN can also suggest which determinants have a greater effect on the degree of helpfulness. The results of this study will help identify helpful online reviews and will effectively assist in the design of review sites.
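The core comparison described above, a back-propagation MLP versus linear regression on numeric review features scored by mean-squared error, might look roughly like the sketch below. The features and data are synthetic, and the network is not the HPNN architecture.

```python
# Sketch: predict review helpfulness from numeric features with an MLP vs. linear
# regression, compared by MSE. Synthetic features; not the HPNN model or dataset.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(7)
n = 1500
X = np.column_stack([
    rng.integers(50, 2000, n),     # review length in words   -- assumed feature
    rng.uniform(1, 5, n),          # star rating              -- assumed feature
    rng.uniform(0, 1, n),          # readability score        -- assumed feature
])
helpfulness = (0.3 * np.log(X[:, 0]) + 0.2 * (X[:, 1] - 3) ** 2
               + 0.5 * X[:, 2] + 0.1 * rng.normal(size=n))

X_tr, X_te, y_tr, y_te = train_test_split(X, helpfulness, random_state=0)

lin = LinearRegression().fit(X_tr, y_tr)
mlp = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=3000,
                                 random_state=0)).fit(X_tr, y_tr)

for name, model in [("linear regression", lin), ("MLP", mlp)]:
    print(name, "MSE:", round(mean_squared_error(y_te, model.predict(X_te)), 4))
```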

12.
This paper presents a safe design method for control-command embedded systems. It investigates the problem of building control-command systems out of Commercial Off-The-Shelf (COTS) components. The proposed design method uses the formal verification (FV) and Discrete Controller Synthesis (DCS) techniques in synergy. COTS components are formally specified using temporal logic and/or executable observers. New functions are built by assembling COTS components together. As the COTS assembly operation is seldom error-free, behavioral incompatibilities may persist between components. For these reasons, COTS assemblies need to be formally verified, and if errors are found, an automatic correction is attempted using DCS. The control-command code generated by DCS needs hardware-specific post-processing: a structural decomposition, followed by a controllability assessment, followed by a dedicated formal verification step ensuring that no spurious behavior is added by DCS. The resulting system is ready for hardware (e.g. FPGA) implementation.

13.
Land exchange through rental transactions is a central process in agricultural systems. Land tenure regimes emerge from land transactions, and structural and land use changes are tied to the dynamics of the land market. We introduce LARMA, a LAnd Rental MArket model embedded within the Pampas Model (PM), an agent-based model of Argentinean agricultural systems. LARMA produces endogenous formation of land rental prices (LRPs). LARMA relies on traditional economic concepts for LRP formation but addresses some drawbacks of this approach by being integrated into an agent-based model that considers heterogeneous agents interacting with one another. PM-LARMA successfully reproduced the agricultural land tenure regimes and land rental prices observed in the Pampas. Including adaptive, heterogeneous and interacting agents was critical to this success. We conclude that agent-based and traditional economic models can be successfully combined to capture complex emergent land tenure and market price patterns while simplifying the overall model design.

14.
The objective of this paper is to analyze the performance of singular value decomposition (SVD), expectation maximization, and Elman neural networks in optimizing code converter outputs for the classification of epilepsy risk levels from EEG (electroencephalogram) signals. Signal parameters such as the total number of positive and negative peaks, spikes and sharp waves, their duration, etc., were extracted using morphological operators and wavelet transforms. Code converters were considered as a level-one classifier. The code converters were found to have a performance index of 33.26 and a quality value of 12.74, which are low. Consequently, for the EEG signals of 20 patients, the post-classifiers were applied across 3 epochs of 16 channels. After a comparative study of the different architectures, SVD was found to be the best post-classifier, with a performance index of 89.48 and a quality value of 20.62. The Elman neural network also exhibited better performance metrics than SVD in the morphological-operator-based feature extraction method.

15.
Vehicle control systems need to prognosticate future vehicle states in order to improve energy efficiency. This paper compares four approaches used to identify the parameters of a longitudinal vehicle dynamics model for the prediction of vehicle tractive forces. All of the identification approaches build on a standard Kalman filter. Measurement signals are processed using a polynomial function approximation technique to remove noise and compute smooth derivative values of the signals. Experimental results illustrate that the approach using multiple Stenlund–Gustafsson M-Kalman filters (multiple robust and windup-stable Kalman filters) achieves the best performance and robustness in predicting the vehicle tractive forces.
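A minimal sketch of standard-Kalman-filter parameter identification in this spirit: the unknown parameters of a simplified longitudinal model (an effective mass and a lumped resistance force) form the filter state with random-walk dynamics, and are updated from noisy acceleration/force measurements. The model, noise levels, and signals are assumptions, not the paper's four identification approaches.

```python
# Sketch: standard Kalman filter identifying parameters theta = [m_eff, F_res]
# of a simplified longitudinal model  F_traction = m_eff * a + F_res.
# Random-walk parameter dynamics; synthetic measurements; all values are assumptions.
import numpy as np

rng = np.random.default_rng(3)
m_true, F_res_true = 1500.0, 300.0          # kg, N (hypothetical vehicle)
N = 400
a = 2.0 * np.sin(np.linspace(0, 8 * np.pi, N))          # measured acceleration [m/s^2]
F = m_true * a + F_res_true + rng.normal(0, 50.0, N)    # noisy traction force [N]

theta = np.array([1000.0, 0.0])             # initial guess [m_eff, F_res]
P = np.diag([1e6, 1e4])                     # initial covariance
Q = np.diag([1e-2, 1e-1])                   # process noise (parameters drift slowly)
R = 50.0 ** 2                               # measurement noise variance

for k in range(N):
    # Predict: random-walk model theta_k = theta_{k-1} + w
    P = P + Q
    # Update with measurement F[k] = H @ theta + v, where H = [a[k], 1]
    H = np.array([a[k], 1.0])
    S = H @ P @ H + R                       # innovation variance (scalar)
    K = P @ H / S                           # Kalman gain
    theta = theta + K * (F[k] - H @ theta)
    P = P - np.outer(K, H @ P)

print("estimated mass ~", round(theta[0], 1), "kg, resistance ~", round(theta[1], 1), "N")
```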

16.
Experiential training simulators are gaining increasing popularity for job-related training due to their potential to engage and motivate adult learners. They are designed to provide learning experiences that are directly connected to users' work environments and support self-regulated learning. Nevertheless, learners often fail to transfer the knowledge gained in the simulated environment to real-world contexts. The EU-funded ImREAL project aimed to bridge that gap by developing a suite of intelligent services designed to enhance existing training simulators. This paper presents work that was a subset of this research project, reporting the iterative development and evaluation of a scaffolding service, which was integrated into a simulator for training medical students to perform diagnostic interviews. The study comprises three evaluation phases, comparing the pure simulator to a first version with metacognitive scaffolding and then to a final version with affective metacognitive scaffolding and enriched user modelling. The scaffolding service provides the learner with metacognitive prompts; affective elements are realized by an integrated affect reporting tool and affective prompts. Using a mixed-method approach by analysing questionnaires (N = 106) and log-data (N = 426), the effects of the services were investigated with respect to real-world relevance, self-regulated learning support, learning experience, and integration. Despite some limitations, the outcomes of this study demonstrate the usefulness of affective metacognitive scaffolding in the context of experiential training simulators; significant post-simulation increases in perceived relevance of the simulator, reflective note-taking, overall motivation, and feeling of success could be identified. Perceived usability and flow of the simulation increased, whereas overall workload and frustration decreased. However, low response rates to specific functions of the simulation point to a need to further investigate how to raise users' awareness and understanding of the provided tools, to encourage interaction with the services, and to better convey the benefits of using them. Thus, future challenges concern not so much technological developments for personalizing learning experiences, but rather new ways to change user attitudes towards an open approach to learning systems that enables them to benefit from all offered features.

17.
Environmental process modeling is challenged by the lack of high quality data, stochastic variations, and nonlinear behavior. Conventionally, parameter optimization is based on stochastic sampling techniques to deal with the nonlinear behavior of the proposed models. Despite widespread use, such tools cannot guarantee globally optimal parameter estimates. It can be especially difficult in practice to differentiate between lack of algorithm convergence, convergence to a non-global local optimum, and model structure deficits. For this reason, we use a deterministic global optimization algorithm for kinetic model identification and demonstrate it with a model describing a typical batch experiment. A combination of interval arithmetic, reformulations, and relaxations allows globally optimal identification of all (six) model parameters. In addition, the results suggest that further improvements may be obtained by modifying the optimization problem or by proving the hypothesized pseudo-convex nature of the problem.
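To illustrate the basic mechanism behind deterministic global optimisation with interval arithmetic (guaranteed lower bounds on sub-boxes allow provably suboptimal regions to be discarded), here is a toy one-dimensional interval branch-and-bound on a fixed polynomial. The function, tolerance, and naive interval arithmetic are illustrative assumptions; the article's method for kinetic models additionally relies on reformulations and relaxations that are not shown.

```python
# Toy 1-D interval branch-and-bound: globally minimise f(x) = x**4 - 16*x**2 + 5*x
# on [-5, 5] using naive interval arithmetic for guaranteed lower bounds.
# Illustrative only; not the kinetic-model identification method of the article.
import heapq

def imul(a, b):
    """Interval multiplication: a = (lo, hi), b = (lo, hi)."""
    p = [a[0]*b[0], a[0]*b[1], a[1]*b[0], a[1]*b[1]]
    return (min(p), max(p))

def iadd(a, b):
    return (a[0] + b[0], a[1] + b[1])

def iscale(a, c):
    return (c*a[0], c*a[1]) if c >= 0 else (c*a[1], c*a[0])

def f(x):
    return x**4 - 16*x**2 + 5*x

def f_interval(x):
    """Enclosure of f over the interval x (may overestimate, never underestimates)."""
    x2 = imul(x, x)
    x4 = imul(x2, x2)
    return iadd(iadd(x4, iscale(x2, -16.0)), iscale(x, 5.0))

def branch_and_bound(lo, hi, tol=1e-4):
    best = f((lo + hi) / 2)                       # incumbent upper bound
    heap = [(f_interval((lo, hi))[0], lo, hi)]    # boxes ordered by lower bound
    while heap:
        lb, a, b = heapq.heappop(heap)
        if lb > best - tol:
            continue                              # cannot contain a better minimum
        m = (a + b) / 2
        best = min(best, f(m))                    # refine incumbent at the midpoint
        if b - a > tol:
            for c, d in ((a, m), (m, b)):
                lbox = f_interval((c, d))[0]
                if lbox <= best - tol:
                    heapq.heappush(heap, (lbox, c, d))
    return best

print("global minimum value ~", round(branch_and_bound(-5.0, 5.0), 3))
```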

18.
Enhancing evacuee safety is a key factor in reducing the number of injuries and deaths that result from earthquakes. One way this can be achieved is by training occupants. Virtual Reality (VR) and Serious Games (SGs) represent novel techniques that may overcome the limitations of traditional training approaches. VR and SGs have been examined in the fire emergency context; however, their application to earthquake preparedness has not yet been extensively examined. We provide a theoretical discussion of the advantages and limitations of using VR SGs to investigate how building occupants behave during earthquake evacuations and to train building occupants to cope with such emergencies. We explore key design components for developing a VR SG framework: (a) what features constitute an earthquake event; (b) which building types can be selected and represented within the VR environment; (c) how damage to the building can be determined and represented; (d) how non-player characters (NPCs) can be designed; and (e) what level of interaction there can be between NPCs and the human participants. We illustrate the above by presenting Auckland City Hospital, New Zealand, as a case study, and propose a possible VR SG training tool to enhance earthquake preparedness in public buildings.

19.
The European eel Regulation EC 1100/2007 establishes measures to recover the European eel stock. The Regulation requires Member States to guarantee a spawner escapement ≥40% of pristine levels by reducing eel mortality. The complexity and plasticity of eel life history make it difficult to assess the effectiveness of alternative management options, and tools allowing decision makers and fishermen to quickly assess the effectiveness of proposed management scenarios are urgently needed. We used state-of-the-art knowledge to develop user-friendly simulation software that allows users to evaluate whether current management policies meet the conservation target and to assess the expected performance (spawner escapement and fishing yield) of alternative management scenarios. The software relies upon a demographic model explicitly accounting for the most relevant features of eel demography and has default settings for specific geographical areas and water systems. We demonstrate the software by exploring a variety of management plans in three European water systems.

20.
Parallel Computing, 2014, 40(10): 611–627
Work-stealing and work-sharing are two basic paradigms for dynamic task scheduling. This paper introduces an adaptive and hierarchical task scheduling scheme (AHS) for multi-core clusters, in which work-stealing and work-sharing are adaptively used to achieve load balancing. Work-stealing has been widely used in task-based parallel programming languages and models, especially on shared-memory systems. However, high inter-node communication costs hinder work-stealing from being directly performed on distributed-memory systems. AHS addresses this issue with the following techniques: (1) initial partitioning, which reduces inter-node task migrations; (2) a hierarchical scheduling scheme, which performs work-stealing inside a node before going across the node boundary and adopts work-sharing to overlap computation and communication at the inter-node level; and (3) hierarchical and centralized control of inter-node task migration, which improves the efficiency of victim selection and termination detection. We evaluated AHS and existing work-stealing schemes on a 16-node multi-core cluster. Experimental results show that AHS outperforms existing schemes by 11–21.4% for the benchmarks studied in this paper.
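The intra-node part of such a scheme, work-stealing from per-worker deques, can be sketched as below: each worker pops tasks from the tail of its own deque and, when idle, steals from the head of a randomly chosen victim. This is an illustrative shared-memory sketch only; AHS's initial partitioning is mimicked crudely, and its inter-node work-sharing and hierarchical/centralized control are not modelled. Python threads are used for clarity, not performance.

```python
# Minimal shared-memory work-stealing sketch: each worker owns a deque, pops from
# its own tail, and steals from a random victim's head when idle. Illustrative only;
# Python's GIL means no real parallel speedup, and termination detection is naive.
import random
import threading
from collections import deque

NUM_WORKERS = 4
deques = [deque() for _ in range(NUM_WORKERS)]
locks = [threading.Lock() for _ in range(NUM_WORKERS)]
done = threading.Event()
results, results_lock = [], threading.Lock()

def worker(wid):
    while not done.is_set():
        task = None
        with locks[wid]:
            if deques[wid]:
                task = deques[wid].pop()                  # LIFO from own tail
        if task is None:
            victim = random.randrange(NUM_WORKERS)
            if victim != wid:
                with locks[victim]:
                    if deques[victim]:
                        task = deques[victim].popleft()   # FIFO steal from victim's head
        if task is None:
            continue                                      # busy-wait; real schedulers back off
        with results_lock:
            results.append(task * task)                   # "execute" the task

# Initial partitioning: spread tasks over the workers' deques.
for i in range(100):
    deques[i % NUM_WORKERS].append(i)

threads = [threading.Thread(target=worker, args=(w,)) for w in range(NUM_WORKERS)]
for t in threads:
    t.start()
while len(results) < 100:                                 # naive termination detection
    pass
done.set()
for t in threads:
    t.join()
print("completed", len(results), "tasks")
```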
