Similar Documents
20 similar documents found (search time: 15 ms)
1.
Land exchange through rental transactions is a central process in agricultural systems. Land tenure regimes emerge from land transactions, and structural and land-use changes are tied to the dynamics of the land market. We introduce LARMA, a LAnd Rental MArket model embedded within the Pampas Model (PM), an agent-based model of Argentinean agricultural systems. LARMA produces endogenous formation of land rental prices (LRP). LARMA relies on traditional economic concepts for LRP formation but addresses some drawbacks of this approach by being integrated into an agent-based model that considers heterogeneous agents interacting with one another. PM-LARMA successfully reproduced the agricultural land tenure regimes and land rental prices observed in the Pampas. Including adaptive, heterogeneous and interacting agents was critical to this success. We conclude that agent-based and traditional economic models can be successfully combined to capture complex emergent land tenure and market price patterns while simplifying the overall model design.
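To illustrate how a rental price can form endogenously from heterogeneous agents, here is a minimal, hypothetical double-auction sketch. This is not LARMA's actual mechanism; the function name and the midpoint pricing rule are illustrative assumptions only.

```python
def clearing_rent(bids, asks):
    """Match the highest-bidding tenants with the lowest-asking
    landowners; the clearing rent is the midpoint of the last
    compatible bid/ask pair (None if no trade is possible)."""
    bids, asks = sorted(bids, reverse=True), sorted(asks)
    rent = None
    for bid, ask in zip(bids, asks):
        if bid >= ask:
            rent = (bid + ask) / 2  # marginal matched pair sets the price
        else:
            break
    return rent
```

Because the price emerges from whichever bids and asks the agent population happens to generate, heterogeneity in agents directly shapes the resulting rental price, which is the intuition behind embedding the market in an agent-based model.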

2.
In this study we expanded a recently developed approach for defining acceptable levels of management policy that allow sustainable management of water quality in a lake ecosystem. A three-dimensional solution space was created to define all acceptable scenarios of N loads, P loads and lake water level (WL), providing an integrated tool for defining the extent of measures that allow lake ecosystem sustainability. The approach included use of a lake ecosystem model, a quantitative system of composite water quality indices (CWQIs) and defined sustainability criteria for the ecosystem. It was tested on the Lake Kinneret (Sea of Galilee) ecosystem and succeeded in defining the range of acceptable management policy through long-term simulations of different scenarios. Using the scenario results, a number of polygons were created, defined as relative solution domain areas (RSDA), which denote the permissible ranges of nutrient loads at different water levels. The polygon, and hence RSDA, boundaries represent critical values of nutrient loads allowing conservation of lake water quality at each WL. By integrating all RSDAs, a three-dimensional solution space was created that defines all acceptable ranges of N loads, P loads and WL, providing lake managers with an integrated tool for setting the extent of measures that will allow sustainability of the lake ecosystem. This approach presents an example of a management tool that integrates an ecosystem model, multiple stressors and quantified water quality indices to determine limits of management actions, and may well be applied to other lakes around the world suffering from water quality deterioration as a result of changes in water level and nutrient loads.
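The core screening idea can be sketched as a grid scan that keeps only the scenarios whose composite water-quality index clears a sustainability threshold. The index function and threshold below are toy assumptions, not the Lake Kinneret CWQIs.

```python
def acceptable_scenarios(n_loads, p_loads, water_levels, cwqi, threshold):
    """Keep only those (N load, P load, water level) combinations whose
    composite water-quality index meets the sustainability threshold."""
    return [(n, p, wl)
            for n in n_loads
            for p in p_loads
            for wl in water_levels
            if cwqi(n, p, wl) >= threshold]

def toy_cwqi(n, p, wl):
    """Hypothetical index: quality degrades with nutrient loads and
    improves with water level."""
    return 1.0 - 0.02 * n - 0.04 * p + 0.01 * wl
```

Plotting the retained tuples at each water level would trace out exactly the kind of permissible-load polygons (RSDAs) the abstract describes, stacked into a 3D solution space.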

3.
The flood disaster management system (FDMS) is a platform developed to provide ongoing disaster reduction capabilities that cover the entire process of flood management. An ontology-based approach links environmental models with disaster-related data, allowing FDMS to construct workflows with suitable models and automatically recommend appropriate datasets as model input. This automation of model selection and data binding reduces the time-consuming and unreliable operations of traditional management techniques, which rely on manual retrieval through simple metadata indices, typically at the very moment flood management personnel are overwhelmed with large quantities of observed data. The OpenMI-based modular design used in the system unifies interfaces and data exchange to provide a flexible and extensible architecture. Subsequent 3D visualization improves the interpretability of disaster data and the effectiveness of decision-making processes. This paper presents an overview of the design and capabilities of FDMS, which provides one-stop management for flood disasters.

4.
5.
This study focuses on methodologically designing an optimisation-based control for a sewer system and linking it to a regulatory control. The optimisation-based design is found to depend on proper choice of a model, formulation of the objective function and tuning of optimisation parameters. Accordingly, two novel optimisation configurations are developed, where the optimisation acts either on the actuators or on the regulatory control layer. These two optimisation designs are evaluated on a sub-catchment of the sewer system in Copenhagen, and found to perform better than the existing control, a rule-based expert system. On the other hand, compared with a regulatory control technique designed earlier in Mollerup et al. (2015), the optimisation showed similar performance with respect to minimising overflow volume. Hence, for operation of small sewer systems, regulatory control strategies can offer promising potential and should be considered alongside more advanced strategies when identifying novel solutions.

6.
Research has shown that students tend to engage in quality learning when they are asked to teach (i.e., learning-by-teaching). In this study, a web-based tutoring environment, the Virtual Tutee System (VTS), was developed to enhance the learning of college students by having them teach others. In the VTS, students take the role of tutor and teach a virtual character what they learn from readings. The design of the VTS has been refined through several iterations of formative evaluation. The current study explored whether the recent improvements made in the VTS augmented the learning-by-teaching effects. The VTS was evaluated with regard to its effects on students' reading engagement and reading performance. Results indicated that students were behaviorally and cognitively engaged in reading when using the VTS. The study also found a significant improvement in students' emotional engagement in reading after using the VTS. Limited support was found for enhancement of reading performance with repeated use of the VTS. Implications of the study findings are discussed, and suggestions for future research are provided.

7.
Selection of strategies that help reduce riverine inputs requires numerical models that accurately quantify hydrologic processes. While numerous models exist, information on how to evaluate and select the most robust models is limited. Toward this end, we developed a comprehensive approach for evaluating watershed models in their ability to simulate flow regimes critical to downstream ecosystem services. We demonstrated the method using the Soil and Water Assessment Tool (SWAT), the Hydrological Simulation Program–FORTRAN (HSPF) model, and the Distributed Large Basin Runoff Model (DLBRM) applied to the Maumee River Basin (USA). The approach showed that each model simulated flows within acceptable ranges. However, each was limited in its ability to simulate flows triggered by extreme weather events, owing to algorithms not being optimized for such events and mismatched physiographic watershed conditions. Ultimately, we found HSPF to best predict river flow, whereas SWAT offered the most flexibility for evaluating agricultural management practices.
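The abstract does not name its goodness-of-fit metrics, but a widely used one for judging simulated flows against observations is the Nash–Sutcliffe efficiency; a minimal sketch, offered here only as a representative example of such an evaluation criterion:

```python
def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 means a perfect fit, 0 means the
    model predicts no better than the mean observed flow, and negative
    values mean it does worse than that mean."""
    mean_obs = sum(observed) / len(observed)
    err = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - err / var
```

Comparing such scores across SWAT, HSPF and DLBRM on the same gauged flows is the kind of head-to-head evaluation the approach formalizes.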

8.
Experiential training simulators are gaining increasing popularity for job-related training due to their potential to engage and motivate adult learners. They are designed to provide learning experiences that are directly connected to users' work environments and support self-regulated learning. Nevertheless, learners often fail to transfer the knowledge gained in the simulated environment to real-world contexts. The EU-funded ImREAL project aimed to bridge that gap by developing a suite of intelligent services designed to enhance existing training simulators. This paper presents work that was a subset of this research project, reporting the iterative development and evaluation of a scaffolding service, which was integrated into a simulator for training medical students to perform diagnostic interviews. The study comprises three evaluation phases, comparing the pure simulator to a first version with metacognitive scaffolding and then to a final version with affective metacognitive scaffolding and enriched user modelling. The scaffolding service provides the learner with metacognitive prompts; affective elements are realized by an integrated affect reporting tool and affective prompts. Using a mixed-method approach by analysing questionnaires (N = 106) and log-data (N = 426), the effects of the services were investigated with respect to real-world relevance, self-regulated learning support, learning experience, and integration. Despite some limitations, the outcomes of this study demonstrate the usefulness of affective metacognitive scaffolding in the context of experiential training simulators; significant post-simulation increases in perceived relevance of the simulator, reflective note-taking, overall motivation, and feeling of success could be identified. Perceived usability and flow of the simulation increased, whereas overall workload and frustration decreased. 
However, low response rates to specific functions of the simulation point to a need to further investigate how to raise users' awareness and understanding of the provided tools, to encourage interaction with the services, and to better convey the benefits of using them. Thus, future challenges concern not so much technological developments for personalizing learning experiences, but rather new ways to change user attitudes towards an open approach to learning systems that enables them to benefit from all offered features.

9.
Agent-based modeling (ABM) is an established technique for capturing human-environment interactions in socio-ecological systems. As a micro-level model, it explicitly represents each agent, such that heterogeneous decision-making processes (e.g. based on the beliefs and experiences of stakeholders) can anticipate the socio-environmental consequences of aggregated individual behaviors. In contrast to ABM, Fuzzy Cognitive Mapping (FCM) takes a macro-level view of the world, representing causal connections between concepts rather than individual entities. Researchers have expressed interest in reconciling the two, i.e. taking a hybrid approach that draws on the strengths of each to more accurately model socio-ecological interactions. The intuition is to take FCMs, which can be developed quickly using participatory modeling tools, and use them to create a virtual population of agents with sophisticated decision-making processes. In this paper, we detail two ways in which this combination can be done, and highlight the key questions that modelers need to be mindful of.
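The macro-level FCM dynamics the paper builds on can be shown in a few lines: concept activations are repeatedly updated through a weighted causal matrix and a squashing function. This is the standard synchronous FCM update with a logistic squash, shown as a generic sketch rather than the paper's specific formulation.

```python
import math

def fcm_step(state, weights):
    """One synchronous Fuzzy Cognitive Map update: each concept's new
    activation is a logistic squash of the weighted sum of the causal
    influences flowing into it (weights[j][i] = effect of j on i)."""
    n = len(state)
    return [1.0 / (1.0 + math.exp(-sum(weights[j][i] * state[j]
                                       for j in range(n))))
            for i in range(n)]
```

In the hybrid approach, one such map (elicited from stakeholders) could serve as the decision engine inside each agent of an ABM, which is the coupling the paper examines.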

10.
The purpose of this study was to reveal barriers encountered by Turkish primary school teachers in the integration of ICT, to propose potential enablers to overcome those barriers, and to compare the current status of ICT integration (in 2011) with the status of ICT integration in 2005. Part of the data for this comparison was gathered in 2005 as part of a doctoral study by Goktas (2006). A survey design was used to investigate the barriers and enablers. Data were collected from 1373 teachers from 52 schools in 39 provinces. The results indicate that ‘lack of hardware’, ‘lack of appropriate software materials’, ‘limitations of hardware’, ‘lack of in-service training’, and ‘lack of technical support’ were the most important barriers. The highest ranked enablers were ‘allocation of more budget’, ‘allocation of specific units for peer support’, ‘allocation of support offices and personnel for teachers’, and ‘offering higher quality pre-service training for ICT’. Other leading enablers were ‘supporting teachers to enable effective ICT use’, ‘having technology plans’, ‘offering higher quality and more quantity of in-service training’, and ‘designing appropriate course content/instructional programs’. Independent t-tests revealed that most barriers showed significant differences, and most enablers moderate or low differences, between teachers' perceptions of their situation in 2005 and in 2011.
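The 2005-vs-2011 comparison rests on independent-samples t-tests. As a minimal sketch of the underlying statistic, here is Welch's t (the unequal-variances variant; degrees of freedom and p-value are omitted, and the abstract does not specify which variant was used):

```python
import math

def welch_t(a, b):
    """Welch's t statistic for two independent samples with possibly
    unequal variances; larger |t| means a bigger mean difference
    relative to sampling noise."""
    mean_a, mean_b = sum(a) / len(a), sum(b) / len(b)
    var_a = sum((x - mean_a) ** 2 for x in a) / (len(a) - 1)
    var_b = sum((x - mean_b) ** 2 for x in b) / (len(b) - 1)
    return (mean_a - mean_b) / math.sqrt(var_a / len(a) + var_b / len(b))
```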

11.
This paper proposes an approach for Inertial Measurement Unit sensor fault reconstruction by exploiting a ground speed-based kinematic model of the aircraft flying in a rotating earth reference system. Two strategies for the validation of sensor fault reconstruction are presented: closed-loop validation and open-loop validation. Both strategies use the same kinematic model and a newly-developed Adaptive Two-Stage Extended Kalman Filter to estimate the states and faults of the aircraft. Simulation results demonstrate the effectiveness of the proposed approach compared to an approach using an airspeed-based kinematic model. Furthermore, a major contribution is that the proposed approach is validated using real flight test data including external disturbances such as turbulence. Three flight scenarios are selected to test the performance of the proposed approach. It is shown that the proposed approach is robust to model uncertainties, unmodeled dynamics and disturbances such as time-varying wind and turbulence. Therefore, the proposed approach can be incorporated into aircraft Fault Detection and Isolation systems to enhance the performance of the aircraft.
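The paper's Adaptive Two-Stage Extended Kalman Filter is far beyond a short sketch, but the measurement-update idea it extends can be shown in scalar form. This toy update is illustrative only and is not the filter the paper develops.

```python
def kalman_update(x_pred, p_pred, z, r):
    """Scalar Kalman measurement update: blend the model prediction
    x_pred (variance p_pred) with a sensor reading z (variance r).
    A persistent mismatch between prediction and reading is the
    signature that sensor fault reconstruction schemes exploit."""
    gain = p_pred / (p_pred + r)          # trust in the measurement
    x_new = x_pred + gain * (z - x_pred)  # corrected state estimate
    p_new = (1.0 - gain) * p_pred         # reduced uncertainty
    return x_new, p_new
```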

12.
The paper presents an event driven model predictive control (MPC) framework for managing charging operations of electric vehicles (EV) in a smart grid. The objective is to minimize the cost of energy consumption, while respecting EV drivers' preferences, technical bounds on the control action (in compliance with the IEC 61851 standard) and both market and grid constraints (by seeking the tracking of a reference load profile defined by the grid operator). The proposed control approach allows “flexible” EV users to participate in demand side management (DSM) programs, which will play a crucial role in improving stability and efficiency of future smart grids. Further, the natural MPC formulation of the problem can be recast into a mixed integer linear programming problem, suitable for implementation on a calculator. Simulation results are provided and discussed in detail.
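For intuition about cost-minimizing charging, here is a deliberately simplified greedy sketch that fills the cheapest hours first. It ignores the paper's grid-tracking constraints, driver preferences and the mixed-integer structure; it is not the paper's MPC formulation, only the price-responsive core idea.

```python
def charge_schedule(prices, energy_needed, max_rate):
    """Allocate charging energy to the cheapest hours first, up to
    max_rate per hour, until the driver's energy demand is met."""
    schedule = [0.0] * len(prices)
    for hour in sorted(range(len(prices)), key=lambda h: prices[h]):
        if energy_needed <= 0:
            break
        amount = min(max_rate, energy_needed)
        schedule[hour] = amount
        energy_needed -= amount
    return schedule
```

With coupling constraints such as a reference load profile, this greedy rule no longer suffices, which is why the full problem is posed as a mixed-integer linear program.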

13.
Military helicopter pilots are expected to wear a variety of items of body-borne equipment during flight so as to be prepared for any situation that may arise in combat. Helicopter seats are designed to a specified weight range for an occupant with equipment. This paper investigates how distributing the equipment on the body affects injury potential during a helicopter crash. A finite element model representing a helicopter seat with a fully deformable 50th-percentile Hybrid III dummy carrying equipment was developed. The model was subjected to a standard military certification crash test. Various equipment configurations were investigated and analysed to determine their influence on the risk of injury. It was found that placing the equipment low on the torso, i.e. near the thighs, not only reduces the likelihood of injury in the lumbar spinal region but also yields favourable neck and head injury risk compared to the other configurations investigated. In contrast, placing equipment high on the torso, i.e. close to the chin, increases the lumbar load and, implicitly, the risk of head injury. A statistical analysis using the Wilcoxon signed-rank test estimates the probability of loads falling within a certain interval. This study recommends an equipment configuration that improves survivability for an occupant seated on a fixed-load energy-absorbing seat subjected to Military Standard 58095A Test 4.
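The Wilcoxon signed-rank test used here compares paired load measurements without assuming normality. A minimal sketch of its test statistic W (the smaller of the positive- and negative-rank sums; critical-value lookup omitted):

```python
def wilcoxon_w(x, y):
    """Wilcoxon signed-rank statistic for paired samples: rank the
    absolute differences (zeros dropped, tied ranks averaged) and
    return the smaller of the positive- and negative-rank sums."""
    diffs = [a - b for a, b in zip(x, y) if a != b]
    ordered = sorted(abs(d) for d in diffs)

    def avg_rank(value):
        positions = [i for i, v in enumerate(ordered, start=1) if v == value]
        return sum(positions) / len(positions)

    w_plus = sum(avg_rank(abs(d)) for d in diffs if d > 0)
    w_minus = sum(avg_rank(abs(d)) for d in diffs if d < 0)
    return min(w_plus, w_minus)
```

A small W relative to the critical value indicates the two equipment configurations produce systematically different loads.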

14.
This work studies the relation between computer use for reading activities and academic literacy among 15-year-old students in Chile, Uruguay, Spain, and Portugal. The data come from the PISA 2009 test. Special attention is given to potential bias problems when computer use is an endogenous variable. Few studies in this area address this issue: the existing literature has shown that different types of computer use have different implications for performance, and the limitations of observational data for establishing cause–effect relations between computer use and academic performance have also been emphasized. It is important, however, to consider the endogeneity of computer use (above all at home), since students decide on the frequency of computer use at home. The results show that, controlling for endogeneity, computer use for reading is not related to reading performance in either digital or printed format, with the exception of Chile, which shows a negative relation for reading in printed format. The results differ considerably from those obtained when endogeneity is not taken into account. The work shows the relevance of experimental-type studies for making sound statements about the relation between computer use and academic performance. In turn, school reading activities in a digital environment are suggested that could have an impact on reading performance.

15.
The range and quality of freely available geo-referenced datasets is increasing. We evaluate the usefulness of free datasets for deforestation prediction by comparing generalised linear models and generalised linear mixed models (GLMMs) with a variety of machine learning models (Bayesian networks, artificial neural networks and Gaussian processes) across two study regions. Freely available datasets were able to generate plausible risk maps of deforestation using all techniques for study zones in both Mexico and Madagascar. Artificial neural networks outperformed GLMMs in the Madagascan study zone (average AUC 0.83 vs 0.80) but not in the Mexican one (average AUC 0.81 vs 0.89). In Mexico and Madagascar, Gaussian processes (average AUC 0.89, 0.85) and structured Bayesian networks (average AUC 0.88, 0.82) performed at least as well as GLMMs (average AUC 0.89, 0.80). Bayesian networks produced more stable results across different sampling methods. Gaussian processes performed well (average AUC 0.85) with fewer predictor variables.
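The AUC values the abstract reports can be computed directly from model scores via the rank-sum definition of ROC AUC; a minimal sketch (generic, not the study's evaluation code):

```python
def auc(pos_scores, neg_scores):
    """ROC AUC via its rank-sum definition: the probability that a
    randomly drawn positive example (here, a deforested cell) is scored
    above a randomly drawn negative one, counting ties as half."""
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))
```

An AUC of 0.5 corresponds to random ranking and 1.0 to perfect separation, which frames the 0.80-0.89 range reported for these models.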

16.
Model constructs in environmental models are seldom reused beyond the project lifetime or in other modelling studies. A library of reusable model components could facilitate the maintenance of existing models and make the design of new models more efficient. Although component-based design is the common standard in software engineering and manufacturing, few examples are yet found in environmental science. The multi-disciplinary project SPICOSA used a common, component-based simulation framework for environmental modelling, based on 18 case studies across Europe. The development of high-quality model components with potential for reuse turned out to be a challenge despite the guidelines and tutorial examples provided. Well-designed components are of appropriate granularity, encapsulated, with a limited use of connectors and proper data handling. Ultimately, the success of a model library depends on a sufficient set of quality components with complementary functionalities, a framework for quality control, and support of the environmental modelling community.

17.
Causal explanation and empirical prediction are usually addressed separately when modelling ecological systems. This potentially leads to erroneous conflation of model explanatory and predictive power, to predictive models that lack ecological interpretability, or to limited feedback between predictive modelling and theory development. These are fundamental challenges to appropriate statistical and scientific use of ecological models. To help address such challenges, we propose a novel, integrated modelling framework which couples explanatory modelling for causal understanding and input variable selection with a machine learning approach for empirical prediction. Exemplar datasets from the field of freshwater ecology are used to develop and evaluate the framework, based on 267 stream and river monitoring stations across England, UK. These data describe spatial patterns in benthic macroinvertebrate community indices that are hypothesised to be driven by meso-scale physical and chemical habitat conditions. Whilst explanatory models developed using structural equation modelling performed strongly (r2 for two macroinvertebrate indices = 0.64–0.70), predictive models based on extremely randomised trees demonstrated moderate performance (r2 for the same indices = 0.50–0.61). However, through coupling explanatory and predictive components, our proposed framework yields ecologically-interpretable predictive models which also maintain the parsimony and accuracy of models based on solely predictive approaches. This significantly enhances the opportunity for feedback among causal theory, empirical data and prediction within environmental modelling.

18.
In recent years, with increased opportunities to post content on social media, a number of users are experiencing information overload in relation to social media use. This study addresses how Japanese Twitter users suffering from information overload cope with their stress, focusing on two actions: (1) The “unfriending” activities and (2) The changes in tweet processing methods. Objective data, such as numbers of friends, were collected through Twitter's open Application Programming Interfaces (APIs), and subjective data, such as perceived information overload and tweet processing methods, were collected through a web-based survey as a panel dataset (n = 778). The results demonstrated that although users experience information overload, they continue to increase their number of friends, and that the users who experience information overload modify their usage habits to avoid seeing all received tweets. In short, users do not choose a strategy to reduce the absolute number of received tweets, but only a strategy that involves changing the processing method of the received tweets.

19.
Context: Critical systems in domains such as aviation, railway, and automotive are often subject to a formal process of safety certification. The goal of this process is to ensure that these systems will operate safely without posing undue risks to the user, the public, or the environment. Safety is typically ensured via complying with safety standards. Demonstrating compliance to these standards involves providing evidence to show that the safety criteria of the standards are met.
Objective: In order to cope with the complexity of large critical systems and subsequently the plethora of evidence information required for achieving compliance, safety professionals need in-depth knowledge to assist them in classifying different types of evidence, and in structuring and assessing the evidence. This paper is a step towards developing such a body of knowledge that is derived from a large-scale empirically rigorous literature review.
Method: We use a Systematic Literature Review (SLR) as the basis for our work. The SLR builds on 218 peer-reviewed studies, selected through a multi-stage process, from 4963 studies published between 1990 and 2012.
Results: We develop a taxonomy that classifies the information and artefacts considered as evidence for safety. We review the existing techniques for safety evidence structuring and assessment, and further study the relevant challenges that have been the target of investigation in the academic literature. We analyse commonalities in the results among different application domains and discuss implications of the results for both research and practice.
Conclusion: The paper is, to our knowledge, the largest existing study on the topic of safety evidence. The results are particularly relevant to practitioners seeking a better grasp on evidence requirements as well as to researchers in the area of system safety. As a major finding of the review, the results strongly suggest the need for more practitioner-oriented and industry-driven empirical studies in the area of safety certification.

20.
Environmental process modeling is challenged by the lack of high quality data, stochastic variations, and nonlinear behavior. Conventionally, parameter optimization is based on stochastic sampling techniques to deal with the nonlinear behavior of the proposed models. Despite widespread use, such tools cannot guarantee globally optimal parameter estimates. It can be especially difficult in practice to differentiate between lack of algorithm convergence, convergence to a non-global local optimum, and model structure deficits. For this reason, we use a deterministic global optimization algorithm for kinetic model identification and demonstrate it with a model describing a typical batch experiment. A combination of interval arithmetic, reformulations, and relaxations allows globally optimal identification of all (six) model parameters. In addition, the results suggest that further improvements may be obtained by modification of the optimization problem or by proof of the hypothesized pseudo-convex nature of the problem suggested by our results.
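The interval arithmetic the abstract mentions is what lets a deterministic global optimizer bound a function's range over a whole parameter box and discard boxes that cannot contain the optimum. A minimal sketch of the two basic operations (the full machinery also needs division, monotone functions, and outward rounding):

```python
class Interval:
    """Closed interval [lo, hi] with arithmetic rules that make
    f(Interval) a guaranteed enclosure of f's range on that box."""

    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        # Sum of two intervals: endpoints add directly.
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # Product: the extremes occur at some pair of endpoints.
        corners = [self.lo * other.lo, self.lo * other.hi,
                   self.hi * other.lo, self.hi * other.hi]
        return Interval(min(corners), max(corners))
```

In a branch-and-bound loop, a box whose enclosed lower bound on the objective exceeds the best value found so far can be pruned with certainty, which is what distinguishes this from stochastic sampling.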


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号