Similar Literature
20 similar records found (search time: 31 ms)
1.
2.
Land change models are increasingly employed to predict future landscapes and to inform policy and decision-making. To ensure the highest model accuracy, validation following a land change simulation has become commonplace. The most common validation method uses quantity and allocation disagreement. However, these measures may not account for differences in the configuration of land change, placing them in potential conflict with the principles of heterogeneity and spatial patterning in landscape ecology. We develop a new metric, termed configuration disagreement, designed to capture the size, shape, and complexity of simulated land change. Using this metric, we demonstrate the value of including errors of configuration disagreement, in addition to quantity and allocation error, in the assessment of land change models. Four computational experiments of land change that vary only in spatial pattern are developed using the FUTURES land change model. For each experiment, configuration disagreement and the traditional validation metrics are computed simultaneously. Results indicate that models validated only with respect to quantity and allocation error may misrepresent, or not fully account for, spatial patterns of landscape change. The research objective will ultimately guide which component, or components, of model disagreement are most critical. Nevertheless, our work shows why it may be more helpful to validate simulations in terms of configuration accuracy, particularly when a study requires accurate modeling of the spatial patterns and arrangements of land cover. Configuration disagreement can add critical information about a model's simulated changes in size, shape, and spatial arrangement, and may enhance ecologically meaningful land change science.
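The quantity/allocation split that the paper builds on can be computed directly from two categorical maps. The sketch below (function name is mine, not from the paper) follows the standard Pontius-Millones decomposition: quantity disagreement is the mismatch in per-category totals, and allocation disagreement is the spatial remainder of the total disagreement.

```python
def quantity_allocation_disagreement(sim, ref):
    """Split the total disagreement between a simulated and a reference
    categorical map into a quantity component (mismatch in category
    totals) and an allocation component (mismatch in where the
    categories sit), after Pontius & Millones (2011)."""
    sim = [c for row in sim for c in row]
    ref = [c for row in ref for c in row]
    n = len(sim)
    cats = set(sim) | set(ref)
    # proportion of cells that disagree overall
    total = sum(s != r for s, r in zip(sim, ref)) / n
    # quantity disagreement: mismatch in per-category proportions
    quantity = sum(abs(sim.count(c) - ref.count(c)) for c in cats) / (2 * n)
    return quantity, total - quantity

ref = [[0, 0, 1, 1],
       [0, 0, 1, 1]]
sim = [[0, 1, 0, 1],
       [0, 0, 1, 1]]
q, a = quantity_allocation_disagreement(sim, ref)
```

Here both maps contain the same category totals, so quantity disagreement is zero and the entire 25% disagreement is allocational, exactly the kind of case the paper argues also needs a configuration check.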

3.
Spatiotemporal Characteristics of LUCC in Zoige County Based on Landsat TM Data
Zoige County comprises the core of the world-renowned Zoige Wetland and is a typical representative of the alpine ecosystem of the Qinghai-Tibet Plateau. Based on land use/land cover classifications from three Landsat TM images (1989, 1997, and 2004), GIS spatial analysis and mathematical statistics were used to analyze the quantitative and spatial change characteristics of each land use/cover type, particularly grassland and marsh, in Zoige County, Sichuan Province over the past 15 years. The results show: (1) The main land use/cover types in the study area are grassland, marsh, forest, and bare land; grassland and marsh areas decreased steadily, while the area of bare land increased severalfold. (2) Models of LUCC magnitude and of quantitative and spatial LUCC change, together with a trend-and-state index model, effectively characterized the spatiotemporal features of LUCC in the study area. For the region as a whole, the integrated LUCC trend-and-state index was 0.37 in the earlier period, a quasi-equilibrium state; it fell to 0.23 in the later period, an equilibrium state; over the full period it was 0.35, a quasi-equilibrium state, indicating bidirectional conversion. (3) Location-based analysis revealed severe degradation of grassland and marsh, with the degraded areas shifting spatially between the two periods.
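In one common LUCC formulation, the trend-and-state index of a class is the ratio of net change to total change: values near 0 indicate balanced two-way conversion (equilibrium) and values near 1 indicate one-way conversion. A minimal sketch under that assumption (the paper's exact formula and state thresholds may differ):

```python
def trend_state_index(gains, losses):
    """Trend-and-state index for one land use/cover type: net change
    divided by total change. Near 0: balanced, bidirectional conversion
    (equilibrium). Near 1: one-way conversion (non-equilibrium)."""
    if gains + losses == 0:
        return 0.0
    return abs(gains - losses) / (gains + losses)
```

For example, a class gaining 70 km² while losing 30 km² has an index of 0.4, while equal gains and losses give 0.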

4.
Land use/cover change (LUCC) is a major indicator of the impact of climate change and human activity, particularly in the Sahel, where the land cover has changed greatly over the past 50 years. Aerial and satellite sensors have been taking images of the Earth's surface for several decades. These data have been widely used to monitor LUCC, but many questions remain concerning what type of pre-processing should be carried out on image resolutions and which methods are most appropriate for successfully mapping patterns and dynamics in both croplands and natural vegetation. This study considers these methodological questions. It uses multi-source imagery from 1952 to 2003 (aerial photographs, Corona, Landsat Multispectral Scanner (MSS), Landsat Thematic Mapper (TM) and Satellite Pour l'Observation de la Terre (SPOT) 5 images) and pursues two objectives: (i) to implement and compare a number of processing chains on the basis of multi-sensor data, in order (ii) to accurately track and quantify LUCC in a 100 km2 Sahelian catchment over 50 years. The heterogeneity of the spatial and spectral resolution of the images led us to compare post-classification methods aimed at producing coherent diachronic maps based on a common land-cover nomenclature. Three main approaches were tested: pixel-based classification, vector grid-based on-screen interpretation and object-oriented classification. Within the automated approaches, we also examined the influence of spectral synthesis and spatial homogenization of the data through the use of composite bands (principal component analysis (PCA) and indices) and by resampling images at a common resolution. Classification accuracy was estimated by computing confusion matrices, by analysing overall change in the relative areas of land use/cover types and by studying the geographical coherence of the changes. 
These analyses indicate that on-screen interpretation is the most suitable approach for producing coherent, valid results from the multi-source images available over the study period, although satisfactory classifications are also obtained with the pixel-based and object-oriented approaches. The results also show significant sensitivity, depending on the method considered, to the combinations of bands used and to resampling. Lastly, the 50-year trends in LUCC reveal a large increase in croplands and erosional surfaces with sparse vegetation, and a drastic reduction in woody cover.
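The confusion-matrix accuracy measures used to compare the processing chains can be sketched in a few lines; overall accuracy is the diagonal share, and Cohen's kappa corrects it for chance agreement from the marginals (helper names below are mine):

```python
def confusion_matrix(ref, pred, n_classes):
    """Cross-tabulate reference vs. predicted class labels."""
    m = [[0] * n_classes for _ in range(n_classes)]
    for r, p in zip(ref, pred):
        m[r][p] += 1
    return m

def overall_accuracy(m):
    """Share of samples on the diagonal of the confusion matrix."""
    n = sum(sum(row) for row in m)
    return sum(m[i][i] for i in range(len(m))) / n

def cohen_kappa(m):
    """Observed agreement corrected for the agreement expected by
    chance from the row/column marginals."""
    n = sum(sum(row) for row in m)
    po = sum(m[i][i] for i in range(len(m))) / n
    rows = [sum(row) for row in m]
    cols = [sum(m[i][j] for i in range(len(m))) for j in range(len(m))]
    pe = sum(r * c for r, c in zip(rows, cols)) / n ** 2
    return (po - pe) / (1 - pe)
```

With six validation samples and one error, overall accuracy is 5/6 while kappa drops to 0.75, illustrating why both are usually reported together.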

5.
Current software cost estimation models, such as the 1981 Constructive Cost Model (COCOMO) for software cost estimation and its 1987 Ada COCOMO update, have been experiencing increasing difficulties in estimating the costs of software developed to new life cycle processes and capabilities. These include non-sequential and rapid-development process models; reuse-driven approaches involving commercial off-the-shelf (COTS) packages, re-engineering, applications composition, and applications generation capabilities; object-oriented approaches supported by distributed middleware; and software process maturity initiatives. This paper summarizes research in deriving a baseline COCOMO 2.0 model tailored to these new forms of software development, including rationale for the model decisions. The major new modeling capabilities of COCOMO 2.0 are a tailorable family of software sizing models, involving Object Points, Function Points, and Source Lines of Code; nonlinear models for software reuse and re-engineering; an exponent-driver approach for modeling relative software diseconomies of scale; and several additions, deletions and updates to previous COCOMO effort-multiplier cost drivers. This model is serving as a framework for an extensive current data collection and analysis effort to further refine and calibrate the model's estimation capabilities.
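The COCOMO 2.0 effort equation combines a size term, an exponent built from summed scale factors (the "exponent-driver" idea for diseconomies of scale), and a product of effort multipliers. A sketch, using the later published COCOMO II.2000 calibration constants A = 2.94 and B = 0.91 as placeholder defaults (the baseline model described in this paper used different values):

```python
import math

def cocomo2_effort(ksloc, scale_factors, effort_multipliers,
                   a=2.94, b=0.91):
    """COCOMO II-style effort estimate in person-months.
    Diseconomies of scale enter through an exponent built from the
    summed scale factors, rather than through a fixed exponent per
    development mode as in COCOMO 81."""
    e = b + 0.01 * sum(scale_factors)
    em = math.prod(effort_multipliers)
    return a * ksloc ** e * em
```

With nonzero scale factors the exponent exceeds 1, so a tenfold increase in size costs more than tenfold effort, which is the behavior the exponent-driver approach is meant to capture.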

6.
Context: Constructing bespoke software development methodologies for specific project situations has become a crucial need, giving rise to Situational Method Engineering (SME). Compared with Software Engineering, SME still has a long way to go; SME approaches are especially deficient in their support for modeling, portability, and automation. Model-Driven Development (MDD) has been used effectively to address these issues in Software Engineering, and is also considered a promising approach for resolving them in SME.
Objective: This paper aims to address the shortcomings of existing SME approaches by introducing a novel MDD approach, specifically intended for SME purposes, that uses a pattern-based approach for model transformation.
Method: Developing an MDD approach for SME requires that a modeling framework, consisting of modeling levels, be defined for modeling software development methodologies. Transformation patterns should also be specified for converting the models from one level to the next. A process should then be defined for applying the framework and transformation patterns to real SME projects. The resulting MDD approach requires proper evaluation to demonstrate its applicability.
Results: A framework and a semi-automated process have been proposed that adapt pattern-based model transformation techniques for application to the methodology models used in SME. The transformation patterns have been implemented in the Medini-QVT model transformation tool, along with two supplementary method bases: one for mapping the situational factors of SME projects to requirements, and the other for mapping the requirements to method fragments. The method engineer can produce the methodology models by using the method bases and executing the transformation patterns via the tool.
Conclusion: The validity of the proposed approach has been assessed against dedicated evaluation criteria and through application to a real-world project. The evaluation results indicate that the proposed approach addresses the deficiencies of existing approaches and satisfies the practicality requirements of SME approaches.

7.
Software architecture is a software system's earliest set of design decisions, and these are critical for achieving the quality desired by the stakeholders. The architecture makes it easier to reason about and manage change during the different phases of a complex software life cycle. Modeling the software architecture of a System of Systems (SoS) is challenging because of the complexity arising from an integration of heterogeneous, distributed, managerially and operationally independent systems collaborating to achieve global missions. SoS is essentially dynamic and evolutionary by design, requiring suitable architectural patterns to deal with runtime volatility. Service-oriented architecture offers several architectural features to these complex systems, including interoperability, loose coupling, abstraction, and the provision of dynamic services based on standard interfaces and protocols. Some research work provides critical analysis of current software architecture modeling approaches for SoS. However, none of it outlines the important characteristics of SoS or provides a detailed analysis of how current service-oriented architecture modeling approaches model those characteristics. This article addresses this research gap and provides a taxonomy of software architecture modeling approaches, comparing and contrasting them using criteria critical for the realization of SoS. Additionally, research gaps are identified, and future directions are outlined for building software architecture for SoS to model and reason about architecture quality more efficiently in the service-oriented paradigm.

8.
9.
A tutorial on dependability and performance-related dependability models for multiprocessors is presented. Multiprocessors are classified as having shared-memory or distributed-memory architectures, and some fundamental dependability modeling concepts are introduced. Reliability models based on four types of reliability evaluation techniques (terminal, multiterminal, task-based, and network reliability) are examined. The status of research efforts on performance-related dependability is discussed, and the models' effectiveness is illustrated with a few numerical examples. A brief survey of software packages for dependability computation is included.
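Terminal (two-terminal) reliability, the first of the four evaluation techniques listed, is the probability that a designated source node can still reach a designated target node when each link fails independently. A brute-force sketch by state enumeration, illustrative only since it is exponential in the number of links (real tools use factoring or BDD-based methods):

```python
from itertools import product

def terminal_reliability(edges, p, s, t):
    """Exact two-terminal reliability by enumerating all link states:
    the probability that node s can reach node t when each undirected
    link works independently with probability p."""
    def connected(up):
        # depth-first search over the currently working links
        seen, stack = {s}, [s]
        while stack:
            u = stack.pop()
            for a, b in up:
                if a == u and b not in seen:
                    seen.add(b); stack.append(b)
                elif b == u and a not in seen:
                    seen.add(a); stack.append(a)
        return t in seen
    rel = 0.0
    for states in product([True, False], repeat=len(edges)):
        up = [e for e, ok in zip(edges, states) if ok]
        prob = 1.0
        for ok in states:
            prob *= p if ok else (1 - p)
        if connected(up):
            rel += prob
    return rel
```

Two parallel links with p = 0.9 give 1 - 0.1^2 = 0.99, while the same two links in series give 0.9^2 = 0.81, the textbook redundancy contrast.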

10.
Our article illustrates how to compare the outputs from models that simulate transitions among categories through time. We illustrate the concepts by comparing two land change models: Land Change Modeler and Cellular Automata Markov. We show how the modeling options influence the quantity and allocation of simulated transitions, and how to compare output maps from pairs of model runs with respect to a reference map of transitions during the validation interval. We recommend that the first step is to assess the quantity of each transition and to determine the cause of the variation in quantity among model runs. The second step is to assess the allocation of transitions and to determine the cause of the variation in allocation among model runs. The separation of quantity and allocation of the transitions is a helpful approach to communicate how models work and to describe pattern validation.

11.
Cellular Automata (CA) models are widely used to study the spatial dynamics of urban growth and evolving patterns of land use. One complication across CA approaches is the relatively short period of data available for calibration, providing sparse information on patterns of change and presenting problematic signal-to-noise ratios. To overcome the problem of short-term calibration, this study investigates a novel approach in which the model is calibrated based on the urban morphological patterns that emerge from a simulation starting from urban genesis, i.e., a land cover map completely devoid of urban land. The application of the model then uses the calibrated parameters to simulate urban growth forward in time from a known urban configuration. This approach to calibration is embedded in a new framework for the calibration and validation of a Constrained Cellular Automata (CCA) model of urban growth. The investigated model uses just four parameters to reflect processes of spatial agglomeration and preservation of scarce non-urban land at multiple spatial scales, and makes no use of ancillary layers such as zoning, accessibility, and physical suitability. As there are no anchor points that guide urban growth to specific locations, the parameter estimation uses a goodness-of-fit (GOF) measure, inspired by the literature on fractal urban form, that compares the built density distribution. The model calibration is a novel application of Markov Chain Monte Carlo Approximate Bayesian Computation (MCMC-ABC). This method provides an empirical distribution of parameter values that reflects model uncertainty. The validation uses multiple samples from the estimated parameters to quantify the propagation of model uncertainty to the validation measures. The framework is applied to two UK towns (Oxford and Swindon). The results, including cross-application of parameters, show that the models effectively capture the different urban growth patterns of both towns: for Oxford, the CCA correctly produces the pattern of scattered growth in the periphery, and for Swindon, the pattern of compact, concentric growth. The ability to identify different modes of growth has both theoretical and practical significance. Existing land use patterns can be an important indicator of future trajectories, and planners can be provided with insight into alternative future trajectories, the available decision space, and the cumulative effect of parcel-by-parcel planning decisions.
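MCMC-ABC belongs to the family of likelihood-free inference methods. Its simplest relative, rejection ABC, conveys the core idea: draw parameters from the prior, simulate, and keep the draws whose summary statistic lands close to the observed one. A toy sketch (all names hypothetical; the paper uses the MCMC variant with urban-morphology summaries, not this Gaussian example):

```python
import random

def abc_rejection(simulate, summary, observed, prior_sample,
                  n_draws=2000, tol=0.5, seed=0):
    """Rejection-sampling sketch of Approximate Bayesian Computation:
    keep prior draws whose simulated summary statistic falls within
    `tol` of the observed summary. Returns an empirical sample that
    approximates the posterior without ever evaluating a likelihood."""
    rng = random.Random(seed)
    obs = summary(observed)
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample(rng)
        if abs(summary(simulate(theta, rng)) - obs) <= tol:
            accepted.append(theta)
    return accepted

# Toy usage: recover the mean of Gaussian observations.
def mean(xs):
    return sum(xs) / len(xs)

observed = [2.0] * 20
accepted = abc_rejection(
    simulate=lambda th, rng: [rng.gauss(th, 1.0) for _ in range(20)],
    summary=mean,
    observed=observed,
    prior_sample=lambda rng: rng.uniform(-5.0, 5.0),
    tol=0.2,
)
```

The accepted draws form an empirical parameter distribution, which is exactly what the abstract means by a posterior "that reflects model uncertainty" and can be resampled during validation.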

12.
The environmental modeling community has historically been concerned with the proliferation of models and the effort associated with collective model development tasks (e.g., code generation, data transformation, etc.). Environmental modeling frameworks (EMFs) have been developed to address this problem, but much work remains before EMFs are adopted as mainstream modeling tools. Environmental model development requires both scientific understanding of environmental phenomena and software developer proficiency. EMFs support the modeling process by streamlining model code development, allowing seamless access to data, and supporting data analysis and visualization. EMFs also support aggregation of model components into functional units, component interaction and communication, temporal-spatial stepping, scaling of spatial data, multi-threading/multi-processor support, and cross-language interoperability. Some EMFs additionally focus on high-performance computing and are tailored for particular modeling domains such as ecosystem, socio-economic, or climate change research. The Object Modeling System Version 3 (OMS3) EMF employs new advances in software framework design to better support the environmental model development process. This paper discusses key EMF design goals/constraints and addresses the software engineering aspects that have made OMS3 framework development efficacious and its application practical, as demonstrated by leveraging software engineering efforts outside of the modeling community and lessons learned from over a decade of EMF development. Software engineering approaches employed in OMS3 are highlighted, including a non-invasive lightweight framework design supporting component-based model development, use of implicit parallelism in system design, use of domain-specific language design patterns, and cloud-based support for computational scalability. The key advancements in EMF design presented herein may be applicable and beneficial for other EMF developers seeking to better support environmental model development through improved framework design.

13.
When examining the relationship between landscape characteristics and water quality, most previous studies have paid insufficient attention to the spatial aspects of landscape characteristics and of water quality sampling stations. We analyzed the spatial pattern of total nitrogen (TN), total phosphorus (TP), chemical oxygen demand (COD), and suspended solids (SS) in the Han River basin of South Korea to explore the role of different distance considerations and spatial statistical approaches in explaining the variation in water quality. Five-year (2012 through 2016) seasonal averages of these water quality attributes were used as the response variables, while explanatory variables such as land cover, elevation, slope, and hydrologic soil groups were subjected to different weighting treatments based on distance and flow accumulation. Moran's eigenvector-based spatial filters were used to represent spatial relations among water quality sampling sites and were included in the regression models. Distinct spatial patterns of seasonal water quality exist, with the highest concentrations of TN, TP, COD, and SS in downstream urban areas and the lowest concentrations in upstream forest areas. TN concentrations are higher in the dry winter season than in the wet summer season, while the reverse holds for SS. Spatial models substantially improved the model fit compared to aspatial models. The flow accumulation-based models performed best when the spatial filters were not used, but all models performed similarly when spatial filters were used. The distance weighting approaches were instrumental in understanding the watershed-level processes affecting the source, mobilization, and delivery of physicochemical parameters that reach the river. We conclude that considering the spatial aspects of sampling sites is as important as accounting for different distances and hydrological processes in modeling water quality.
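The distance-weighting idea, that land cover nearer the monitoring station should influence measured water quality more than distant land cover, can be sketched as an inverse-distance-weighted class share. This is a simplification with hypothetical names; the study also uses flow-accumulation-based weights rather than straight-line distance alone:

```python
import math

def idw_landscape_share(cells, station, power=1.0):
    """Distance-weighted share of a target land cover class in a
    watershed: cells closer to the monitoring station count more than
    distant ones. `cells` is a list of (x, y, is_target_class) tuples
    and `station` is the (x, y) of the sampling site."""
    num = den = 0.0
    sx, sy = station
    for x, y, is_target in cells:
        d = math.hypot(x - sx, y - sy)
        w = 1.0 / (d ** power) if d > 0 else 1.0
        num += w * (1.0 if is_target else 0.0)
        den += w
    return num / den
```

With one urban cell at distance 1 and one forest cell at distance 4, the unweighted urban share is 0.5 but the distance-weighted share rises to 0.8, which is how the weighting changes the explanatory variables fed into the regressions.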

14.
Species’ potential distribution modelling is the process of building a representation of the fundamental ecological requirements for a species and extrapolating these requirements into a geographical region. The importance of being able to predict the distribution of species is currently highlighted by issues like global climate change, public health problems caused by disease vectors, anthropogenic impacts that can lead to massive species extinction, among other challenges. There are several computational approaches that can be used to generate potential distribution models, each achieving optimal results under different conditions. However, the existing software packages available for this purpose typically implement a single algorithm, and each software package presents a new learning curve to the user. Whenever new software is developed for species’ potential distribution modelling, significant duplication of effort results because many feature requirements are shared between the different packages. Additionally, data preparation and comparison between algorithms becomes difficult when using separate software applications, since each application has different data input and output capabilities. This paper describes a generic approach for building a single computing framework capable of handling different data formats and multiple algorithms that can be used in potential distribution modelling. The ideas described in this paper have been implemented in a free and open source software package called openModeller. The main concepts of species’ potential distribution modelling are also explained and an example use case illustrates potential distribution maps generated by the framework.

15.
The paper presents some contemporary approaches to spatial environmental data analysis. The main topics concern decision-oriented problems of environmental spatial data mining and modeling: valorization and representativity of data with the help of exploratory data analysis, spatial predictions, probabilistic and risk mapping, and the development and application of conditional stochastic simulation models. The innovative part of the paper presents an integrated/hybrid model: machine learning (ML) residuals sequential simulations (MLRSS). The model is based on multilayer perceptron and support vector regression ML algorithms for modeling long-range spatial trends, combined with sequential simulation of the residuals. ML algorithms deliver non-linear solutions for spatially non-stationary problems, which are difficult for the geostatistical approach. Geostatistical tools (variography) are used to characterize the performance of the ML algorithms by analyzing the quality and quantity of the spatially structured information they extract from the data. Sequential simulations provide an efficient assessment of uncertainty and spatial variability. A case study of the Chernobyl fallout illustrates the performance of the proposed model. It is shown that probability mapping, provided by the combination of ML data-driven and geostatistical model-based approaches, can be used efficiently in the decision-making process.
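The two-stage structure of MLRSS, a learned large-scale trend plus simulation of the residuals around it, can be caricatured with ordinary least squares standing in for the MLP/SVR trend learners and an i.i.d. bootstrap standing in for sequential geostatistical simulation. This is a deliberate simplification with hypothetical names: real sequential simulation honors the residual variogram rather than resampling independently.

```python
import random
import statistics

def trend_plus_residual_sims(x, y, n_sims=100, seed=0):
    """Two-stage sketch of the MLRSS idea: (1) fit a trend model to
    the data (ordinary least squares here), (2) simulate residuals
    around that trend (i.i.d. bootstrap here) to obtain an ensemble
    that carries the uncertainty left over after detrending."""
    mx, my = statistics.mean(x), statistics.mean(y)
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    trend = [a + b * xi for xi in x]
    resid = [yi - ti for yi, ti in zip(y, trend)]
    rng = random.Random(seed)
    sims = [[t + rng.choice(resid) for t in trend] for _ in range(n_sims)]
    return trend, sims
```

The spread of the simulated ensemble at each location is the uncertainty assessment; in the paper, the probability maps come from summarizing such an ensemble.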

16.
Optimal Simulation Scale Selection and Spatial Pattern Simulation of Land Use
Land use change is a dynamic process shaped by multiple interacting factors and has become an important topic in global environmental change and sustainable development research; simulating regional land use spatial patterns is a key component of LUCC studies. Based on interpreted TM remote sensing imagery from 2000 and 2010, together with a digital elevation model and data on river systems, railways, roads, precipitation, and temperature, a binary logistic regression model was used to select the optimal simulation scale for land use in the Loess Plateau tableland region, and the spatial patterns of the various land use types were then simulated on that basis. The results show: (1) Across the ten spatial scales tested for land use pattern simulation, a degree of scale dependence exists between the spatial pattern of land use change and its driving factors. (2) The ROC values for cropland, forest, and grassland first increase and then decrease across the ten scales, with a turning point near 400 m, indicating that under scale effects and scale conversion, 400 m × 400 m is the optimal simulation scale for land use pattern optimization in this region. (3) At the optimal 400 m scale, the simulated distributions of grassland and forest are both significantly correlated with per-capita GDP and a composite terrain index, while the composite terrain index is the variable that most strongly influences the distribution of cropland.
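The ROC value used to pick the optimal scale is the area under the ROC curve, which equals the Mann-Whitney probability that a randomly chosen changed cell receives a higher predicted suitability than a randomly chosen unchanged cell. A dependency-free sketch (function name is mine):

```python
def roc_auc(labels, scores):
    """ROC AUC via the Mann-Whitney statistic: the probability that a
    randomly chosen positive case (label 1) scores higher than a
    randomly chosen negative case (label 0), with ties counted as
    half a win. This is the criterion compared across candidate
    grid resolutions to find the best simulation scale."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

Computing this at each resampled resolution and locating the peak reproduces the study's "rise then fall with a turning point near 400 m" selection logic.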

17.
The POEMS project is creating an environment for end-to-end performance modeling of complex parallel and distributed systems, spanning the domains of application software, runtime and operating system software, and hardware architecture. Toward this end, the POEMS framework supports composition of component models from these different domains into an end-to-end system model. This composition can be specified using a generalized graph model of a parallel system, together with interface specifications that carry information about component behaviors and evaluation methods. The POEMS Specification Language compiler will generate an end-to-end system model automatically from such a specification. The components of the target system may be modeled using different modeling paradigms and at various levels of detail. Therefore, evaluation of a POEMS end-to-end system model may require a variety of evaluation tools including specialized equation solvers, queuing network solvers, and discrete event simulators. A single application representation based on static and dynamic task graphs serves as a common workload representation for all these modeling approaches. Sophisticated parallelizing compiler techniques allow this representation to be generated automatically for a given parallel program. POEMS includes a library of predefined analytical and simulation component models of the different domains and a knowledge base that describes performance properties of widely used algorithms. The paper provides an overview of the POEMS methodology and illustrates several of its key components. The modeling capabilities are demonstrated by predicting the performance of alternative configurations of Sweep3D, a benchmark for evaluating wavefront application technologies and high-performance, parallel architectures.

18.
Interoperability of software is a critical requirement in the architecture, engineering, and construction (AEC) industry, where a number of data exchange standards have been created to enable data exchange among different software packages. To comply with existing data exchange standards, software developers need to match their internal data schemas to the schema defined in a standard, and vice versa. The process of matching two large-scale data models is time-consuming and cumbersome when performed manually, and becomes even more challenging when a source and/or target model is updated frequently to meet ever-expanding real-world requirements. While several prior studies discussed the need for automated or semi-automated schema matching, an approach that builds on existing matches between two models has rarely been studied. In this paper, we present a semi-automated approach for model matching. This approach leverages a given set of existing matches between two models and upgrades those matches when a new version of a target model is released. The paper describes in detail a list of upgrade patterns generated and validated through a prototype by matching a domain-specific data model to several recent releases of the Industry Foundation Classes.

19.
To address the bottleneck in synergistically applying optical and radar remote sensing data, which have different imaging mechanisms, to land surface information extraction, a collaborative optical-radar classification method based on terrain information is proposed. First, DEM terrain information is extracted from Radarsat-2 data using InSAR techniques; next, different feature-set model inputs are constructed from Landsat optical data and Radarsat-2 radar data based on the terrain information; finally, Random Forest (RF), Support Vector Machine (SVM), and Decision Tree (DT) classification models are built with randomly selected samples to extract land surface information. The results show: (1) Among the feature-combination strategies, with 10% of samples randomly selected for training, the DEM derived from Radarsat-2 interferometry combined with the Landsat dataset achieved higher extraction accuracy than the strategy combining the ASTER GDEM with optical imagery. (2) Evaluating classification accuracy over models built from 50 random draws of training samples confirmed that the RF algorithm is more robust and more accurate than the DT and SVM algorithms. The study makes full use of the complementary strengths of optical and radar remote sensing, offering a new approach to collaborative optical-radar land surface information extraction.
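The evaluation protocol of drawing 50 random training samples and scoring each fitted model on the held-out cells can be sketched generically; any classifier (RF, SVM, or DT) plugs in through the fit/predict callables. All names below are hypothetical, and the sketch uses only the standard library:

```python
import random

def repeated_holdout(fit, predict, X, y, n_runs=50, train_frac=0.1, seed=0):
    """The study's evaluation protocol: repeatedly draw a small random
    training sample (10% here), fit a classifier, and score it on the
    remaining samples, so that accuracy comparisons between algorithms
    reflect sampling variability rather than one lucky split."""
    rng = random.Random(seed)
    n = len(y)
    k = max(1, int(n * train_frac))
    accs = []
    for _ in range(n_runs):
        idx = list(range(n))
        rng.shuffle(idx)
        tr, te = idx[:k], idx[k:]
        model = fit([X[i] for i in tr], [y[i] for i in tr])
        correct = sum(predict(model, X[i]) == y[i] for i in te)
        accs.append(correct / len(te))
    return sum(accs) / len(accs), accs
```

Running this once per algorithm and comparing the distributions of the 50 accuracies (not just the means) is what supports the robustness claim for RF.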



Copyright©北京勤云科技发展有限公司  京ICP备09084417号