Similar Documents
20 similar documents found (search time: 31 ms)
1.
Current air quality models generate deterministic forecasts by assuming a perfect model, perfectly known parameters, and exact input data. However, our knowledge of the physics is imperfect. It is of interest to extend the deterministic simulation results with “error bars” that quantify the degree of uncertainty, and to analyze the impact of input uncertainty on the simulation results. This added information provides a confidence level for the forecast results. The Monte Carlo (MC) method is a popular approach for air quality model uncertainty analysis, but it converges slowly. This work discusses the polynomial chaos (PC) method, which is more suitable for uncertainty quantification (UQ) in large-scale models. We propose a new approach for uncertainty apportionment (UA), i.e., we develop a PC approach to attribute the uncertainties in model results to different uncertain inputs. The UQ and UA techniques are implemented in the Sulfur Transport Eulerian Model (STEM-III). A typical scenario of air pollution in the northeast region of the USA is considered. The UQ and UA results allow us to assess the combined effects of different input uncertainties on the forecast uncertainty. They also make it possible to quantify the contribution of input uncertainties to the uncertainty in the predicted ozone and PAN concentrations.
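The PC-versus-MC contrast above can be illustrated on a toy model (not STEM-III): a minimal non-intrusive PC projection of a scalar model g(X) of one standard normal input onto Hermite polynomials, using a 3-point Gauss-Hermite rule. The model g and all numbers are illustrative assumptions; a 3-term expansion recovers the mean and variance exactly for this polynomial g, while plain MC still carries sampling noise after 200,000 runs.

```python
import math
import random

# 3-point Gauss-Hermite rule for a standard normal variable
# (exact for polynomial integrands up to degree 5)
NODES = [-math.sqrt(3.0), 0.0, math.sqrt(3.0)]
WEIGHTS = [1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0]

def hermite(n, x):
    """Probabilists' Hermite polynomials He_0, He_1, He_2."""
    return [1.0, x, x * x - 1.0][n]

def pc_coefficients(g, order=2):
    """Project g(X), X ~ N(0,1), onto He_0..He_order by quadrature."""
    norms = [1.0, 1.0, 2.0]                     # E[He_n(X)^2]
    coeffs = []
    for n in range(order + 1):
        num = sum(w * g(x) * hermite(n, x) for x, w in zip(NODES, WEIGHTS))
        coeffs.append(num / norms[n])
    return coeffs

def g(x):
    """Toy 'model': output as a function of one uncertain input."""
    return x * x + 0.5 * x

c = pc_coefficients(g)
pc_mean = c[0]                                  # mean = zeroth coefficient
pc_var = c[1] ** 2 * 1.0 + c[2] ** 2 * 2.0      # sum of c_n^2 * E[He_n^2], n >= 1

random.seed(0)
mc = [g(random.gauss(0.0, 1.0)) for _ in range(200_000)]
mc_mean = sum(mc) / len(mc)

print(pc_mean, pc_var, mc_mean)   # PC: mean 1.0, variance 2.25; MC mean is only approximate
```

Three model evaluations suffice here, versus hundreds of thousands for MC; that gap is the motivation for PC in large-scale models.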

2.
Reliability-based design optimization (RBDO) has been widely used to design engineering products that minimize a cost function while meeting reliability constraints. Although uncertainties, such as aleatory uncertainty and epistemic uncertainty, have been well considered in RBDO, they are mainly considered for model input parameters. Model uncertainty, i.e., the uncertainty of model bias indicating the inherent model inadequacy in representing the real physical system, is typically overlooked in RBDO. This paper addresses model uncertainty approximation in a product design space and further integrates the model uncertainty into RBDO. In particular, a copula-based bias modeling approach is proposed and results are demonstrated on two vehicle design problems.
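The effect of ignoring model bias on a reliability estimate can be sketched with a deliberately simplified stand-in: a Gaussian bias term fitted to hypothetical model-versus-test discrepancy data, instead of the paper's copula-based model. The limit state, input distribution, and discrepancy values are all invented for illustration.

```python
import random
import statistics

random.seed(1)

def g_model(d, x):
    """Simplified limit state: failure when g < 0 (d: design, x: random input)."""
    return d - x

# Hypothetical discrepancy data: model response minus observed response
discrepancy = [0.12, 0.05, 0.20, 0.08, 0.15]
bias_mu = statistics.mean(discrepancy)
bias_sd = statistics.stdev(discrepancy)

def failure_prob(d, n=100_000, include_bias=True):
    fails = 0
    for _ in range(n):
        x = random.gauss(2.0, 0.5)          # aleatory input uncertainty
        g = g_model(d, x)
        if include_bias:                    # epistemic model-bias correction
            g -= random.gauss(bias_mu, bias_sd)
        if g < 0.0:
            fails += 1
    return fails / n

pf_plain = failure_prob(3.0, include_bias=False)
pf_bias = failure_prob(3.0, include_bias=True)
print(pf_plain, pf_bias)   # ignoring the bias understates the failure probability
```

Even this crude additive-bias sketch shows the qualitative point: a design judged reliable by the raw model can violate its reliability target once model inadequacy is accounted for.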

3.
The Sea-Level Affecting Marshes Model was applied to coastal New York State at a 5 m horizontal resolution to investigate marsh conservation and potential migration under multiple sea-level rise scenarios. Feedbacks between sea-level rise and marsh accretion rates based on mechanistic modeling were included. Simulation results predict extensive marsh losses in microtidal regimes behind the barrier islands of Long Island, vulnerable dry lands on barrier islands, and opportunities for upland migration of coastal marshes. Results also indicate changes in the composition of marsh types. Confidence of predictions due to model parameter variabilities and spatial data error was estimated with the uncertainty estimation module, and likelihood maps of land cover changes were produced. Uncertainty results suggest that variability in land cover projections is mostly due to the wide range in potential sea-level rise signals by 2100, while the impact of uncertainties in model parameters, spatial data errors and linked models is less significant.

4.
A Model Management Approach Based on Non-monotonic Logic   (Total citations: 2; self-citations: 0; citations by others: 2)
Lan Hongbing, Fei Qi. Acta Automatica Sinica (《自动化学报》), 1992, 18(4): 414-420
This paper discusses the representation, propagation, and evidential combination of uncertainty in model management, as well as the associated problem-solving process, and proposes a model management approach based on non-monotonic logic. The uncertainty in a model's structural form is represented as plausibility propositions supported by sets of assumptions that the modeler or domain experts make about unknown or random aspects of the problem structure. Uncertainty relationships among models are managed through truth (consistency) maintenance and belief adjustment over the assumption environment, driven either by conflicts that arise during problem solving or by relevant propositions and secondary judgments supplied by the decision maker.

5.
Land cover change (LCC) can have a significant impact on human and environmental well-being. LCC maps derived from historical remote sensing (RS) images are often used to evaluate the impacts of past LC changes and to construct models to predict future LC changes. Free moderate spatial resolution (~ 30 m) optical and synthetic aperture radar (SAR) RS imagery is now becoming increasingly available for this LCC monitoring. However, the classification algorithms used to extract LC information from these images typically require “training data” for classification (i.e. points or polygons with LC class labels), and acquiring this labelled training data can be difficult and time-consuming. Alternatively, crowdsourced geographic data (CGD) has become widely available from online sources like OpenStreetMap (OSM), and it may provide a useful source of training data for LCC monitoring. A major challenge with utilizing CGD for LCC mapping, however, is the presence of class labelling errors, and these errors can vary spatially (e.g. due to differing levels of CGD contributor expertise) and temporally (e.g. due to time lag between CGD creation and RS imagery acquisition). In this study, we investigated a new LCC mapping method which utilizes free Landsat (optical) and PALSAR mosaic (SAR) satellite imagery in combination with labelled LC data extracted from CGD sources (the OSM “landuse” and “natural” polygon datasets). A semi-unsupervised classification approach was employed for the LCC mapping to reduce the effects of class label noise in the CGD. The main motivation and benefit of the proposed method is that it does not require training data to be manually collected, allowing for a faster and more automated assessment of LCC. As a case study, we applied the method to map LCC in the Laguna de Bay area of the Philippines over the 2007–2015 period. 
The LCC map produced using our proposed approach achieved an overall classification accuracy of 90.2%, providing evidence that CGD and multi-temporal/multi-sensor satellite imagery, when combined, have great potential for LCC monitoring.

6.
Neural network (NN) techniques have proved successful for many regression problems, in particular for remote sensing; however, uncertainty estimates are rarely provided. In this article, a Bayesian technique to evaluate uncertainties of the NN parameters (i.e., synaptic weights) is first presented. In contrast to more traditional approaches based on point estimation of the NN weights, we assess uncertainties on such estimates to monitor the robustness of the NN model. These theoretical developments are illustrated by applying them to the problem of retrieving surface skin temperature, microwave surface emissivities, and integrated water vapor content from a combined analysis of satellite microwave and infrared observations over land. The weight uncertainty estimates are then used to compute analytically the uncertainties in the network outputs (i.e., error bars and correlation structure of these errors). Such quantities are very important for evaluating any application of an NN model. The uncertainties on the NN Jacobians are then considered in the third part of this article. When used for regression fitting, NN models can effectively represent highly nonlinear, multivariate functions. In this situation, most emphasis is put on estimating the output errors, but almost no attention has been given to errors associated with the internal structure of the regression model. The complex structure of dependency inside the NN is the essence of the model, and assessing its quality, coherency, and physical character makes all the difference between a blackbox model with small output errors and a reliable, robust, and physically coherent model. Such dependency structures are described to the first order by the NN Jacobians: they indicate the sensitivity of one output with respect to the inputs of the model for given input data. We use a Monte Carlo integration procedure to estimate the robustness of the NN Jacobians.
A regularization strategy based on principal component analysis is proposed to suppress the multicollinearities in order to make these Jacobians robust and physically meaningful.
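The idea of Monte Carlo over weight uncertainty can be sketched with a tiny fixed network (not the retrieval network of the article): estimate the Jacobian dy/dx by finite differences, then perturb the weights to see how much the sensitivity itself varies. The architecture, weights, and perturbation scale are all illustrative assumptions.

```python
import math
import random

random.seed(2)

# Tiny fixed one-hidden-layer network: y = w2 . tanh(W1*x + b1)
W1 = [0.8, -0.5]
b1 = [0.1, 0.2]
w2 = [1.2, 0.7]

def net(x, W1=W1, b1=b1, w2=w2):
    return sum(w * math.tanh(wi * x + bi) for w, wi, bi in zip(w2, W1, b1))

def jacobian(x, eps=1e-6, **kw):
    """Central finite-difference estimate of dy/dx."""
    return (net(x + eps, **kw) - net(x - eps, **kw)) / (2 * eps)

# Monte Carlo over weight uncertainty: perturb weights, collect Jacobians
x0 = 0.5
jacs = []
for _ in range(2000):
    W1p = [w + random.gauss(0.0, 0.05) for w in W1]
    w2p = [w + random.gauss(0.0, 0.05) for w in w2]
    jacs.append(jacobian(x0, W1=W1p, w2=w2p))

mean_j = sum(jacs) / len(jacs)
sd_j = (sum((j - mean_j) ** 2 for j in jacs) / (len(jacs) - 1)) ** 0.5
print(round(mean_j, 3), round(sd_j, 3))   # sensitivity estimate with an "error bar"
```

The spread sd_j is the kind of Jacobian robustness measure the abstract refers to: a large spread relative to the mean would flag a sensitivity that the data do not really constrain.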

7.
In this paper, we develop a sliding mode control architecture to control lung volume and minute ventilation in the presence of system modelling uncertainties. Since the applied input pressure to the lungs is, in general, nonnegative and must not be so large as to damage the lungs, a sliding mode control with bounded nonnegative control inputs is proposed. The controller only uses output information (i.e., the total volume of the lungs) and automatically adjusts the applied input pressure so that the system is able to track a given reference signal in the presence of parameter uncertainty (i.e., modelling uncertainty of the lung resistances and lung compliances) and system disturbances. Controllers for both matched and unmatched uncertainties are presented. Specifically, a Lyapunov-based approach is presented for the stability analysis of the system, and the proposed control framework is applied to a two-compartment lung model to show the efficacy of the proposed control method.
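A minimal sketch of the idea, on a single-compartment lung model rather than the paper's two-compartment one: the controller knows only nominal (wrong) resistance and compliance values, applies a clamped nonnegative pressure built from an equivalent-control term plus a saturated switching term, and still tracks the volume reference. All parameter values below are illustrative assumptions.

```python
import math

# Single-compartment lung model: R * dV/dt + V/C = p(t)
R_true, C_true = 2.0, 0.05      # actual (uncertain) resistance and compliance
R_nom, C_nom = 2.5, 0.04        # nominal values available to the controller
P_MAX = 30.0                    # input pressure bounds: 0 <= p <= P_MAX

def sat(x, width=0.05):
    """Saturated switching term (boundary layer limits chattering)."""
    return max(-1.0, min(1.0, x / width))

dt, V, log = 1e-3, 0.0, []
for i in range(5000):
    t = i * dt
    V_ref = 0.5 * (1.0 - math.exp(-t))        # desired volume trajectory
    dVref = 0.5 * math.exp(-t)
    s = V_ref - V                             # sliding surface
    # equivalent control from the nominal model + switching term, then clamp
    p = R_nom * dVref + V / C_nom + 15.0 * sat(s)
    p = max(0.0, min(P_MAX, p))               # nonnegative, bounded input
    V += dt * (p - V / C_true) / R_true       # true plant dynamics (Euler step)
    log.append(abs(s))

print(round(V, 3), round(max(log[2000:]), 4))  # tracks despite R, C mismatch
```

The tracking error stays inside the boundary layer even though the plant parameters differ from the nominal ones by 20-25%, which is the robustness property sliding mode control is used for here.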

8.
There are two main issues of concern for land change scientists to consider. First, selecting appropriate and independent land cover change (LCC) drivers is a substantial challenge because these drivers usually correlate with each other. For this reason, we used a well-known machine learning tool called the genetic algorithm (GA) to select the optimum LCC drivers. In addition, choosing the best or most appropriate LCC model to discover non-linear patterns within land use data is critical, since some models are limited to a specific functional form. In this study, support vector regression (SVR) was implemented to model LCC, as SVRs use various linear and non-linear kernels to better identify non-linear patterns within land use data. With such an approach, choosing the appropriate kernel to model LCC is critical because SVR kernels have a direct impact on the accuracy of the model. Therefore, various linear and non-linear kernels, including radial basis function (RBF), sigmoid (SIG), polynomial (PL) and linear (LN) kernels, were used across two phases: 1) in combination with GA, and 2) without GA present. The simulated maps resulting from each combination were evaluated using a recently modified version of the receiver operating characteristic (ROC) tool called the total operating characteristic (TOC) tool. The proposed approach was applied to simulate urban growth in Rasht County, which is located in the north of Iran. As a result, the SVR-GA-RBF model achieved the highest area under the curve (AUC) value at 94%, while the lowest AUC was achieved when using the SVR-LN model at 71%. The results show that the synergy between GA and SVR can effectively optimize the variable selection process used when developing an LCC model, and can enhance the predictive accuracy of SVR.
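The AUC figures that separate the kernel models above can be computed without any curve plotting, via the rank-sum identity; the sketch below uses plain ROC AUC (the TOC tool additionally keeps absolute counts) and invented change/no-change scores.

```python
def roc_auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity:
    AUC = P(score of a random positive > score of a random negative)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical change scores from two kernel models for the same four cells
labels = [1, 1, 0, 0]                            # 1 = urban growth, 0 = no change
auc_rbf = roc_auc([0.9, 0.8, 0.4, 0.3], labels)  # well-separated scores
auc_lin = roc_auc([0.9, 0.4, 0.8, 0.3], labels)  # one inverted pair
print(auc_rbf, auc_lin)   # 1.0 and 0.75
```

Ranking models by this statistic is exactly how the 94% (SVR-GA-RBF) versus 71% (SVR-LN) comparison in the abstract is made, just at map scale.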

9.
Cellular Automata (CA) models are widely used to study spatial dynamics of urban growth and evolving patterns of land use. One complication across CA approaches is the relatively short period of data available for calibration, providing sparse information on patterns of change and presenting problematic signal-to-noise ratios. To overcome the problem of short-term calibration, this study investigates a novel approach in which the model is calibrated based on the urban morphological patterns that emerge from a simulation starting from urban genesis, i.e., a land cover map completely void of urban land. The application of the model uses the calibrated parameters to simulate urban growth forward in time from a known urban configuration. This approach to calibration is embedded in a new framework for the calibration and validation of a Constrained Cellular Automata (CCA) model of urban growth. The investigated model uses just four parameters to reflect processes of spatial agglomeration and preservation of scarce non-urban land at multiple spatial scales and makes no use of ancillary layers such as zoning, accessibility, and physical suitability. As there are no anchor points that guide urban growth to specific locations, the parameter estimation uses a goodness-of-fit (GOF) measure that compares the built density distribution, inspired by the literature on fractal urban form. The model calibration is a novel application of Markov Chain Monte Carlo Approximate Bayesian Computation (MCMC-ABC). This method provides an empirical distribution of parameter values that reflects model uncertainty. The validation uses multiple samples from the estimated parameters to quantify the propagation of model uncertainty to the validation measures. The framework is applied to two UK towns (Oxford and Swindon). The results, including cross-application of parameters, show that the models effectively capture the different urban growth patterns of both towns.
For Oxford, the CCA correctly produces the pattern of scattered growth in the periphery, and for Swindon, the pattern of compact, concentric growth. The ability to identify different modes of growth has both theoretical and practical significance. Existing land use patterns can be an important indicator of future trajectories. Planners can be provided with insight into alternative future trajectories, the available decision space, and the cumulative effect of parcel-by-parcel planning decisions.
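The ABC idea behind the calibration can be sketched in its simplest, rejection form (the study uses the MCMC variant): draw a parameter from the prior, run a cheap surrogate simulator, and keep the draw if a summary statistic lands close to the observed one. The surrogate model, the observed density, and the tolerance below are all invented for illustration.

```python
import random

random.seed(6)

# Observed summary statistic (hypothetical built-density value)
observed_density = 0.42

def simulate_density(p, n=1000):
    """Toy urban-growth surrogate: each cell urbanizes with prob p * factor.
    The 0.85 'agglomeration factor' is a stand-in, not from the paper."""
    urban = sum(1 for _ in range(n) if random.random() < p * 0.85)
    return urban / n

accepted = []
for _ in range(3000):
    p = random.uniform(0.0, 1.0)                 # draw from the prior
    if abs(simulate_density(p) - observed_density) < 0.02:   # tolerance check
        accepted.append(p)

post_mean = sum(accepted) / len(accepted)
print(round(post_mean, 3), len(accepted))   # empirical posterior for p
```

The accepted draws form the empirical parameter distribution the abstract mentions; sampling model runs from it (rather than from a single best-fit parameter) is what propagates calibration uncertainty into the validation measures.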

10.
In reliability analysis, uncertainties in the input variables as well as in the metamodel are often encountered in practice. The input uncertainty includes the statistical uncertainty of the distribution parameters due to lack of knowledge or insufficient data. Metamodel uncertainty arises when the response function is approximated by a surrogate function using a finite number of responses to reduce the costly computations. In this study, a reliability analysis procedure is proposed based on a Bayesian framework that can incorporate these uncertainties in an integrated manner in the form of a posterior PDF. The PDF, often expressed by an arbitrary function, is evaluated via the Markov Chain Monte Carlo (MCMC) method, an efficient simulation method for drawing random samples that follow the distribution. In order to avoid the nested computation in the full Bayesian approach, a posterior predictive approach is employed, which requires only a single loop of reliability analysis. A Gaussian process model is employed for the metamodel. Mathematical and engineering examples are used to demonstrate the proposed method. In the results, compared with the full Bayesian approach, the predictive approach provides much less information, i.e., only a point estimate of the probability. Nevertheless, the predictive approach adequately accounts for the uncertainties with much less computation, which is more advantageous in design practice. The less data provided, the higher the statistical uncertainty, leading to a higher (or lower) estimated failure probability (or reliability).
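The full-Bayesian-versus-predictive trade-off can be seen in a toy conjugate setting (no MCMC machinery needed here, unlike the arbitrary posteriors the paper targets): with Bernoulli failure data and a Beta prior, the full posterior keeps a whole distribution over the failure probability, while the predictive view collapses it to one number. The data counts below are invented.

```python
import random

random.seed(3)

# Toy reliability data: 2 failures in 20 demands; Beta(1, 1) prior
failures, trials = 2, 20
a, b = 1 + failures, 1 + (trials - failures)

# Full Bayesian view: keep the whole posterior over the failure probability
samples = sorted(random.betavariate(a, b) for _ in range(50_000))
lo, hi = samples[1_250], samples[48_750]      # ~95% credible interval

# Posterior predictive view: a single point estimate (the posterior mean)
predictive = a / (a + b)

print(round(predictive, 3), round(lo, 3), round(hi, 3))
```

The interval (lo, hi) is the information the predictive point estimate discards; the abstract's point is that for design decisions the point estimate often suffices, at a fraction of the computation.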

11.
In Part 1 of this two-part paper, we bounded the centroid of a symmetric interval type-2 fuzzy set (T2 FS), and consequently its uncertainty, using geometric properties of its footprint of uncertainty (FOU). We then used these bounds to solve forward problems, i.e., to go from parametric interval T2 FS models to data. The main purpose of the present paper is to formulate and solve inverse problems, i.e., to go from uncertain data to parametric interval T2 FS models, which we call type-2 fuzzistics. Given interval data collected from people about a phrase, and the inherent uncertainties associated with that data, which can be described statistically using the first- and second-order statistics of the end-point data, we establish parametric FOUs such that their uncertainty bounds are directly connected to statistical uncertainty bounds. These results should find applicability in computing with words.
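The first- and second-order end-point statistics the abstract relies on are just the means and standard deviations of the collected interval ends; a minimal sketch with an invented survey (intervals people might assign to the word "warm", in degrees Celsius) shows the statistical bands from which parametric FOU bounds are then set.

```python
import statistics

# Hypothetical survey: intervals five people assign to the word "warm" (deg C)
intervals = [(18, 27), (20, 30), (17, 26), (19, 29), (21, 28)]
left = [l for l, _ in intervals]
right = [r for _, r in intervals]

ml, sl = statistics.mean(left), statistics.stdev(left)
mr, sr = statistics.mean(right), statistics.stdev(right)

# First/second-order statistics of the end points give statistical
# uncertainty bands; parametric FOU bounds are tied to these bands
fou_left = (ml - sl, ml + sl)
fou_right = (mr - sr, mr + sr)
print(fou_left, fou_right)
```

Connecting the FOU parameters to these bands, rather than choosing them by hand, is what makes the resulting type-2 model traceable back to the data uncertainty.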

12.
In software reliability modeling, the parameters of the model are typically estimated from the test data of the corresponding component. However, the widely used point estimators are subject to random variations in the data, resulting in uncertainties in these estimated parameters. Ignoring the parameter uncertainty can result in grossly underestimating the uncertainty in the total system reliability. This paper attempts to study and quantify the uncertainties in the software reliability modeling of a single component with correlated parameters and in a large system with numerous components. Another characteristic challenge in software testing and reliability is the lack of available failure data from a single test, which often makes modeling difficult. This lack of data poses a bigger challenge in the uncertainty analysis of the software reliability modeling. To overcome this challenge, this paper proposes utilizing experts' opinions and historical data from previous projects to complement the small number of observations to quantify the uncertainties. This is done by incorporating the maximum-entropy principle (MEP) into the Bayesian approach. This paper further considers the uncertainty analysis at the system level, which contains multiple components, each with its respective model/parameter/uncertainty, by using a Monte Carlo approach. Some examples with different modeling approaches (NHPP, Markov, graph theory) are illustrated to show the generality and effectiveness of the proposed approach. Furthermore, we illustrate how the proposed approach for considering the uncertainties in various components improves a large-scale system reliability model.
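How an expert prior complements sparse failure data can be sketched with a conjugate Gamma-Poisson update; this is a simplified stand-in for the paper's MEP-plus-Bayesian construction, and the prior and data values below are invented.

```python
# Expert opinion: roughly 0.5 failures/day, held with little confidence,
# encoded as a Gamma(a0, b0) prior on the failure rate (shape a0, rate b0);
# historical project data could be used to sharpen a0 and b0 further
a0, b0 = 1.0, 2.0

# Sparse test data: 3 failures observed over 10 days of testing
failures, days = 3, 10

# Conjugate update: posterior is Gamma(a0 + failures, b0 + days)
a1, b1 = a0 + failures, b0 + days
rate_mle = failures / days            # data-only point estimate
rate_bayes = a1 / b1                  # posterior mean blending both sources
rate_sd = a1 ** 0.5 / b1              # posterior spread = residual uncertainty
print(rate_mle, rate_bayes, round(rate_sd, 3))
```

The posterior mean sits between the expert's guess and the data estimate, and rate_sd quantifies exactly the parameter uncertainty that the abstract warns against ignoring when rolling components up to system reliability.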

13.
Within the past decade, several global land cover data sets derived from satellite observations have become available to the scientific community. They offer valuable information on the current state of the Earth's land surface. However, considerable disagreements among them and classification legends not primarily suited for specific applications such as carbon cycle model parameterizations pose significant challenges and uncertainties in the use of such data sets. This paper addresses the user community of global land cover products. We first review and compare several global land cover products, i.e. the Global Land Cover Characterization Database (GLCC), Global Land Cover 2000 (GLC2000), and the MODIS land cover product, and highlight individual strengths and weaknesses of mapping approaches. Our overall objective is to present a straightforward method that merges existing products into a desired classification legend. This process follows the idea of convergence of evidence and generates a ‘best-estimate’ data set using fuzzy agreement. We apply our method to develop a new joint 1-km global land cover product (SYNMAP) with improved characteristics for land cover parameterization of the carbon cycle models that reduces land cover uncertainties in carbon budget calculations. The overall advantage of the SYNMAP legend is that all classes are properly defined in terms of plant functional type mixtures, which can be remotely sensed and include the definitions of leaf type and longevity for each class with a tree component. SYNMAP is currently used for parameterization in a European model intercomparison initiative of three global vegetation models: BIOME-BGC, LPJ, and ORCHIDEE. Corroboration of SYNMAP against GLCC, GLC2000 and MODIS land cover products reveals improved agreement of SYNMAP with all other land cover products and therefore indicates the successful exploration of synergies between the different products.
However, given that we cannot provide extensive validation using reference data, we are unable to prove that SYNMAP is actually more accurate. SYNMAP is available on request from Martin Jung.

14.
Soil carbon (C) responds quickly, and feeds back significantly, to environmental changes such as climate warming and agricultural management. Soil C modelling is the only reasonable approach available for predicting soil C dynamics under future environmental changes, and soil C models are usually constrained by the average of observations. However, model constraining is sensitive to the observed data, and the consequence of using observed averages for C predictions has rarely been studied. Using long-term soil organic C datasets from an agricultural field experiment, we constrained a process-based model using either the average of observations or the variation in observations to predict soil C dynamics. We found that uncertainties in soil C predictions were masked when the uncertainties in observations were ignored (i.e., when the average of observations was used to constrain the model) and uncertainties in model parameterisation were not explicitly quantified. However, if uncertainties in model parameterisation were considered, further considering uncertainties in observations had a negligible effect on uncertainties in SOC predictions. The results suggest that uncertainties induced by model parameterisation are larger than those induced by observations. Precise observations representing the real spatial pattern of SOC in the studied domain, improved model structure, and a constrained parameter space will help reduce uncertainties in soil C predictions. The results also highlight areas on which future C model development and software implementations should focus to reliably infer soil C dynamics.

15.
This study explores the use of generalized polynomial chaos theory for modeling complex nonlinear multibody dynamic systems in the presence of parametric and external uncertainty. The polynomial chaos framework has been chosen because it offers an efficient computational approach for the large, nonlinear multibody models of engineering systems of interest, where the number of uncertain parameters is relatively small, while the magnitude of uncertainties can be very large (e.g., vehicle-soil interaction). The proposed methodology allows the quantification of uncertainty distributions in both time and frequency domains, and enables simulations of multibody systems to produce results with “error bars”. The first part of this study presents the theoretical and computational aspects of the polynomial chaos methodology. Both unconstrained and constrained formulations of multibody dynamics are considered. Direct stochastic collocation is proposed as a less expensive alternative to the traditional Galerkin approach. It is established that stochastic collocation is equivalent to a stochastic response surface approach. We show that multi-dimensional basis functions are constructed as tensor products of one-dimensional basis functions and discuss the treatment of polynomial and trigonometric nonlinearities. Parametric uncertainties are modeled by finite-support probability densities. Stochastic forcings are discretized using truncated Karhunen–Loève expansions. The companion paper, “Modeling Multibody Dynamic Systems With Uncertainties. Part II: Numerical Applications”, illustrates the use of the proposed methodology on a selected set of test problems. The overall conclusion is that despite its limitations, polynomial chaos is a powerful approach for the simulation of multibody systems with uncertainties.
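Stochastic collocation can be sketched on the simplest dynamic stand-in for a multibody system: an undamped oscillator with uncertain stiffness. The full simulation is run only at a few collocation nodes, and the weighted results give the state's mean and an "error bar" at the final time. The stiffness model is Gaussian here purely for the sketch (the paper uses finite-support densities), and all numbers are illustrative.

```python
import math

# Toy dynamic system: x'' = -k x with uncertain stiffness k = 4 + 0.4*Z,
# Z ~ N(0, 1). Collocate on 3 Gauss-Hermite nodes instead of many MC runs.
NODES = [-math.sqrt(3.0), 0.0, math.sqrt(3.0)]
WEIGHTS = [1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0]

def simulate(k, t_end=2.0, dt=1e-4):
    """Integrate x'' = -k x from x=1, v=0 (semi-implicit Euler)."""
    x, v = 1.0, 0.0
    for _ in range(int(t_end / dt)):
        v -= dt * k * x        # update velocity first: keeps energy bounded
        x += dt * v
    return x

runs = [simulate(4.0 + 0.4 * z) for z in NODES]       # 3 full simulations
mean_x = sum(w * x for w, x in zip(WEIGHTS, runs))
var_x = sum(w * (x - mean_x) ** 2 for w, x in zip(WEIGHTS, runs))
print(round(mean_x, 3), round(var_x, 4))   # state at t = 2 with an "error bar"
```

This is the sense in which collocation is a stochastic response surface: the deterministic solver is treated as a black box evaluated at quadrature points, with no Galerkin-style reformulation of the equations of motion.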

16.
A Hierarchical Switching Control Method for Plants with Large Uncertainty   (Total citations: 1; self-citations: 0; citations by others: 1)
Gao Feng, Li Keqiang, Wang Jianqiang, Lian Xiaomin. Control Engineering of China (《控制工程》), 2007, 14(3): 297-300, 324
For the control of plants with large model uncertainty, a multi-model hierarchical switching control method based on robust control theory is proposed. To reduce the number of models needed to cover the uncertainty, the plant is described by several multiplicative-uncertainty models, and an LMI method is applied to design the corresponding set of controllers. Noting that robust control commonly measures uncertainty by the system gain, a switching index function based on uncertainty gain estimation is designed, and it is used to connect the appropriate controller from the set into the feedback loop. Theoretical analysis shows that the closed-loop system is BIBO stable and has a certain disturbance rejection capability. Simulation results verify the effectiveness of the control method.
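The multi-model switching idea can be sketched in a deliberately simple discrete-time form: two candidate gain models, one controller tuned per model, and a switching index that picks the model best explaining the observed response. This is a toy stand-in, not the paper's LMI-designed controller set; every number below is invented.

```python
# Two-model switching sketch: the plant gain is uncertain (small or large);
# a switching index based on the estimated gain picks which controller
# from the pre-designed set closes the loop.
TRUE_GAIN = 5.0                       # unknown to the controller
MODELS = {"low": 1.0, "high": 5.0}    # candidate plant-gain models
GAINS = {"low": 0.8, "high": 0.16}    # K per model, chosen so K * gain ~ 0.8

y, ref, est = 0.0, 1.0, "low"         # start with the wrong model selected
history = []
for _ in range(50):
    e = ref - y
    u = GAINS[est] * e
    y_next = y + TRUE_GAIN * u * 0.5          # discrete plant step
    # switching index: which candidate model best explains the response?
    if abs(u) > 1e-9:
        observed = (y_next - y) / (u * 0.5)
    else:
        observed = MODELS[est]                # no excitation: keep current model
    est = min(MODELS, key=lambda m: abs(MODELS[m] - observed))
    y = y_next
    history.append(abs(e))

print(est, round(y, 3))   # settles on the "high"-gain model and tracks the reference
```

One plant step is enough here to switch away from the destabilizingly aggressive low-gain controller; in the paper the analogous decision is made by an uncertainty-gain-estimation index with a BIBO stability guarantee.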

17.
Land cover maps provide essential input data for various hydromorphological and ecological models, but the effect of land cover classification errors on these models has not been quantified systematically. This paper presents the uncertainty in hydromorphological and ecological model output for a large lowland river depending on the classification accuracy (CA) of a land cover map. Using four different models, we quantified the uncertainty for the three distributaries of the Rhine River in The Netherlands with respect to: (1) hydrodynamics (WAQUA model), (2) annual average suspended sediment deposition (SEDIFLUX model), (3) ecotoxicological hazards of contaminated sediment for a bird of prey, and (4) floodplain importance for desired habitat types and species (BIO-SAFE model). We carried out two Monte Carlo (n = 15) analyses: one at a 69% land cover CA, the other at 95% CA. Subsequently we ran all four models with the 30 realizations as input. The error in the current land cover map gave an uncertainty in design water levels of up to 19 cm. Overbank sediment deposition varied up to 100% in the area bordering the main channel, but when aggregated to the whole study area, the variation in sediment trapping efficiency was negligible. The ecotoxicological hazards, represented by the fraction of Little Owl habitat with potential cadmium exposure levels exceeding a corresponding toxicity threshold of 148 μg d⁻¹, varied between 54 and 60%, aggregated over the distributaries. The 68% confidence interval of floodplain importance for protected and endangered species varied between 10 and 15%. Increasing the classification accuracy to 95% significantly lowered the uncertainty of all models applied. Compared to landscaping measures, the effects due to the uncertainty in the land cover map are of the same order of magnitude.
Given the high financial costs of these landscaping measures, increasing the classification accuracy of land cover maps is a prerequisite for improving the assessment of their efficiency.

18.
Design of controllers for uncertain systems is inherently paradoxical. Adaptive control approaches claim to adapt system parameters against uncertainties, but only if these uncertainties change slowly enough. Alternatively, robust control methodologies claim to ensure system stability against uncertainties, but only if these uncertainties remain within known bounds. In reality, however, disturbances and uncertainties remain faithfully uncertain, i.e., they may be both fast and large. In this paper, a PI-adaptive fuzzy control architecture for a class of uncertain nonlinear systems is proposed that aims to provide added robustness in the presence of large and fast but bounded uncertainties and disturbances. While the proposed approach requires the uncertainties to be bounded, it does not require this bound to be known. Lyapunov analysis is used to prove asymptotic stability of the proposed approach. Application of the proposed method to a second-order inverted pendulum system demonstrates the effectiveness of the proposed approach. Specifically, system responses to fast versus slow and large versus small disturbances are considered in the presented simulation studies.

19.
The need to differentiate between epistemic and aleatory uncertainties is now well accepted by the risk analysis community. One way to do so is to model aleatory uncertainty by classical probability distributions and epistemic uncertainty by means of possibility distributions, and then propagate them by their respective calculi. The result of this propagation is a random fuzzy variable. When dealing with complex models, the computational cost of such a propagation quickly becomes too high. In this paper, we propose a numerical approach, the Random/Fuzzy (RaFu) method, whose aim is to determine an optimal numerical strategy so that computational costs are reduced to their minimum, using the theoretical frameworks mentioned above. We also give some means to account for the resulting numerical error. The benefits of the RaFu method are shown by comparing it to previously proposed methods.
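The hybrid propagation that produces a random fuzzy variable can be sketched on a toy model: the aleatory input is sampled, the epistemic parameter is carried as alpha-cut intervals of a triangular possibility distribution, and each output statistic becomes an interval per alpha level. The model, distributions, and numbers below are all illustrative assumptions, and this brute-force sketch is exactly the cost the RaFu method aims to optimize away.

```python
import random

random.seed(5)

def model(a, x):
    """Toy model with an epistemic parameter a and an aleatory input x."""
    return a * x + 1.0

# Epistemic: triangular possibility for a, core 2.0, support [1.5, 2.5]
def alpha_cut(alpha):
    return (1.5 + 0.5 * alpha, 2.5 - 0.5 * alpha)

# Aleatory: x ~ N(0, 1), propagated by sampling; each alpha level yields an
# interval-valued mean -> a fuzzy (interval-per-alpha) output statistic
alphas = [0.0, 0.5, 1.0]
samples = [random.gauss(0.0, 1.0) for _ in range(10_000)]
fuzzy_mean = {}
for alpha in alphas:
    a_lo, a_hi = alpha_cut(alpha)
    # the model is monotone in a for fixed x, so cut ends give the bounds
    lows = [min(model(a_lo, x), model(a_hi, x)) for x in samples]
    highs = [max(model(a_lo, x), model(a_hi, x)) for x in samples]
    fuzzy_mean[alpha] = (sum(lows) / len(lows), sum(highs) / len(highs))

print({a: (round(l, 2), round(h, 2)) for a, (l, h) in fuzzy_mean.items()})
```

The intervals nest as alpha rises (the alpha = 1 cut collapses to a point), which is the fuzzy-variable structure the abstract describes: randomness handled by sampling, imprecision preserved as intervals rather than averaged away.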

20.
Due to the instantaneous field-of-view (IFOV) of the sensor and the diversity of land cover types, some pixels, usually named mixed pixels, contain more than one land cover type. Soft classification can predict the proportion of each land cover type in mixed pixels, but not their spatial distribution. The spatial distribution of classes within mixed pixels can be recovered by super-resolution mapping (SRM). Typically, SRM involves two steps: soft class value estimation, which is similar to image super-resolution in image restoration, and land cover allocation. A new SRM approach utilizes a deep image prior (DIP) strategy combined with a super-resolution convolutional neural network (SRCNN) to estimate fine-resolution fraction images for each land cover type; then, a simple and efficient classifier is used to allocate subpixel land cover types under the constraint of the generated fine fraction images. The proposed approach uses prior information from the input images to update the network parameters and no longer requires training data. Experiments on three different cases demonstrate that the subpixel classification accuracy of the proposed DIP-based SRM approach is significantly better than that of three conventional SRM approaches and a transfer-learning-based neural network SRM approach. In addition, the DIP-SRM approach performs very robustly for small-area objects within multiple land cover types and significantly reduces soft classification uncertainty. The results of this paper provide an extension for utilizing SRCNN to address SRM issues in hyperspectral images.


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号