Similar articles (20 results)
1.
Efficient sampling strategies that scale with the size of the problem, the computational budget, and users' needs are essential for sampling-based analyses such as sensitivity and uncertainty analysis. In this study, we propose a new strategy, called Progressive Latin Hypercube Sampling (PLHS), which sequentially generates sample points while progressively preserving the distributional properties of interest (Latin hypercube properties, space-filling, etc.) as the sample size grows. Unlike Latin hypercube sampling, PLHS generates a series of smaller subsets (slices) such that (1) the first slice is a Latin hypercube, (2) the progressive union of slices remains a Latin hypercube and achieves maximum stratification in any one-dimensional projection, and as such (3) the entire sample set is a Latin hypercube. The performance of PLHS is compared with that of benchmark sampling strategies across multiple case studies involving Monte Carlo simulation and sensitivity and uncertainty analysis. Our results indicate that PLHS improves the efficiency, convergence, and robustness of sampling-based analyses.
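As an illustration of the baseline that PLHS extends, the sketch below generates a plain Latin hypercube sample with NumPy and stacks independently drawn slices. It is a minimal sketch only: unlike true PLHS it does not enforce that the union of slices itself stays Latin hypercube. The function name `latin_hypercube` and the slice sizes are illustrative, not from the paper.

```python
import numpy as np

def latin_hypercube(n, d, rng=None):
    """Plain Latin hypercube sample: n points in [0, 1]^d with one point
    per equal-probability stratum in every one-dimensional projection."""
    rng = np.random.default_rng(rng)
    # jittered stratum centres, independently permuted in each dimension
    u = (np.arange(n)[:, None] + rng.random((n, d))) / n
    for j in range(d):
        u[:, j] = rng.permutation(u[:, j])
    return u

# Progressive use in the spirit of PLHS: draw slices and analyse the growing
# union; a true PLHS additionally keeps each union (near-)Latin hypercube.
slices = [latin_hypercube(10, 3, rng=s) for s in range(4)]
sample = np.vstack(slices[:2])      # union of the first two slices
print(sample.shape)                 # (20, 3)
```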

2.
The development of robust dynamical systems and networks, such as autonomous aircraft systems capable of accomplishing complex missions, faces challenges due to dynamically evolving uncertainties arising from model uncertainty, the need to operate in hostile, cluttered urban environments, and the distributed and dynamic nature of communication and computation resources. Model-based robust design is difficult because of the complexity of the hybrid dynamic models, which include continuous vehicle dynamics and discrete models of computation and communication, and because of the size of the problem. We overview recent advances in methodology and tools to model, analyze, and design robust autonomous aerospace systems operating in uncertain environments, with emphasis on efficient uncertainty quantification and robust design, using case studies of missions that include model-based target tracking and search, and trajectory planning in an uncertain urban environment. To show that the methodology applies generally to uncertain dynamical systems, we also present examples applying the new methods to efficient uncertainty quantification of energy usage in buildings and to stability assessment of interconnected power networks.

3.
Parameter uncertainty and sensitivity for a watershed-scale simulation model in Portugal were explored to identify the model parameters most critical to calibration and prediction. The research is intended to provide guidance on allocating limited data-collection and model-parameterization resources for modelers working in any data- and resource-limited environment. The watershed-scale hydrology and water quality simulation model Hydrologic Simulation Program – FORTRAN (HSPF) was used to predict the hydrology of the Lis River basin in Portugal. The model was calibrated for the 5-year period 1985–1989 and validated for the 4-year period 2003–2006. Agreement between simulated and observed streamflow was satisfactory according to performance measures such as the Nash–Sutcliffe efficiency (E), deviation of runoff (Dv), and coefficient of determination (R2). The Generalized Likelihood Uncertainty Estimation (GLUE) method was used to establish uncertainty bounds for the simulated flow, using the Nash–Sutcliffe coefficient as the likelihood measure. Sensitivity analysis results indicate that runoff estimates are most sensitive to parameters related to climate conditions, soil, and land use. These results suggest that, even though climate conditions are generally most significant in water balance modeling, attention should also be given to land use characteristics. Specifically with respect to HSPF, the two most sensitive parameters, INFILT and LZSN, both depend directly on soil and land use characteristics.
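For reference, the Nash–Sutcliffe efficiency cited above is one minus the ratio of the sum of squared residuals to the variance of the observations. A minimal sketch follows; the streamflow values are illustrative, not data from the Lis River study.

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations.
    E = 1 is a perfect fit; E <= 0 means the model is no better than
    simply predicting the mean of the observed series."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    sse = np.sum((observed - simulated) ** 2)
    return 1.0 - sse / np.sum((observed - observed.mean()) ** 2)

# toy example with made-up streamflow values (m^3/s)
obs = [12.0, 18.5, 9.3, 25.1, 14.7]
sim = [11.2, 19.8, 10.1, 23.0, 15.5]
print(round(nash_sutcliffe(obs, sim), 3))
```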

4.
This paper reviews various non-traditional uncertainty models for engineering computation and responds to criticism of those models. This criticism alleges that such models represent uncertain quantities inappropriately and lack numerically efficient algorithms for solving industry-sized problems. Non-traditional uncertainty models, however, counter this criticism by enabling the solution of problems that defy appropriate treatment with traditional probabilistic computations due to non-frequentist characteristics, a lack of available information, or subjective influences. The usefulness of such models is evident in many cases in engineering practice, for example numerical investigations in the early design stage, the consideration of exceptional environmental conditions and socio-economic changes, and the prediction of the behavior of novel materials based on limited test data. Non-traditional uncertainty models thus represent a beneficial supplement to the traditional probabilistic model and a sound basis for decision-making. In this paper, non-probabilistic uncertainty modeling is discussed by means of interval modeling and fuzzy methods. Mixed probabilistic/non-probabilistic uncertainty modeling is treated in the framework of imprecise probabilities, covering selected components of evidence theory, interval probabilities, and fuzzy randomness. The capabilities of the selected approaches are addressed with a view to realistic modeling and processing of uncertain quantities in engineering. Associated numerical methods for propagating uncertainty through structural computations are elucidated and considered from a numerical-efficiency perspective. The benefit of these developments is emphasized in conjunction with the meaning of the uncertain results and in view of engineering applications.
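A minimal sketch of the interval-modeling idea discussed above, assuming a toy displacement calculation u = F/k with interval-valued load and stiffness; the class, operators, and values are illustrative, not from the paper.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    """Closed interval [lo, hi] used as a non-probabilistic uncertainty model."""
    lo: float
    hi: float

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __truediv__(self, other):
        # valid only when 0 is not contained in `other`
        q = (self.lo / other.lo, self.lo / other.hi,
             self.hi / other.lo, self.hi / other.hi)
        return Interval(min(q), max(q))

# Propagate an interval-valued load and stiffness through u = F / k
# (monotone in both, so evaluating at the bounds is exact here).
F = Interval(9.0, 11.0)     # load, e.g. kN
k = Interval(180.0, 220.0)  # stiffness, e.g. kN/m
print(F / k)                # interval of possible displacements
```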

5.
Sensitivity analysis (SA) is a commonly used approach for identifying important parameters that dominate model behavior. We use a newly developed software package, a Problem Solving environment for Uncertainty Analysis and Design Exploration (PSUADE), to evaluate the effectiveness and efficiency of ten widely used SA methods, seven qualitative and three quantitative. All SA methods are tested using a variety of sampling techniques to screen the most sensitive (i.e., important) parameters from the insensitive ones. The Sacramento Soil Moisture Accounting (SAC-SMA) model, which has thirteen tunable parameters, is used for illustration, with the South Branch Potomac River basin near Springfield, West Virginia, USA, as the study area. The key findings are: (1) Among the qualitative SA methods, Correlation Analysis (CA), Regression Analysis (RA), and Gaussian Process (GP) screening are not effective in this example. Morris One-At-a-Time (MOAT) screening is the most efficient, needing only 280 samples to identify the most important parameters, but it is the least robust method. Multivariate Adaptive Regression Splines (MARS), Delta Test (DT), and Sum-Of-Trees (SOT) screening need about 400–600 samples for the same purpose; Monte Carlo (MC), Orthogonal Array (OA), and Orthogonal Array based Latin Hypercube (OALH) are appropriate sampling techniques for them. (2) Among the quantitative SA methods, the Fourier Amplitude Sensitivity Test (FAST) needs at least 2777 samples to identify parameter main effects. The McKay method needs about 360 samples to evaluate main effects and more than 1000 samples to assess two-way interaction effects; OALH and LPτ (LPTAU) sampling are more appropriate for it. For the Sobol' method, at least 1050 samples are needed to compute the first-order and total sensitivity indices correctly. These comparisons show that qualitative SA methods are more efficient but less accurate and robust than quantitative ones.
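A rough sketch of one-at-a-time elementary-effects screening in the spirit of MOAT, not the PSUADE implementation (which uses Morris trajectory designs); the function name, step size, and toy model are assumptions for illustration only.

```python
import numpy as np

def morris_screening(model, d, r=20, delta=0.25, rng=0):
    """Crude Morris-style screening: r random one-at-a-time elementary
    effects per input, summarised by mean absolute effect (mu*) and
    standard deviation (sigma). Inputs are assumed scaled to [0, 1]."""
    rng = np.random.default_rng(rng)
    effects = np.zeros((r, d))
    for i in range(r):
        x = rng.random(d)
        y0 = model(x)
        for j in range(d):
            x_j = x.copy()
            step = delta if x_j[j] + delta <= 1.0 else -delta
            x_j[j] += step
            effects[i, j] = (model(x_j) - y0) / step
    return np.abs(effects).mean(axis=0), effects.std(axis=0)   # mu*, sigma

# toy model: x0 and x1 matter, x2 is inert
mu_star, sigma = morris_screening(lambda x: x[0] ** 2 + 2 * x[1] + 0.0 * x[2], d=3)
print(mu_star.round(2), sigma.round(2))
```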

6.
7.
Statistical calibration of model parameters conditioned on observations is performed in a Bayesian framework by evaluating the joint posterior probability density function (pdf) of the parameters. The posterior pdf is most often inferred by sampling the parameters with Markov Chain Monte Carlo (MCMC) algorithms. Recently, an alternative technique for calculating the so-called Maximal Conditional Posterior Distribution (MCPD) has appeared. This technique infers the individual probability distribution of a given parameter under the condition that the other parameters of the model are optimal. Whereas the MCMC approach samples probable draws of the parameters, the MCPD samples the most probable draws when one of the parameters is set at various prescribed values. In this study, the results of a user-friendly MCMC sampler, DREAM(ZS), are compared with those of the MCPD sampler. The differences between the two approaches are highlighted before they are compared on the inference of two analytical distributions exhibiting collinearity and multimodality. The performances of both samplers are then compared on an artificial multistep outflow experiment from which soil hydraulic parameters are inferred. The results show that parameter and predictive uncertainties can be accurately assessed with both the MCMC and MCPD approaches.
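For contrast with DREAM(ZS), the sketch below shows the simplest possible MCMC sampler, a random-walk Metropolis algorithm, applied to a toy correlated, multimodal posterior. It illustrates the idea of sampling probable parameter draws, not the samplers actually compared in the paper; all names and settings are illustrative.

```python
import numpy as np

def metropolis(log_post, x0, n_iter=5000, step=0.5, rng=0):
    """Plain random-walk Metropolis sampler (far simpler than DREAM(ZS)):
    propose a Gaussian jump, accept with probability min(1, posterior ratio)."""
    rng = np.random.default_rng(rng)
    x = np.asarray(x0, dtype=float)
    lp = log_post(x)
    chain = np.empty((n_iter, x.size))
    for i in range(n_iter):
        prop = x + step * rng.standard_normal(x.size)
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        chain[i] = x
    return chain

# toy bimodal, correlated posterior in 2-D
log_post = lambda t: -0.5 * ((t[0] ** 2 - 2) ** 2 + (t[1] - t[0]) ** 2)
samples = metropolis(log_post, x0=[0.0, 0.0])
print(samples.mean(axis=0), samples.std(axis=0))
```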

8.
A sensitivity analysis for uncertainty quantified with evidence theory is developed. In reliability quantification, classical probabilistic analysis has been a popular approach in many engineering disciplines. However, when sufficient data cannot be obtained to construct probability distributions in a large, complex system, the classical probabilistic methodology may not be appropriate for quantifying the uncertainty. Evidence theory, also called Dempster–Shafer theory, has the potential to quantify both aleatory (random) and epistemic (subjective) uncertainties because it can directly handle insufficient data and incomplete knowledge. In this paper, interval information is assumed to be the best representation of imprecise information, and the sensitivity of plausibility in evidence theory is derived analytically with respect to expert opinions and structural parameters. The results of the sensitivity analysis are expected to be useful for finding the major contributors to the quantified uncertainty and for redesigning the structural system to minimize risk.
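The sketch below illustrates, under assumed interval-valued expert opinions with basic probability assignments, how belief and plausibility of an event are obtained from focal elements. Values and names are illustrative only; the paper's analytical sensitivity derivation is not reproduced here.

```python
# Focal elements: interval estimates from experts with basic probability
# assignments (BPAs) that sum to 1. All values are illustrative.
focal_elements = [((2.0, 5.0), 0.5),   # expert 1: x in [2, 5], weight 0.5
                  ((4.0, 7.0), 0.3),   # expert 2
                  ((1.0, 9.0), 0.2)]   # expert 3 (vague)

def belief_plausibility(focal_elements, lo, hi):
    """Belief = mass of focal elements entirely inside [lo, hi];
    plausibility = mass of focal elements that intersect [lo, hi]."""
    bel = sum(m for (a, b), m in focal_elements if lo <= a and b <= hi)
    pl = sum(m for (a, b), m in focal_elements if b >= lo and a <= hi)
    return bel, pl

# Event of interest: response exceeds 6.0, i.e. the interval [6, inf)
bel, pl = belief_plausibility(focal_elements, 6.0, float("inf"))
print(bel, pl)   # belief 0.0, plausibility 0.5
```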

9.
In this study, a hybrid sequential data assimilation and probabilistic collocation (HSDAPC) approach is proposed for analyzing uncertainty propagation and parameter sensitivity of hydrologic models. In HSDAPC, the posterior probability distributions of model parameters are first estimated through a particle filter based on streamflow discharge data. A probabilistic collocation method (PCM) is then employed to show uncertainty propagation from model parameters to model outputs. The temporal dynamics of parameter sensitivities are generated from the polynomial chaos expansion (PCE) produced by PCM, which can reveal the dominant model components under different catchment conditions. The maximal information coefficient (MIC) is finally employed to characterize the correlation/association between parameter sensitivity and catchment precipitation, potential evapotranspiration, and observed discharge. The proposed method is applied to the Xiangxi River located in the Three Gorges Reservoir area. The results show that: (i) the proposed HSDAPC approach can generate effective 2nd- and 3rd-order PCE models that provide accurate predictions; (ii) the 2nd-order PCE, which runs nearly ten times faster than the hydrologic model, can adequately represent the original model for showing uncertainty propagation in a hydrologic simulation; (iii) the slow-flow (Rs) and quick-flow (Rq) parameters in Hymod show significant sensitivities during the simulation periods, whereas the distribution factor (α) shows the least sensitivity to model performance; and (iv) the parameter sensitivities show significant correlation with catchment hydro-meteorological conditions, especially during the rainy period, with MIC values larger than 0.5. Overall, the results indicate that uncertainty propagation and temporal parameter sensitivities can be effectively characterized through the proposed HSDAPC approach.
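A minimal sketch of a non-intrusive 2nd-order PCE surrogate of the kind mentioned above: Legendre basis terms up to total degree two are regressed against model outputs by least squares, assuming inputs scaled to [-1, 1]. The toy model and function names are illustrative, not the Hymod/PCM setup of the study.

```python
import numpy as np
from numpy.polynomial import legendre as L
from itertools import product

def pce_fit(X, y, degree=2):
    """Non-intrusive PCE for inputs on [-1, 1]: regress y on tensor products
    of Legendre polynomials with total degree <= `degree` (simplified sketch)."""
    d = X.shape[1]
    multi = [m for m in product(range(degree + 1), repeat=d) if sum(m) <= degree]
    Phi = np.column_stack([
        np.prod([L.legval(X[:, j], np.eye(degree + 1)[k])
                 for j, k in enumerate(m)], axis=0)
        for m in multi])
    coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return multi, coef

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = 1.0 + 0.5 * X[:, 0] + X[:, 1] ** 2          # toy stand-in for a hydrologic model
multi, coef = pce_fit(X, y)
print(dict(zip(multi, coef.round(3))))          # recovered PCE coefficients
```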

10.
Manual calibration of distributed models with many unknown parameters can result in equifinality and high uncertainty. In this study, the Generalized Likelihood Uncertainty Estimation (GLUE) technique was used to address these issues through uncertainty and sensitivity analysis of a distributed watershed-scale model (SAHYSMOD) for predicting changes in the groundwater levels of the Rechna Doab basin, Pakistan. The study proposes and describes a stepwise methodology for SAHYSMOD uncertainty analysis that has not been explored before. One thousand input data files created through Monte Carlo simulation were classified as behavioral or non-behavioral sets using threshold likelihood values. The model was calibrated (1983–1988) and validated (1998–2003), with satisfactory agreement between simulated and observed data and acceptable values of the statistical performance indices. Approximately 70% of the observed groundwater level values fell within the uncertainty bounds. Groundwater pumping (Gw) and hydraulic conductivity (Kaq) were found to be highly sensitive parameters affecting groundwater recharge.
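A minimal sketch of the GLUE-style classification and uncertainty bounds described above, assuming one likelihood value and one prediction series per Monte Carlo run; the threshold, quantiles, and synthetic data are illustrative only.

```python
import numpy as np

def glue_bounds(likelihoods, predictions, threshold, q=(0.05, 0.95)):
    """GLUE-style uncertainty bounds: keep 'behavioural' runs whose likelihood
    exceeds the threshold, then take likelihood-weighted quantiles of their
    predictions at each time step."""
    behavioural = likelihoods >= threshold
    w = likelihoods[behavioural]
    w = w / w.sum()
    preds = predictions[behavioural]            # shape (n_behavioural, n_times)
    lower, upper = [], []
    for t in range(preds.shape[1]):
        order = np.argsort(preds[:, t])
        cdf = np.cumsum(w[order])
        lower.append(preds[order, t][np.searchsorted(cdf, q[0])])
        upper.append(preds[order, t][np.searchsorted(cdf, q[1])])
    return np.array(lower), np.array(upper), int(behavioural.sum())

rng = np.random.default_rng(1)
likelihoods = rng.uniform(-0.5, 0.9, size=500)        # e.g. Nash-Sutcliffe values
predictions = rng.normal(100, 15, size=(500, 12))     # synthetic groundwater levels
lo, hi, n_behavioural = glue_bounds(likelihoods, predictions, threshold=0.5)
print(n_behavioural, lo.round(1)[:3], hi.round(1)[:3])
```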

11.
Parametrized surrogate models are used in alloy modeling to quickly obtain otherwise expensive properties, such as quantum mechanical energies, which are then used to optimize, or simply compute, some alloy quantity of interest, e.g., a phase transition, subject to given constraints. Once learned on a data set, the surrogate can compute alloy properties fast, but with increased uncertainty compared to the original computer code. This uncertainty propagates to the quantity of interest, and in this work we seek to quantify it. Furthermore, since the alloy property is expensive to compute, only a limited amount of data is available from which the surrogate can be learned; this limited data further increases the uncertainty in the quantity of interest, and we show how to capture this as well. We cannot, and should not, trust the surrogate before quantifying the uncertainties in the application at hand. We therefore develop a fully Bayesian framework for quantifying the uncertainties in alloy quantities of interest that originate from replacing the expensive computer code with the fast surrogate and from limited data. We consider a surrogate popular in alloy modeling, the cluster expansion, and aim to quantify how well it captures quantum mechanical energies. Our framework is applicable to other surrogates and alloy properties.

12.
Air pollution in the atmosphere arises from complex non-linear relationships involving anthropogenic and biogenic precursor emissions. Because of this complexity, Decision Support Systems (DSSs) are important tools to help environmental authorities control and improve air quality, reducing pollution impacts on humans and ecosystems. DSSs implementing cost-effectiveness or multi-objective methodologies require fast air quality models able to properly describe the relations between emissions and air quality indexes. These surrogate models (SMs) are identified by processing deterministic model simulation data. In this work, the Lazy Learning technique has been applied to reproduce the relations linking precursor emissions and pollutant concentrations. Since computational time has to be minimized without losing precision and accuracy, tests aimed at reducing the amount of input data have been performed on a case study over the Lombardia Region in Northern Italy.

13.
In this article we present our recent efforts in designing a comprehensive, consistent scientific workflow, nicknamed Wolf2 Pack, for force-field optimization in computational chemistry. Atomistic force fields represent a multiscale bridge that connects high-resolution quantum mechanics knowledge to coarser molecular-mechanics-based models. Force-field optimization has so far been a time-consuming and error-prone process, and it is a topic where a scientific workflow can provide great benefits. As a case study we generate a gas-phase force field for methanol using Wolf2 Pack, with special attention given to deriving partial atomic charges.

14.
Problems characterized by qualitative uncertainty described by expert judgments can be addressed with the fuzzy logic modeling paradigm, structured within a so-called fuzzy expert system (FES) to handle and propagate the experts' qualitative, linguistic assessments. Once constructed, the FES model should be verified to make sure that it correctly represents the experts' knowledge. For FES verification, there are typically not enough data to directly compare the expert-inferred and FES-inferred solutions, so indirect methods must be developed to determine whether the expert system provides a proper representation of the expert knowledge. One possible approach is to examine the importance of the different input factors in determining the output of the FES model and to verify whether this agrees with the experts' conceptualization of the model. In this view, two sensitivity and uncertainty analysis techniques applicable to generic FES models are proposed in this paper, with the objective of providing verification tools that support the experts during the FES design phase. To analyze the insights gained by using the proposed techniques, a case study concerning an FES developed in the field of human reliability analysis is considered.

15.
Land change modelers often create future maps using a reference land use map. However, future land use maps may mislead decision-makers, who are often unaware of the sensitivity of and uncertainty in land use maps due to errors in data. Since most metrics that communicate uncertainty require reference land use data to calculate accuracy, assessing uncertainty becomes challenging when no reference land use map for the future is available. This study develops a new conceptual framework for sensitivity analysis and uncertainty assessment (FSAUA) which compares multiple maps under various data-error scenarios. FSAUA performs sensitivity analyses on land use maps using a reference map and assesses uncertainty in predicted maps. FSAUA was applied using three well-known land change models (ANN, CART and MARS) in Delhi, India, and was found to be a practical tool for communicating uncertainty to end-users who develop planning decisions.

16.
The sensitivity and uncertainty of family balance (i.e., combined net farm and non-farm income less family expenses), an output of an integrated model coupling water resource, agronomic, and socio-economic models, are evaluated for five smallholder farming groups (A–E) in the Olifants Basin. The crop management practices considered include conventional rainfed cropping, untied ridges, planting basins, and supplemental irrigation. Scatter plots were used to infer the variables to which family balance is most sensitive, while the Monte Carlo method, using random sampling, was used to propagate the uncertainty in the model inputs into family balance probability distributions. A non-linear correlation between in-season rainfall and family balance arises from several factors that affect crop yield, indicating the complexity of the farm family finance resource base in relation to climate, crop management practices, and the environmental resources of soil and water. Stronger relationships were obtained between family balance and evapotranspiration than with in-season rainfall. Sensitivity analysis results suggest more targeted investment in monitoring yield, in-season rainfall, supplemental irrigation, and maize price to reduce family balance uncertainty, which varied from 42% to 54% at the 90% confidence level. While supplemental irrigation offers the greatest marginal increase in yields, its wide adoption is limited by the availability of water and by infrastructure cost.
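A minimal sketch of the Monte Carlo propagation step described above, with entirely hypothetical input distributions standing in for the integrated model's yield, price, non-farm income, and expense inputs; the percentile summary mirrors how an uncertainty interval for family balance could be reported.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000   # Monte Carlo draws; all distributions below are illustrative

yield_t_ha  = rng.normal(2.5, 0.6, n).clip(min=0)   # maize yield, t/ha
area_ha     = 1.5                                    # cultivated area
maize_price = rng.normal(250.0, 40.0, n)             # price per tonne
non_farm    = rng.normal(800.0, 150.0, n)            # non-farm income per season
expenses    = rng.normal(1200.0, 200.0, n)           # family expenses per season

family_balance = yield_t_ha * area_ha * maize_price + non_farm - expenses

lo, med, hi = np.percentile(family_balance, [5, 50, 95])
print(f"median {med:.0f}, 90% interval [{lo:.0f}, {hi:.0f}]")
```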

17.
This work presents a holistic 'closed loop' approach for the development of models of biological systems. The ever-increasing availability of experimental information necessitates a systematic methodology to organise and utilise these data. Herein, we present a biological model-building framework that maps the treatment of information from the initial conception of the model, through its experimental validation, to its application in model-based optimisation studies. We highlight and discuss current issues associated with the development of mathematical models of biological systems and share our perspective on a holistic 'closed loop' approach that will facilitate control of the in vitro through the in silico.

18.
The variational method is applied to the state equations in order to derive the costate equations and their boundary conditions. The eigenvalues of the state and costate equations are then analyzed. It is shown that the eigenvalues of the Jacobian matrices of the state equations and of the transposed Jacobian matrices of the costate equations are analytically and numerically the same. Based on this eigenvalue analysis, the costate equations with their boundary conditions are numerically integrated. Numerical results for the eigenvalue problems of the state and costate equations and for a maximization problem are finally presented.
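A quick numerical check of the underlying linear-algebra fact used above: a matrix and its transpose share a characteristic polynomial, det(A - λI) = det(Aᵀ - λI), and hence identical eigenvalues. The random 4×4 matrix is only a stand-in for a state Jacobian.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))          # stand-in for a state Jacobian

eig_A = np.sort_complex(np.linalg.eigvals(A))
eig_AT = np.sort_complex(np.linalg.eigvals(A.T))
print(np.allclose(eig_A, eig_AT))        # True, up to numerical round-off
```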

19.
Optimal Latin Hypercubes (OLH) created in a constrained design space might produce Designs of Experiments (DoE) containing infeasible points if the underlying formulation disregards the constraints. Simply omitting these infeasible points leads to a DoE with fewer experiments than desired and to a set of points that is not optimally distributed; by using the same number of points, a better mapping of the feasible space can be achieved. This paper describes the development of a procedure that creates OLHs for constrained design spaces, extending an existing formulation to meet this requirement. Here, the OLH is found by minimizing the Audze-Eglais potential energy of the points using a permutation genetic algorithm. Examples validate the procedure and demonstrate its capability to find space-filling Latin Hypercubes in arbitrarily shaped design spaces.
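A minimal sketch of the Audze-Eglais objective and of the permutation encoding a genetic algorithm would search over, shown for an unconstrained design space with illustrative 5-point designs; the paper's constrained-space extension is not reproduced.

```python
import numpy as np

def audze_eglais(points):
    """Audze-Eglais potential: sum over point pairs of 1 / squared distance.
    Lower values correspond to more evenly spread (space-filling) designs."""
    diff = points[:, None, :] - points[None, :, :]
    d2 = np.sum(diff ** 2, axis=-1)
    iu = np.triu_indices(len(points), k=1)
    return np.sum(1.0 / d2[iu])

def lhs_from_permutations(perms):
    """Build a Latin hypercube design from one permutation of 0..n-1 per
    dimension (the encoding a permutation GA would operate on)."""
    perms = np.asarray(perms)
    n = perms.shape[1]
    return (perms.T + 0.5) / n

# two candidate 5-point, 2-D designs; a GA would search over such permutations
good = lhs_from_permutations([[0, 1, 2, 3, 4], [2, 4, 1, 3, 0]])
poor = lhs_from_permutations([[0, 1, 2, 3, 4], [0, 1, 2, 3, 4]])  # diagonal
print(audze_eglais(good) < audze_eglais(poor))   # True: the first spreads points better
```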

20.
Absolute deviation is a commonly used risk measure that has attracted increasing attention in portfolio optimization. Existing mean-absolute deviation models are devoted to either stochastic or fuzzy portfolio optimization. However, practical investment decision problems often involve a mixture of randomness and fuzziness, such as stochastic returns with fuzzy information, so it is necessary to model the portfolio selection problem in such a hybrid uncertain environment. In this paper, we employ random fuzzy variables to describe the stochastic return on an individual security with ambiguous information. We first define the absolute deviation of a random fuzzy variable and then employ it as the risk measure to formulate mean-absolute deviation portfolio optimization models. To find the optimal portfolio, we design a random fuzzy simulation and a simulation-based genetic algorithm to solve the proposed models. Finally, a numerical example with synthetic data is presented to illustrate the validity of the method.
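A minimal sketch of the mean-absolute deviation risk measure for a purely stochastic portfolio, using scenario returns and a crude random-weight search in place of the paper's random fuzzy simulation and genetic algorithm; all numbers and names are illustrative.

```python
import numpy as np

def mean_absolute_deviation(weights, scenario_returns):
    """Portfolio MAD risk: E[|R_p - E[R_p]|] estimated from return scenarios.
    scenario_returns has shape (n_scenarios, n_assets)."""
    rp = scenario_returns @ weights
    return np.mean(np.abs(rp - rp.mean()))

rng = np.random.default_rng(7)
returns = rng.normal([0.06, 0.09, 0.12], [0.05, 0.10, 0.20], size=(1000, 3))

# crude random search for the lowest-MAD portfolio meeting a target return
best_w, best_mad, target = None, np.inf, 0.08
for _ in range(5000):
    w = rng.dirichlet(np.ones(3))             # non-negative weights summing to 1
    if (returns @ w).mean() >= target:
        mad = mean_absolute_deviation(w, returns)
        if mad < best_mad:
            best_w, best_mad = w, mad
print(best_w.round(3), round(best_mad, 4))
```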

