Similar Documents
20 similar documents found (search time: 15 ms)
1.
Life cycle assessment (LCA) calculates the environmental impact of a product over its entire life cycle. Uncertainty analysis is an important aspect of LCA and is usually performed using Monte Carlo sampling. In this study, Monte Carlo sampling, Latin hypercube sampling, quasi-Monte Carlo sampling, analytical uncertainty propagation and fuzzy interval arithmetic were compared on criteria such as convergence rate and output statistics. Each method was tested on three LCA case studies, which differed in size and behaviour. Uncertainty propagation in LCA using a sampling method yields more directly usable information than fuzzy interval arithmetic or analytical uncertainty propagation. Latin hypercube and quasi-Monte Carlo sampling determine the sample mean more accurately than Monte Carlo sampling and can even converge faster than Monte Carlo sampling for some of the case studies discussed in this paper.
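A minimal sketch (not from the paper; a toy two-factor lognormal impact model, scipy >= 1.7 assumed) of how the three sampling strategies can be compared on sample-mean convergence:

```python
import numpy as np
from scipy.stats import norm, qmc

def toy_lca(u):
    """Toy impact model: product of two lognormal factors, built from uniforms u."""
    z = norm.ppf(u)                      # transform U(0,1) -> N(0,1)
    return np.exp(0.1 * z[:, 0]) * np.exp(0.2 * z[:, 1])

n, d, rng = 2**10, 2, np.random.default_rng(0)
samplers = {
    "MC":    rng.random((n, d)),
    "LHS":   qmc.LatinHypercube(d=d, seed=0).random(n),
    "Sobol": qmc.Sobol(d=d, scramble=True, seed=0).random(n),
}
for name, u in samplers.items():
    running_mean = np.cumsum(toy_lca(u)) / np.arange(1, n + 1)
    print(f"{name:5s} mean after {n} samples: {running_mean[-1]:.5f}")
```

Plotting the running means against the sample count makes the faster convergence of the stratified and low-discrepancy designs visible directly.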

2.
Evaluating the measurement uncertainty of form and position errors is a current research focus in the measurement field. However, the complexity of the measurement and the diversity of ways to evaluate its results make uncertainty evaluation for form-and-position error measurements difficult in practice. To address this, a mathematical model is built with the least-squares method according to the form-error evaluation criterion; the transfer coefficients of the model parameters and the single-point uncertainty are determined; the specific measurement method and the uncertainty sources in the measurement process are analysed; and the uncertainty is first evaluated with the traditional GUM method. A Monte Carlo method based on pseudo-random numbers is then used to simulate the actual measurement data and obtain the uncertainty of the flatness error. A comparative experiment verifies the reliability and accuracy of the Monte Carlo evaluation of flatness uncertainty. The method does not require the transfer coefficients of the mathematical model, is easy to implement in MATLAB, and provides a simpler way to evaluate the uncertainty of flatness-error measurement results, making it well worth wider application.
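A minimal sketch of the Monte Carlo evaluation described above, under stated assumptions (a simulated 5 x 5 measurement grid, normal single-point noise, and a 0.5 um standard uncertainty are all hypothetical): a least-squares plane is fitted to each perturbed data set and the flatness error is taken as the residual range.

```python
import numpy as np

rng = np.random.default_rng(1)
x, y = np.meshgrid(np.linspace(0, 100, 5), np.linspace(0, 100, 5))
x, y = x.ravel(), y.ravel()
z_meas = 0.001 * x + 0.002 * y + rng.normal(0, 0.5, x.size)  # hypothetical readings, um

def flatness(z):
    """Least-squares plane fit; flatness = range of the residuals."""
    A = np.column_stack([x, y, np.ones_like(x)])
    coef, *_ = np.linalg.lstsq(A, z, rcond=None)
    r = z - A @ coef
    return r.max() - r.min()

# Propagate the single-point uncertainty through M Monte Carlo trials.
M, u_point = 20_000, 0.5
samples = np.array([flatness(z_meas + rng.normal(0, u_point, x.size))
                    for _ in range(M)])
lo, hi = np.percentile(samples, [2.5, 97.5])
print(f"flatness = {samples.mean():.3f} um, 95% interval [{lo:.3f}, {hi:.3f}] um")
```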

3.
The GUM method has long been used to evaluate the measurement uncertainty of engine-test thrust, but it has limitations: it requires assumptions about the probability distributions of the input and output quantities, and it approximates nonlinear models. Taking engine thrust-vector measurement as an example, this paper briefly describes the mathematical model of piezoelectric thrust-vector measurement, applies the GUM method to evaluate the uncertainty of the thrust-vector parameters, analyses the principle, evaluation procedure and applicability of the Monte Carlo method, implements it in software, and compares the resulting uncertainty evaluation with that of the GUM method. The comparison shows that the Monte Carlo method is more suitable than the GUM method for evaluating the uncertainty of thrust-vector parameters in engine tests.
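The nonlinearity that limits the GUM approach is easy to see in a sketch: thrust magnitude and deflection angle are nonlinear functions of the measured force components, so here their distributions are obtained by direct Monte Carlo propagation (all values are illustrative, not from the paper's software):

```python
import numpy as np

rng = np.random.default_rng(2)
M = 200_000
# Hypothetical measured force components (N) and their standard uncertainties.
Fx = rng.normal(50.0, 0.8, M)
Fy = rng.normal(30.0, 0.8, M)
Fz = rng.normal(5000.0, 5.0, M)

F = np.sqrt(Fx**2 + Fy**2 + Fz**2)                     # thrust magnitude
theta = np.degrees(np.arctan2(np.hypot(Fx, Fy), Fz))   # deflection angle

for name, s in (("F / N", F), ("theta / deg", theta)):
    lo, hi = np.percentile(s, [2.5, 97.5])
    print(f"{name}: mean {s.mean():.3f}, u {s.std(ddof=1):.3f}, "
          f"95% [{lo:.3f}, {hi:.3f}]")
```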

4.
A new model for the environmental assessment of environmental technologies, EASETECH, has been developed. The primary aim of EASETECH is to perform life-cycle assessment (LCA) of complex systems handling heterogeneous material flows. The objectives of this paper are to describe the EASETECH framework and its calculation structure. The main novelties compared to other LCA software are as follows. First, the focus is on material flow modelling: each flow is characterised as a mix of material fractions with different properties, and flow compositions are computed as a basis for the LCA calculations. Second, the tool has been designed to allow easy set-up of scenarios using a toolbox whose processes handle heterogeneous material flows in different ways and apply different emission calculations. Finally, tools for uncertainty analysis are provided, enabling the user to parameterise systems fully and propagate probability distributions through Monte Carlo analysis.

5.
Surrogate models are widely used to predict the response function of a system and to quantify the uncertainty associated with it. Constructing a surrogate model requires response quantities at preselected sample points, which can be obtained in two ways: either the surrogate is built from a one-shot experimental design, or the sample points are generated sequentially so that optimal sample points for the specific problem are obtained. This paper presents a comprehensive comparison between these two types of sampling techniques for constructing more accurate surrogate models. The two most popular one-shot sampling strategies, Latin hypercube sampling and the Sobol sequence, and four sequential experimental designs (SED), namely Monte Carlo intersite projected (MCIP), Monte Carlo intersite projected threshold (MCIPT), optimizer projected (OP) and LOLA-Voronoi (LV), are considered in the present study. Two widely used surrogate models, polynomial chaos expansion and Kriging, are used to check the applicability of the experimental design techniques. Three types of numerical problems are solved with the two surrogate models, considering each experimental design technique independently, and all results are compared with standard Monte Carlo simulation (MCS). Overall, the study shows that SEDs predict the response functions more accurately with an acceptable number of sample points, even for high-dimensional problems, maintaining the balance between accuracy and efficiency. More specifically, Kriging based on the MCIPT and LV methods outperforms the other combinations.
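A minimal sketch (scikit-learn and scipy assumed; the 2-D test function is a stand-in, and the sequential designs named above are not reproduced here) of how the two one-shot designs can be compared for a Kriging surrogate:

```python
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

f = lambda X: np.sin(3 * X[:, 0]) + np.cos(5 * X[:, 1])   # toy response
X_test = np.random.default_rng(3).random((500, 2))

for name, sampler in (("LHS", qmc.LatinHypercube(d=2, seed=0)),
                      ("Sobol", qmc.Sobol(d=2, scramble=True, seed=0))):
    X_train = sampler.random(64)                           # one-shot design
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3),
                                  normalize_y=True).fit(X_train, f(X_train))
    rmse = np.sqrt(np.mean((gp.predict(X_test) - f(X_test))**2))
    print(f"{name:5s} design, 64 points: RMSE = {rmse:.4f}")
```

A sequential design would instead grow `X_train` point by point, choosing each new location from a criterion such as intersite distance or local nonlinearity.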

6.
Model-based reliability analysis is affected by different types of epistemic uncertainty, due to inadequate data and modeling errors. When the physics-based simulation model is computationally expensive, a surrogate is often used in reliability analysis, introducing additional uncertainty. This paper proposes a framework to include statistical uncertainty and model uncertainty in surrogate-based reliability analysis. Two types of surrogates are considered: (1) general-purpose surrogate models that compute the system model output over the desired ranges of the random variables; and (2) limit-state surrogates. A unified approach is developed that connects model calibration using the Kennedy and O'Hagan (KOH) framework to the construction of the limit-state surrogate and to the estimation of uncertainty in the reliability analysis. The Gaussian process (GP) general-purpose surrogate of the physics-based simulation model obtained from the KOH calibration analysis is further refined at the limit state (local refinement) to construct the limit-state surrogate, which is used for reliability analysis. An efficient single-loop sampling approach using the probability integral transform is used to sample the input variables with statistical uncertainty. The variability in the GP prediction (surrogate uncertainty) is included in the reliability analysis through correlated sampling of the model predictions at different inputs. The Monte Carlo sampling (MCS) error, which represents the error due to a limited number of Monte Carlo samples, is quantified by constructing a probability density function. All the different sources of epistemic uncertainty are quantified and aggregated to estimate the overall uncertainty in the reliability analysis. Two examples demonstrate the proposed techniques.

7.
Parameter distribution estimation has long been a central issue in the uncertainty quantification of environmental models. Traditional approaches such as Markov chain Monte Carlo (MCMC) are prohibitively expensive for large, complex dynamic models. To reduce the number of model evaluations required, we propose an adaptive surrogate-modeling-based sampling strategy for parameter distribution estimation, named ASMO-PODE (Adaptive Surrogate Modeling-based Optimization – Parameter Optimization and Distribution Estimation). ASMO-PODE can estimate the parameter distribution using as little as one percent of the model evaluations required by a regular MCMC approach. Its effectiveness and efficiency were evaluated on two test problems and one land surface model, the Common Land Model. The results demonstrate that ASMO-PODE is an economical way to perform parameter optimization and distribution estimation.
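A schematic sketch (not the ASMO-PODE implementation; scikit-learn's GP serves as the surrogate and a one-parameter toy posterior stands in for the expensive model) of the adaptive idea: MCMC runs on a cheap surrogate of the log-posterior, and the true model is evaluated only at a few points resampled from the chain before refitting.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(4)
true_logpost = lambda t: -0.5 * ((t - 0.3) / 0.05)**2    # expensive in practice

X = rng.random((10, 1))                                  # initial design
y = true_logpost(X).ravel()
for it in range(5):
    gp = GaussianProcessRegressor(kernel=RBF(0.1), normalize_y=True).fit(X, y)
    # Short Metropolis chain on the cheap surrogate
    # (boundary handling kept deliberately simple for the sketch).
    chain, t = [], 0.5
    for _ in range(2000):
        t_new = np.clip(t + rng.normal(0, 0.05), 0, 1)
        logr = gp.predict([[t_new]])[0] - gp.predict([[t]])[0]
        if np.log(rng.random()) < logr:
            t = t_new
        chain.append(t)
    # Evaluate the true model at a few resampled chain points, then refit.
    t_add = rng.choice(chain[500:], size=3).reshape(-1, 1)
    X = np.vstack([X, t_add])
    y = np.append(y, true_logpost(t_add).ravel())
print(f"posterior mean ~ {np.mean(chain[500:]):.3f} (true mode 0.300)")
```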

8.
Nowadays, most of the mathematical models used in predictive microbiology are deterministic, i.e. their output is a single value for the microbial load at a given time instant. For more advanced exploitation of predictive microbiology in the context of hazard analysis and critical control points (HACCP) and risk analysis studies, stochastic models should be developed. Such models predict a probability mass function for the microbial load at a given time instant. An excellent method for dealing with stochastic variables is Monte Carlo analysis. In this research, the sensitivity of microbial growth model parameter distributions with respect to data quality and quantity is investigated using Monte Carlo analysis. The proposed approach is illustrated with experimental growth data. There appears to be a linear relation between data quality (expressed as the standard deviation of the normal distribution assumed on the experimental data) and model parameter uncertainty (expressed as the standard deviation of the model parameter distribution). The quantity of data (the number of experimental data points) as well as the positioning of these data points in time have a substantial influence on model parameter uncertainty. This has implications for optimal experiment design.
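A minimal sketch (an exponential-growth model with normal noise on log counts; all values hypothetical) of the Monte Carlo procedure: the data are perturbed repeatedly, the growth rate is refitted each time, and the spread of the refitted parameter is read off against the data noise level.

```python
import numpy as np

rng = np.random.default_rng(5)
t = np.linspace(0, 10, 8)                 # sampling times (h)
mu_true, n0 = 0.6, 3.0                    # growth rate (1/h), log10 N0
logN = n0 + mu_true * t / np.log(10)      # noise-free log10 counts

for sigma in (0.05, 0.1, 0.2):            # data quality: noise std on log10 N
    mus = []
    for _ in range(5000):
        y = logN + rng.normal(0, sigma, t.size)   # perturbed data set
        slope = np.polyfit(t, y, 1)[0]            # refit the growth model
        mus.append(slope * np.log(10))
    print(f"sigma={sigma:.2f}: u(mu) = {np.std(mus, ddof=1):.4f} 1/h")
```

Doubling `sigma` roughly doubles `u(mu)`, reproducing the linear quality-to-uncertainty relation the abstract describes.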

9.
张保强  陈梅玲  孙东阳  锁斌 《控制与决策》2020,35(10):2459-2465
To address uncertainty quantification and propagation in time-varying systems, a probability-box (p-box) evolution method is proposed. Based on the time-varying behaviour of the system, the method tracks how the cumulative distribution function of the system response evolves over time. Epistemic and aleatory uncertain parameters are separated into an outer and an inner loop: the epistemic parameters in the outer loop are quantified by Monte Carlo sampling, while the aleatory parameters in the inner loop are quantified by a non-intrusive polynomial chaos expansion based on stochastic collocation. Time-varying probability boxes are then constructed by taking the upper and lower bounds of the cumulative distribution functions of the system response at different time instants. The effectiveness of the proposed method is verified with a performance-degradation example of a delay circuit. The study shows that a time-varying probability box not only characterises the mixed uncertainty of the system at a given instant, but also reflects the time-varying behaviour of the output response and the trend of the output uncertainty over time.
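A minimal sketch of a time-varying probability box (plain double-loop sampling is used here in place of the paper's stochastic-collocation polynomial chaos for the inner loop; the model and parameter ranges are illustrative):

```python
import numpy as np

rng = np.random.default_rng(6)
times = np.linspace(0, 10, 6)
grid = np.linspace(0, 30, 200)            # response grid for the CDFs

for tm in times:
    cdfs = []
    for _ in range(50):                                  # epistemic outer loop
        a = rng.uniform(0.8, 1.2)                        # interval-valued parameter
        # Aleatory inner loop: lognormal response drifting with time.
        x = a * (1.0 + 0.1 * tm) * rng.lognormal(0.0, 0.2, 2000)
        cdfs.append(np.searchsorted(np.sort(x), grid) / x.size)
    lo, hi = np.min(cdfs, axis=0), np.max(cdfs, axis=0)  # p-box bounds at tm
    print(f"t={tm:4.1f}: max p-box width = {np.max(hi - lo):.3f}")
```

Plotting `lo` and `hi` at each time instant gives the evolving envelope of CDFs that the paper calls a time-varying probability box.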

10.
Systematic testing of integrated systems models is extremely important, but its difficulty is widely underestimated. The inherent complexity of integrated systems models, the philosophical debate about model validity and validation, the uncertainty in model inputs, parameters and future context, and the scarcity of field data all complicate model validation. This calls for a validation framework and procedures that can identify the strengths and weaknesses of the model using the available data from observations, the literature and expert opinion. This paper presents such a framework and procedure. Three tests, namely Parameter-Verification, Behaviour-Anomaly and Policy-Sensitivity, are selected to test a Rapid assessment Model for Coastal-zone Management (RaMCo). Morris sensitivity analysis, a simple expert elicitation technique and Monte Carlo uncertainty analysis are used to facilitate these three tests. The usefulness of the procedure is demonstrated with two examples.
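A minimal sketch (assuming the SALib package; the three-parameter model and its names are stand-ins, not RaMCo) of the Morris screening used to support such sensitivity tests:

```python
import numpy as np
from SALib.sample.morris import sample as morris_sample
from SALib.analyze import morris

problem = {
    "num_vars": 3,
    "names": ["growth_rate", "runoff_coef", "load_factor"],  # hypothetical
    "bounds": [[0.0, 1.0], [0.1, 0.9], [0.5, 2.0]],
}
model = lambda x: x[:, 0] * x[:, 2] + np.sin(3 * x[:, 1])    # stand-in model

X = morris_sample(problem, N=100, num_levels=4)  # trajectories of elementary effects
Si = morris.analyze(problem, X, model(X), num_levels=4)
for n, mu, sg in zip(problem["names"], Si["mu_star"], Si["sigma"]):
    print(f"{n:12s} mu* = {mu:.3f}  sigma = {sg:.3f}")
```

Large `mu*` flags influential parameters; large `sigma` flags nonlinear or interacting ones, which is the screening information the Parameter-Verification test needs.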

11.
In this work, we discuss practical methods for the assessment, comparison, and selection of complex hierarchical Bayesian models. A natural way to assess the goodness of a model is to estimate its future predictive capability by estimating expected utilities. Instead of just making a point estimate, it is important to obtain the distribution of the expected utility estimate, because it describes the uncertainty in the estimate. The distributions of the expected utility estimates can also be used to compare models, for example by computing the probability of one model having a better expected utility than another. We propose an approach using cross-validation predictive densities to obtain expected utility estimates and the Bayesian bootstrap to obtain samples from their distributions. We also discuss the probabilistic assumptions made and the properties of two practical cross-validation methods, importance sampling and k-fold cross-validation. As illustrative examples, we use multilayer perceptron neural networks and Gaussian processes with Markov chain Monte Carlo sampling in one toy problem and two challenging real-world problems.
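A minimal sketch (simple Gaussian models stand in for the paper's MLPs and GPs) of the two ingredients: k-fold cross-validation log predictive densities as the utility, and the Bayesian bootstrap (Dirichlet weights over observations) to sample the distribution of the expected-utility estimate.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)
y = rng.normal(1.0, 1.0, 120)
folds = np.array_split(rng.permutation(len(y)), 10)

def cv_lppd(fit):
    """Per-observation log predictive density under 10-fold cross-validation."""
    lp = np.empty(len(y))
    for idx in folds:
        train = np.setdiff1d(np.arange(len(y)), idx)
        mu, sd = fit(y[train])
        lp[idx] = norm.logpdf(y[idx], mu, sd)
    return lp

lp_a = cv_lppd(lambda tr: (tr.mean(), tr.std(ddof=1)))   # model A: free mean
lp_b = cv_lppd(lambda tr: (0.0, tr.std(ddof=1)))         # model B: fixed mean

# Bayesian bootstrap: Dirichlet(1,...,1) weights over the observations.
w = rng.dirichlet(np.ones(len(y)), size=4000)
u_a, u_b = w @ lp_a, w @ lp_b
print(f"P(model A better than B) = {np.mean(u_a > u_b):.3f}")
```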

12.
Probabilistic model checking has recently been used to assess, among other things, dependability measures for a variety of systems. However, the numerical methods employed, such as those supported by model checking tools like PRISM and MRMC, suffer from the state-space explosion problem. The main alternative is statistical model checking, which uses standard Monte Carlo simulation, but this performs poorly when small probabilities need to be estimated. We therefore propose a method based on importance sampling to speed up the simulation process in cases where the failure probabilities are small due to the high speed of the system's repair units. This setting arises naturally in Markovian models of highly dependable systems. We show that our method compares favourably to standard simulation, to existing importance sampling techniques, and to the numerical techniques of PRISM.
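A minimal sketch of why importance sampling outperforms standard simulation for small probabilities (a tilted exponential toy example, not the paper's failure-biasing scheme for Markovian repair models, which rests on the same change-of-measure idea):

```python
import numpy as np

rng = np.random.default_rng(8)
n, thr = 10_000, 8.0                      # estimate p = P(X > 8), X ~ Exp(1)

# Standard Monte Carlo: almost no sample hits the rare event.
x = rng.exponential(1.0, n)
p_mc = np.mean(x > thr)

# Importance sampling: draw from Exp(rate=0.25) so the tail is hit often,
# then reweight each sample by the likelihood ratio f(x)/g(x).
lam_g = 0.25
x_is = rng.exponential(1.0 / lam_g, n)
w = np.exp(-x_is) / (lam_g * np.exp(-lam_g * x_is))
est = w * (x_is > thr)
p_is = est.mean()
rel_err = est.std(ddof=1) / np.sqrt(n) / p_is

print(f"true 3.35e-4 | MC: {p_mc:.2e} | IS: {p_is:.2e} (rel. err {rel_err:.1%})")
```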

13.
Importance analysis aims to find the contributions of the inputs to the uncertainty in a model output. For structural systems involving inputs with distribution parameter uncertainty, the contributions of the inputs to the output uncertainty are governed by both the variability and the parameter uncertainty in their probability distributions. A natural and consistent way to obtain importance analysis results in such cases is a three-loop nested Monte Carlo (MC) sampling strategy, in which the parameters are sampled in the outer loop and the inputs are sampled in the inner nested double loop. However, the computational effort of this procedure is often prohibitive for engineering problems. This paper therefore proposes an efficient new algorithm for importance analysis of the inputs in the presence of parameter uncertainty. By introducing a 'surrogate sampling probability density function (SS-PDF)' and incorporating single-loop MC theory into the computation, the proposed algorithm reduces the original three-loop nested MC computation to a single loop in terms of model evaluations, which requires substantially less computational effort. Methods for choosing a proper SS-PDF are also discussed. The efficiency and robustness of the proposed algorithm are demonstrated by several examples.

14.
15.
Parameter estimation for agent-based and individual-based models (ABMs/IBMs) is often performed by manual tuning, and model uncertainty assessment is often ignored. Bayesian inference can jointly address these issues. However, due to the high computational requirements of these models and the technical difficulties of applying Bayesian inference to stochastic models, the exploration of its application to ABMs/IBMs has only just started. We demonstrate the feasibility of Bayesian inference for ABMs/IBMs with a Particle Markov Chain Monte Carlo (PMCMC) algorithm developed for state-space models. The algorithm profits from the model's hidden Markov structure by jointly estimating system states and the marginal likelihood of the parameters using time-series observations. The PMCMC algorithm performed well when tested on a simple predator-prey IBM using artificial observation data. Hence, it makes Bayesian inference feasible for ABMs/IBMs. This can yield additional insights into model behaviour and uncertainty, and extend the usefulness of ABMs/IBMs in ecological and environmental research.
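A minimal sketch of particle marginal Metropolis-Hastings (a linear-Gaussian random-walk state-space model stands in for the predator-prey IBM): a bootstrap particle filter supplies an unbiased likelihood estimate that drives the Metropolis acceptance step.

```python
import numpy as np

rng = np.random.default_rng(9)
T, sig_obs = 50, 0.5
x = np.cumsum(rng.normal(0, 0.3, T))          # latent random walk (true sig 0.3)
y = x + rng.normal(0, sig_obs, T)             # noisy observations

def pf_loglik(sig_proc, n_part=200):
    """Bootstrap particle filter estimate of log p(y | sig_proc)."""
    parts, ll = np.zeros(n_part), 0.0
    for t in range(T):
        parts = parts + rng.normal(0, sig_proc, n_part)      # propagate
        logw = -0.5 * ((y[t] - parts) / sig_obs)**2          # weight
        m = logw.max()
        w = np.exp(logw - m)
        ll += m + np.log(w.mean()) - 0.5 * np.log(2 * np.pi * sig_obs**2)
        parts = parts[rng.choice(n_part, n_part, p=w / w.sum())]  # resample
    return ll

theta, ll, chain = 0.5, pf_loglik(0.5), []
for _ in range(500):                          # PMMH chain over sig_proc
    prop = abs(theta + rng.normal(0, 0.05))   # reflected (symmetric) proposal
    ll_prop = pf_loglik(prop)
    if np.log(rng.random()) < ll_prop - ll:
        theta, ll = prop, ll_prop
    chain.append(theta)
print(f"posterior mean sig_proc ~ {np.mean(chain[100:]):.3f} (true 0.3)")
```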

16.
Multi-agent decision making is a research focus in artificial intelligence. Compared with single-agent problems, the policy search space of multi-agent decision making is much larger. Decentralised partially observable Markov decision processes (Dec-POMDPs) provide a general model for multi-agent decision making under uncertainty and have attracted much attention since they were proposed, but solving Dec-POMDPs has high computational complexity and large memory requirements. This paper therefore proposes a new Q-value function representation, the Monte Carlo Q-value function $Q_{MC}$, and proves theoretically that $Q_{MC}$ is an upper bound of the optimal Q-value function $Q^\ast$, which guarantees that heuristic search finds the optimal solution. An adaptive sampling method is used to balance convergence accuracy against solution time. Combining the precision of heuristic search with the generality of Monte Carlo random sampling, a Monte Carlo clustering/expansion algorithm (CEMC) based on $Q_{MC}$ is proposed; CEMC integrates Q-value function computation with policy search and avoids storing all value functions, computing them only on demand. Experimental results show that CEMC outperforms the best existing heuristic methods that use compact Q-value functions in both time and memory usage.

17.
A sparse seemingly unrelated regression (SSUR) model is proposed to generate substantively relevant structures in the high-dimensional distributions of seemingly unrelated regression (SUR) model parameters. The SSUR framework includes prior specifications, posterior computation using Markov chain Monte Carlo methods, evaluation of model uncertainty, and model structure search. Extensions to dynamic models embed general structure constraints and model uncertainty. The models represent specific varieties of those recently developed in the growing high-dimensional sparse modelling literature. Two simulated examples illustrate the model and highlight questions of model uncertainty, searching, and comparison. The model is then applied to two real-world examples in macroeconomics and finance, in which the identified structures have practical significance.

18.
Data augmentation and parameter expansion can lead to improved iterative sampling algorithms for Markov chain Monte Carlo (MCMC). Data augmentation allows simpler and more feasible simulation from a posterior distribution. Parameter expansion accelerates the convergence of iterative sampling algorithms by enlarging the parameter space. Data augmentation and parameter-expanded data augmentation MCMC algorithms are proposed for fitting probit models to independent ordinal response data. The algorithms are extended to fitting probit linear mixed models for spatially correlated ordinal data. The effectiveness of data augmentation and parameter-expanded data augmentation is illustrated using the probit model and ordinal response data; however, the approach can be used broadly across model and data types.
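A minimal sketch of data augmentation for a binary probit model (the classic Albert-Chib Gibbs scheme with a flat prior; the ordinal and parameter-expanded variants of the paper extend this same pattern): latent normals are drawn truncated according to the response, then the coefficients are drawn from their Gaussian conditional.

```python
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(10)
n, beta_true = 300, np.array([-0.5, 1.0])
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = (X @ beta_true + rng.normal(size=n) > 0).astype(int)

V = np.linalg.inv(X.T @ X)                  # posterior covariance (flat prior)
L = np.linalg.cholesky(V)
beta, draws = np.zeros(2), []
for _ in range(2000):
    mu = X @ beta
    # z_i ~ N(mu_i, 1), truncated to (0, inf) if y=1 and (-inf, 0) if y=0.
    lo = np.where(y == 1, -mu, -np.inf)
    hi = np.where(y == 1, np.inf, -mu)
    z = mu + truncnorm.rvs(lo, hi, size=n, random_state=rng)
    # beta | z ~ N((X'X)^{-1} X'z, (X'X)^{-1}).
    beta = V @ X.T @ z + L @ rng.normal(size=2)
    draws.append(beta)
print("posterior mean beta:", np.mean(draws[500:], axis=0).round(3))
```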

19.
The aim of this paper is to study topology optimization for mechanical systems with hybrid material and geometric uncertainties. The random variations are modeled by a memoryless transformation of random fields, which ensures their physical admissibility. The stochastic collocation method, combined with the proposed material and geometry uncertainty models, provides robust designs while reusing already developed deterministic solvers. The computational cost is reduced through sparse grids and discretization refinement, which are also proposed and demonstrated. The method is applied to the design of a minimum-compliance structure. By using an adaptive sparse-grid method, the proposed algorithm provides a computationally cheap alternative to previously introduced stochastic optimization methods based on Monte Carlo sampling.

20.
A powerful and flexible method for fitting dynamic models to missing and censored data is to use the Bayesian paradigm via data-augmented Markov chain Monte Carlo (DA-MCMC). This samples from the joint posterior for the parameters and the missing data, but requires high memory overheads for large-scale systems. In addition, designing efficient proposal distributions for the missing data is typically challenging. Pseudo-marginal methods instead integrate across the missing data using a Monte Carlo estimate of the likelihood, generated from multiple independent simulations of the model. These techniques can avoid the high memory requirements of DA-MCMC and, under certain conditions, produce the exact marginal posterior distribution for the parameters. A novel method is presented for implementing importance sampling for dynamic epidemic models, by conditioning the simulations on sets of validity criteria (based on the model structure) as well as the observed data. The flexibility of these techniques is illustrated using both removal-time and final-size data from an outbreak of smallpox. It is shown that these approaches can circumvent the need for reversible-jump MCMC and can allow inference in situations where DA-MCMC is impossible due to computationally infeasible likelihoods.
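A minimal sketch of the pseudo-marginal idea (a toy binomial simulator stands in for the epidemic model; the paper's validity-criteria conditioning is not reproduced): the intractable likelihood in the Metropolis-Hastings ratio is replaced by an unbiased Monte Carlo estimate from repeated model simulations, and the current estimate is reused rather than refreshed, which is what keeps the chain exact.

```python
import numpy as np

rng = np.random.default_rng(11)
y_obs = 7                                   # observed final size (toy data)

def lik_hat(theta, K=200):
    """Unbiased estimate of P(y_obs | theta) from K model simulations."""
    sims = rng.binomial(20, theta, K)       # stand-in stochastic simulator
    return np.mean(sims == y_obs)

theta, l_cur, chain = 0.5, lik_hat(0.5), []
for _ in range(5000):
    prop = theta + rng.normal(0, 0.05)
    if 0 < prop < 1:
        l_prop = lik_hat(prop)              # fresh estimate for the proposal
        if rng.random() * l_cur < l_prop:   # MH accept with estimated ratio
            theta, l_cur = prop, l_prop     # l_cur is reused, never refreshed
    chain.append(theta)
print(f"posterior mean theta ~ {np.mean(chain[1000:]):.3f} (MLE 7/20 = 0.35)")
```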
