Similar Documents
20 similar documents found.
1.
A mixture experiment is characterized by having two or more inputs that are specified as a percentage contribution to a total amount of material. In such situations, the input variables are correlated because they must sum to one. Consequently, additional care must be taken when fitting statistical models or visualizing the effect of one or more inputs on the response. In this article, we consider the use of a Gaussian process to model the output from a computer simulator taking a mixture input. We introduce a procedure to perform global sensitivity analysis of the code output, providing main effects and revealing interactions. The resulting methodology is illustrated using a function with analytically tractable results for comparison, a chemical compositional simulator, and a physical experiment. Supplementary materials providing assistance with implementing this methodology are available online.
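
As a rough illustration of the setup (not the authors' procedure; the toy simulator, kernel, and sample sizes below are all assumptions), one can fit a Gaussian process emulator over mixture inputs drawn from a Dirichlet distribution and estimate a main effect by fixing one component and averaging predictions over the rest:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def simulator(x):                        # cheap stand-in for an expensive code
    return np.sin(3 * x[:, 0]) + x[:, 1] ** 2

X = rng.dirichlet(np.ones(3), size=40)   # mixture design: each row sums to 1
gp = GaussianProcessRegressor(kernel=RBF(0.3), normalize_y=True).fit(X, simulator(X))

# Main effect of component 0: fix x0, redistribute the remaining mass among
# the other components at random, and average the emulator's prediction.
for x0 in (0.1, 0.3, 0.5):
    rest = rng.dirichlet(np.ones(2), size=500) * (1 - x0)
    grid = np.column_stack([np.full(500, x0), rest])
    print(f"x0 = {x0:.1f}: E[f | x0] ~ {gp.predict(grid).mean():.3f}")
```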

2.
Complex natural phenomena are increasingly investigated by the use of complex computer simulators. To leverage the advantages of simulators, observational data need to be incorporated in a probabilistic framework so that uncertainties can be quantified. A popular framework for such experiments is the statistical computer model calibration experiment. A limitation often encountered in current statistical approaches for such experiments is the difficulty in modeling high-dimensional observational datasets and simulator outputs as well as high-dimensional inputs. As the complexity of simulators only seems to grow, this challenge will continue unabated. In this article, we develop a Bayesian statistical calibration approach that is ideally suited for such challenging calibration problems. Our approach leverages recent ideas from Bayesian additive regression tree (BART) models to construct a random basis representation of the simulator outputs and observational data. The approach can flexibly handle high-dimensional datasets, high-dimensional simulator inputs, and calibration parameters while quantifying important sources of uncertainty in the resulting inference. We demonstrate our methodology on a CO2 emissions rate calibration problem and on a complex simulator of subterranean radionuclide dispersion, which simulates the spatio-temporal diffusion of radionuclides released during nuclear bomb tests at the Nevada Test Site. Supplementary computer code and datasets are available online.

3.
ISO/DIS 7870 presents the cumulative sum (CUSUM) chart, the moving average chart, and the exponentially weighted moving average (EWMA) chart as control charts that use accumulated data. In this paper, we compare the three control charts in terms of change-point estimation. We derive the probability distribution, bias, and mean square error of the change-point estimators using a Markov process and Monte Carlo simulation. The three charts perform almost equivalently in terms of average run length when the parameters of each chart are set appropriately. From the viewpoint of change-point estimation, however, we recommend the CUSUM chart.
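
A minimal sketch of the change-point logic behind this recommendation (the reference value, decision limit, and shift size are illustrative assumptions): run an upper CUSUM on data with a known shift and, once it signals, estimate the change point as the last time the statistic was at zero.

```python
import numpy as np

rng = np.random.default_rng(1)
k, h, tau = 0.5, 4.0, 50                      # reference value, decision limit, true change point
x = np.concatenate([rng.normal(0, 1, tau),    # in-control segment
                    rng.normal(1, 1, 100)])   # unit upward shift after tau

C, last_zero = 0.0, 0
for t, xt in enumerate(x, start=1):
    C = max(0.0, C + xt - k)                  # upper CUSUM recursion
    if C == 0.0:
        last_zero = t                         # candidate change-point estimate
    if C > h:
        print(f"signal at t = {t}, estimated change point = {last_zero}")
        break
```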

4.
In optimization under uncertainty for engineering design, the behavior of the system outputs due to uncertain inputs needs to be quantified at each optimization iteration, which can be computationally expensive. Multifidelity techniques can significantly reduce the computational cost of Monte Carlo sampling methods for quantifying the effect of uncertain inputs, but existing multifidelity techniques in this context apply only to Monte Carlo estimators that can be expressed as a sample average, such as estimators of statistical moments. Information reuse is a particular multifidelity method that treats previous optimization iterations as lower-fidelity models. This work generalizes information reuse to quantities whose estimators are not sample averages. The extension uses bootstrapping to estimate the error of estimators and the covariance between estimators at different fidelities. Specifically, the horsetail matching metric and the quantile function are considered as quantities whose estimators are not sample averages. In an optimization under uncertainty for an acoustic horn design problem, generalized information reuse demonstrated computational savings of over 60% compared with regular Monte Carlo sampling.
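
The bootstrap step can be sketched in a few lines. This is a generic illustration of estimating the error of a non-sample-average estimator (here a 95% quantile), not the paper's full information-reuse machinery; the output distribution is an assumption.

```python
import numpy as np

rng = np.random.default_rng(2)
samples = rng.lognormal(0.0, 0.5, size=1000)       # stand-in simulator outputs
q95 = np.quantile(samples, 0.95)                   # estimator that is not a sample average

# Bootstrap: resample the outputs to estimate the estimator's standard error.
boot = np.array([np.quantile(rng.choice(samples, samples.size, replace=True), 0.95)
                 for _ in range(500)])
print(f"q95 = {q95:.3f}, bootstrap SE = {boot.std(ddof=1):.3f}")
```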

5.
The two-parameter Weibull distribution is one of the most widely applied probability distributions, particularly in reliability and lifetime modeling. Correct estimation of the shape parameter of the Weibull distribution plays a central role in these areas of statistical analysis. Many different methods can be used to estimate this parameter, most of which utilize regression. In this paper, we present various regression methods for estimating the Weibull shape parameter and report an experimental study comparing them. The complete list of estimators considered in this study is as follows: ordinary least squares (OLS); weighted least squares (WLS: Bergman, F&T, Lu); non-parametric robust Theil's (Theil) and weighted Theil's (WeTheil); robust Winsorized least squares (WinLS); and M-estimators (Huber, Andrew, Tukey, Cauchy, Welsch, Hampel, and Logistic). Estimator performances are compared on bias and mean square error criteria using Monte Carlo simulation. The simulation results demonstrate that for small, complete, outlier-free datasets, the Bergman, F&T, and Lu estimators are more efficient than the others. When the dataset contains one or two outliers in the x direction, Theil is the most efficient estimator.
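
As an example of the simplest estimator in the list, the OLS variant regresses ln(-ln(1 - F_i)) on ln t_(i), where F_i is a median-rank plotting position; the slope estimates the shape parameter. The sketch below uses Bernard's approximation for the median ranks (the sample size and true parameters are illustrative).

```python
import numpy as np

rng = np.random.default_rng(3)
shape, scale, n = 2.0, 100.0, 30
t = np.sort(scale * rng.weibull(shape, n))        # complete (uncensored) sample

i = np.arange(1, n + 1)
F = (i - 0.3) / (n + 0.4)                         # Bernard's median-rank approximation
slope, intercept = np.polyfit(np.log(t), np.log(-np.log(1 - F)), 1)
print(f"estimated shape = {slope:.3f} (true value {shape})")
```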

6.
Computer simulators of real-world processes are often computationally expensive and require many inputs. The computational expense can be handled using emulation technology; however, highly multidimensional input spaces may require more simulator runs to train and validate the emulator. We aim to reduce the dimensionality of the problem by screening the simulator's inputs for nonlinear effects on the output, rather than distinguishing between negligible and active effects. Our proposed method builds on the elementary effects (EE) method for screening and uses a threshold value to separate the inputs with linear and nonlinear effects. The technique is simple to implement and acts sequentially to keep the number of simulator runs to a minimum while identifying the inputs that have nonlinear effects. The algorithm is applied to a set of simulated examples and a rabies disease simulator, where we observe run savings between 28% and 63% compared with the batch EE method. Supplementary materials for this article are available online.
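
A simplified, non-sequential sketch of the idea (the paper's algorithm is sequential and more economical; the toy function, step size, and threshold here are assumptions): compute elementary effects at several base points and flag an input as nonlinear when its EEs vary strongly across the input space.

```python
import numpy as np

rng = np.random.default_rng(4)

def f(x):                                  # toy simulator: only x1 is nonlinear
    return 2.0 * x[0] + np.sin(6.0 * x[1]) + 0.1 * x[2]

d, r, delta, threshold = 3, 20, 0.2, 0.5   # dims, base points, EE step, cutoff
ee = np.zeros((r, d))
for j in range(r):
    base = rng.uniform(0, 1 - delta, d)
    f0 = f(base)
    for i in range(d):
        xp = base.copy()
        xp[i] += delta
        ee[j, i] = (f(xp) - f0) / delta    # elementary effect of input i

sigma = ee.std(axis=0, ddof=1)             # spread of EEs per input
print("EE spread per input:", np.round(sigma, 3))
print("inputs flagged as nonlinear:", np.where(sigma > threshold)[0])
```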

7.
Large computer simulators usually have complex, nonlinear input-output functions. This complicated input-output relation can be analyzed by global sensitivity analysis; however, this usually requires massive Monte Carlo simulation. To reduce the number of simulations effectively, statistical techniques such as Gaussian process emulators can be adopted. The accuracy and reliability of these emulators strongly depend on the experimental design, that is, on how suitable evaluation points are selected. In this paper, a new sequential design strategy called hierarchical adaptive design is proposed to obtain an accurate emulator with the smallest possible number of simulations. The proposed design is tested on various standard analytic functions and on a challenging reservoir forecasting application. Comparisons with standard one-stage designs, such as maximin Latin hypercube designs, show that the hierarchical adaptive design produces a more accurate emulator with the same number of computer experiments. Moreover, a stopping criterion is proposed that determines the number of simulations needed to reach a required approximation accuracy.
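
A generic one-dimensional sketch of sequential adaptive design (this uses a plain maximum-predictive-variance rule, not the paper's hierarchical criterion; the test function and stopping tolerance are assumptions):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(5)
f = lambda x: np.sin(10 * x) + x           # cheap stand-in simulator

X = rng.uniform(0, 1, 5).reshape(-1, 1)    # small initial design
y = f(X).ravel()
candidates = np.linspace(0, 1, 201).reshape(-1, 1)

for step in range(10):
    gp = GaussianProcessRegressor(kernel=RBF(0.1), normalize_y=True).fit(X, y)
    _, sd = gp.predict(candidates, return_std=True)
    if sd.max() < 1e-3:                    # stopping rule on predictive sd
        break
    x_new = candidates[np.argmax(sd)]      # run where the emulator is least sure
    X = np.vstack([X, x_new])
    y = np.append(y, f(x_new))
print(f"stopped after {len(y)} simulator runs")
```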

8.
In this paper, we present and demonstrate a methodology to improve probabilistic fatigue crack growth (FCG) predictions through Bayesian updating with Markov chain Monte Carlo (MCMC) simulation. The methodology is demonstrated on a cracked pipe undergoing fatigue loading. Initial estimates of the FCG rate are made using the Paris law. The prior probability distributions of the Paris law parameters are taken from tests on specimens made of the same material as the pipe. Measured data on crack depth versus number of loading cycles are used to update the prior distributions via MCMC. The confidence interval on the predicted FCG rate is also estimated. For piping installed in a plant, the measured data can be considered equivalent to data obtained from in-service inspection. It is shown that the proposed methodology improves the fatigue life prediction. The number of observations used for updating is found to have a significant effect on the accuracy of the updated prediction.
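
A hedged sketch of the updating step: a Metropolis sampler for the Paris-law parameters (log C, m) given noisy crack-depth measurements. The constant stress-intensity range, prior box, and noise level are illustrative assumptions, not the paper's pipe model.

```python
import numpy as np

rng = np.random.default_rng(6)
cycles = np.arange(0.0, 5e4, 1e4)                  # inspection points (assumed)

def crack_depth(logC, m, a0=1.0, dK=20.0):
    # Toy integration of the Paris law da/dN = C * dK^m with constant dK
    return a0 + 10.0 ** logC * dK ** m * cycles

obs = crack_depth(-8.0, 3.0) + rng.normal(0, 0.05, cycles.size)  # synthetic data

def log_post(logC, m, sigma=0.05):
    if not (-10.0 < logC < -6.0 and 2.0 < m < 4.0):  # flat prior box
        return -np.inf
    return -0.5 * np.sum(((obs - crack_depth(logC, m)) / sigma) ** 2)

theta = np.array([-8.5, 2.5])
lp, chain = log_post(*theta), []
for _ in range(5000):                              # Metropolis random walk
    prop = theta + rng.normal(0.0, [0.05, 0.02])
    lp_prop = log_post(*prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta)
print("posterior mean (logC, m):", np.mean(chain[1000:], axis=0))
```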

9.
This paper presents a novel Monte Carlo method (WeLMoS, weighted likelihood Monte Carlo sampling) that has been developed to perform Bayesian analyses of monitoring data. The WeLMoS method randomly samples parameter vectors from continuous prior probability distributions and then weights each vector by its likelihood (i.e., its goodness of fit to the measurement data). In order to quality-assure the method and assess its strengths and weaknesses, a second method (MCMC, Markov chain Monte Carlo) has also been developed; it uses the Metropolis algorithm to sample directly from the posterior distribution of the parameters. The methods are evaluated and compared using an artificially generated case involving exposure to a plutonium nitrate aerosol. In addition to calculating the uncertainty on internal dose, the methods can also calculate the probability distribution of model parameter values given the observed data. In other words, the techniques provide a powerful tool for obtaining the parameter estimates that best fit the data, together with the associated uncertainty on those estimates. Current applications of the methodology, including the determination of lung solubility parameters from volunteer and cohort data, are also discussed.
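
A minimal sketch of the weighted-likelihood idea as described (a one-parameter toy with a flat prior and Gaussian likelihood, all assumed): sample from the prior, weight by likelihood, and summarize the weighted sample.

```python
import numpy as np

rng = np.random.default_rng(7)
data = rng.normal(2.0, 0.5, size=10)              # stand-in monitoring data

theta = rng.uniform(0.0, 5.0, size=20_000)        # draws from a flat prior
loglik = -0.5 * np.sum(((data[None, :] - theta[:, None]) / 0.5) ** 2, axis=1)
w = np.exp(loglik - loglik.max())                 # likelihood weights (stabilized)
w /= w.sum()

mean = np.sum(w * theta)
sd = np.sqrt(np.sum(w * (theta - mean) ** 2))
print(f"posterior mean = {mean:.3f} +/- {sd:.3f}")
```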

10.
An extended finite element method (XFEM) coupled with a Monte Carlo approach is proposed to quantify the uncertainty in the homogenized effective elastic properties of multiphase materials. The methodology allows for an arbitrary number, aspect ratio, location, and orientation of elliptical inclusions within a matrix, without the need for fine meshes in the vicinity of tightly packed inclusions and, especially, without the need to remesh for every generated realization of the microstructure. Moreover, the number of degrees of freedom in the enriched elements is dynamically reallocated for each Monte Carlo sample run based on the given volume fraction. The main advantage of the proposed XFEM-based methodology is a major reduction in computational effort in extensive Monte Carlo simulations compared with the standard FEM approach. Monte Carlo and XFEM work extremely efficiently together. The Monte Carlo approach allows the size, aspect ratios, orientations, and spatial distribution of the elliptical inclusions to be modeled as random variables with any prescribed probability distributions. Numerical results are presented and the uncertainty of the homogenized elastic properties is discussed.

11.
The two-parameter Weibull model can be re-parameterized in terms of the shape parameter and a prefixed age at which a reliability estimate is required. Using uniform and beta priors, some new point and lower-bound reliability estimators are derived. These estimators appear well suited to fast, inexpensive reliability evaluation during production. Their characteristics are studied by means of a Monte Carlo simulation and compared with those of the maximum likelihood estimators.

12.
This paper explores the possibilities of numerical methods for uncertainty analysis of personal dosimetry systems. Using a numerical method based on Monte Carlo sampling, the probability density function (PDF) of the dose measured with a personal dosemeter can be calculated from type-test measurements. From this PDF, the combined standard uncertainty in measurements with the dosemeter and the confidence interval can be calculated. The method calculates the output PDF directly from the PDFs of the inputs of the system, such as the spectral distribution of the radiation and the distributions of detector parameters like sensitivity and zero signal. The method can be used not only in its own right but also for validating other methods, because it is not limited by the restrictions that apply to the law of propagation of uncertainty and the central limit theorem. The use of the method is demonstrated on the type-test data of the NRG-TLD.
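
The propagation itself is straightforward to sketch (the detector model and input PDFs below are illustrative assumptions, not the NRG-TLD type-test data): sample the input distributions and build the output PDF of the measured dose directly.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 100_000
true_dose = 1.0                                    # mSv (assumed)
sensitivity = rng.normal(1.0, 0.05, n)             # relative detector response
zero_signal = rng.normal(0.0, 0.02, n)             # background reading

measured = true_dose * sensitivity + zero_signal   # simple measurement model
lo, hi = np.percentile(measured, [2.5, 97.5])
print(f"std uncertainty = {measured.std(ddof=1):.3f}, "
      f"95% interval = [{lo:.3f}, {hi:.3f}]")
```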

13.
Monte Carlo methods provide a powerful technique for estimating the average radiation flux in a volume (or across a surface) in cases where analytical solutions may not be possible. Unfortunately, Monte Carlo simulations typically provide only integral results and do not offer any further detail about the distribution of the flux with respect to space, angle, time, or energy. In the functional expansion tally (FET), a Monte Carlo simulation is used to estimate the functional expansion coefficients of flux distributions with respect to an orthogonal set of basis functions. The expansion coefficients are then used in post-processing to reconstruct a series approximation to the true distribution. Discrete-event FET estimators are derived, and their application in estimating radiation flux or current distributions is demonstrated. Sources of uncertainty in the FET are quantified, and estimators for the statistical and truncation errors are derived. Numerical results are presented to support the theoretical development.
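
A minimal one-dimensional FET sketch (the sampled "flux" and truncation order are assumptions): estimate Legendre expansion coefficients from Monte Carlo samples, then reconstruct the distribution from the truncated series.

```python
import numpy as np
from numpy.polynomial.legendre import Legendre

rng = np.random.default_rng(9)
x = np.clip(rng.normal(0.2, 0.3, 50_000), -1, 1)   # sampled tally positions on [-1, 1]

N = 8
coeffs = np.zeros(N)
for k in range(N):
    # a_k = (2k+1)/2 * E[P_k(X)], estimated by the sample mean of P_k
    coeffs[k] = (2 * k + 1) / 2 * Legendre.basis(k)(x).mean()

recon = Legendre(coeffs)                           # truncated series approximation
print("reconstructed density at x = 0.2:", float(recon(0.2)))
```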

14.
In this study, we consider two control chart design structures, covering known and unknown parameters, a variety of probability distributions, and runs rules. The design structures depend on constants that are generally hard to compute analytically. To construct these constants, and to evaluate the performance of the design structures through performance measures, we illustrate a Monte Carlo simulation procedure for researchers and practitioners. Furthermore, based on this procedure, we provide a program in the R language to compute the values of the different constants and performance measures. The results show that the design structures for known and unknown parameters, under a variety of runs rules and probability distributions, outperform existing structures. Moreover, the design structure for unknown parameters behaves like the design structure for known parameters, indicating that it can resolve the issues with runs rules that generally arise when parameters are estimated. Two real-life examples are included, in which a physicochemical characteristic of groundwater and a plasticizer characteristic of a petrochemical process are monitored with the design structures.
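
The article provides an R program; as a language-consistent illustration of the simulation idea (a minimal sketch, not their code), the snippet below estimates the classical chart constant d2 = E[R]/sigma for subgroup size n by Monte Carlo.

```python
import numpy as np

rng = np.random.default_rng(12)
n, reps = 5, 200_000
samples = rng.normal(0.0, 1.0, (reps, n))          # in-control subgroups
d2 = (samples.max(axis=1) - samples.min(axis=1)).mean()
print(f"d2 for n={n}: {d2:.4f} (tabulated value ~ 2.326)")
```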

15.
Mayo and Gray [Am Statist 51 (1997) 122] introduced the leverage-residual weighted elemental (LRWE) classification of regression estimators and proposed a new method of estimation called trimmed elemental estimation (TEE). In this article, we perform a simulation study of the efficiency of certain TEE estimators relative to ordinary least squares under normal errors, and of their robustness under various non-normal error distributions, in the context of the simple linear regression model. Comparisons among these estimators are made on the basis of mean square error and percentiles of the absolute estimation errors in the simulations.

16.
Dong Xian, Wang Zhan. Engineering Mechanics (工程力学), 2015, 32(12): 49-57.
To address the random influence of uncertain parameters on structural mechanical performance, this paper uses the good small-sample learning and generalization ability of a hybrid neural network to construct the complex functional relationship of the structural response, and an improved chaotic particle swarm optimization algorithm to optimize the network architecture. Combined with the Monte Carlo method, a stochastic analysis of the structure is carried out, and the global sensitivity coefficients of the random variables are computed from a new sensitivity measure proposed in the paper. Mathematical and engineering examples verify the feasibility of the proposed method, and the probability distribution curves of the structural response realistically reflect the actual situation. Moreover, the proposed stochastic sensitivity computation better reflects the correlation and sensitivity of each random variable with respect to the structural response.

17.
The EWMA chart is effective in detecting small shifts in the process mean or process variance. Numerous EWMA charts for the process variance have been suggested in the literature. In this article, new one-sided and two-sided EWMA charts are developed for monitoring the variance of a normal process. New unbiased estimators of the process variance are first developed and then incorporated into the new EWMA chart statistics. The Monte Carlo simulation method is adopted to evaluate the zero-state and steady-state run-length performances of the proposed EWMA variance charts, in comparison with those of three existing EWMA variance charts and the weighted adaptive CUSUM variance chart. The findings reveal that the proposed charts generally perform better than the existing charts. An example of application shows the implementation of the proposed and existing charts in detecting increases or decreases in the process variance.
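
A hedged run-length sketch using a generic EWMA of ln(S^2) (not the article's new unbiased-estimator statistics; the smoothing constant and limit width are assumptions): simulate zero-state run lengths under a variance shift to estimate the ARL.

```python
import numpy as np
from scipy.special import digamma, polygamma

rng = np.random.default_rng(10)
lam, L, n = 0.1, 3.0, 5                        # smoothing constant, limit width, subgroup size
nu = n - 1
mu_ln = digamma(nu / 2) + np.log(2 / nu)       # E[ln S^2] in control (sigma = 1)
sd_ln = np.sqrt(polygamma(1, nu / 2))          # SD[ln S^2]
limit = L * sd_ln * np.sqrt(lam / (2 - lam))   # asymptotic two-sided control limit

def run_length(sigma):
    z = 0.0
    for t in range(1, 200_000):
        s2 = rng.normal(0.0, sigma, n).var(ddof=1)
        z = (1 - lam) * z + lam * (np.log(s2) - mu_ln)
        if abs(z) > limit:
            return t
    return 200_000

for sigma in (1.0, 1.3):
    arl = np.mean([run_length(sigma) for _ in range(300)])
    print(f"sigma = {sigma}: estimated zero-state ARL = {arl:.0f}")
```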

18.
Chen Chao, Lü Zhenzhou. Engineering Mechanics (工程力学), 2016, 33(2): 25-33.
To reasonably measure the influence of the fuzziness of the distribution parameters of random input variables on the statistical features of the output performance, a global sensitivity index for fuzzy distribution parameters is proposed, and an efficient method for computing the index is studied. First, the mechanism by which uncertainty propagates from the fuzzy distribution parameters to the statistical features of the model output response is analyzed. Taking the expected output response as an example, the influence of a fuzzy distribution parameter is measured by the average difference between the unconditional membership function of the output mean and the membership function conditional on a given value of the fuzzy parameter, and a global sensitivity index for fuzzy distribution parameters is thereby established. Second, to reduce the computational cost of the proposed index and improve efficiency, the extended Monte Carlo simulation (EMCS) method is adopted to estimate the functional relationship between the input distribution parameters and the statistical features of the model output response. Finally, numerical examples verify the accuracy and efficiency of the proposed method.

19.
This paper develops a novel failure-probability-based global sensitivity index by introducing the Bayes formula into the moment-independent global sensitivity index, to approximate the effect of input random variables or stochastic processes on time-variant reliability. The proposed index estimates the effect of uncertain inputs on the time-variant reliability by comparing the unconditional probability density function of the input variables with their probability density function conditional on the failure state. Furthermore, a single-loop active-learning Kriging method combined with metamodel-based importance sampling is employed to improve computational efficiency. The accuracy of the results obtained with the Kriging model is verified against reference results from Monte Carlo simulation. Four examples are investigated to demonstrate the significance of the proposed failure-probability-based global sensitivity index and the effectiveness of the computational method.
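
A hedged sketch of the density-comparison idea behind such an index (a static toy limit state, plain Monte Carlo instead of active-learning Kriging, and a kernel density estimate are all assumptions): keep the samples falling in the failure domain and measure the gap between each input's unconditional density and its density conditional on failure.

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.stats import gaussian_kde, norm

rng = np.random.default_rng(11)
n = 50_000
x1, x2 = rng.normal(0, 1, n), rng.normal(0, 1, n)
fail = (2.0 - x1 - 0.2 * x2) < 0                # toy limit state: failure when g < 0

grid = np.linspace(-4, 4, 200)
for name, x in (("x1", x1), ("x2", x2)):
    cond = gaussian_kde(x[fail])(grid)          # input density conditional on failure
    gap = 0.5 * trapezoid(np.abs(cond - norm.pdf(grid)), grid)
    print(f"failure-based sensitivity of {name}: {gap:.3f}")
```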

20.
Nonparametric point and interval reliability estimators are obtained that assume only a continuous underlying time-to-failure density function. These estimators are simple to use and involve cumulative normal probabilities. Sample calculations are presented to illustrate their use. Using the point estimator, performance comparisons are made with two standard estimators by means of Monte Carlo simulation within the two-parameter Weibull family of distributions. In the Weibull reliability region above 90 percent, the estimator generally outperforms the standard estimators for nearly all Weibull functions; it performs exceptionally well in the region above 99 percent for many Weibull functions.
