Similar Literature
20 similar documents found
1.
Gaussian processes have become a standard framework for modeling deterministic computer simulations and producing predictions of the response surface. This article investigates a new covariance function that is shown to offer superior prediction compared to the more common covariances for computer simulations of real physical systems. This is demonstrated via a gamut of realistic examples. A simple, closed-form expression for the covariance is derived as a limiting form of a Brownian-like covariance model as it is extended to some hypothetical higher-dimensional input domain, and so we term it a lifted Brownian covariance. This covariance has connections with the multiquadric kernel. Through analysis of the kriging model, this article offers some theoretical comparisons between the proposed covariance model and existing covariance models. The major emphasis of the theory is explaining why the proposed covariance is superior to its traditional counterparts for many computer simulations of real physical systems. Supplementary materials for this article are available online.
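To make the kriging construction above concrete, here is a minimal sketch of an interpolating kriging predictor. The lifted Brownian covariance itself is not given in this abstract, so the positive-definite inverse multiquadric kernel (a relative of the multiquadric mentioned above) is used as a hypothetical stand-in; the toy simulator output and all parameter values are likewise invented for illustration.

```python
import numpy as np

def inv_multiquadric(X1, X2, c=1.0):
    # k(x, x') = 1 / sqrt(1 + ||x - x'||^2 / c^2): a positive-definite stand-in kernel.
    d2 = np.sum((X1[:, None, :] - X2[None, :, :]) ** 2, axis=-1)
    return 1.0 / np.sqrt(1.0 + d2 / c ** 2)

def krige(X, y, Xnew, c=1.0, nugget=1e-10):
    # Simple zero-mean kriging predictor: k(x*, X) K^{-1} y.
    K = inv_multiquadric(X, X, c) + nugget * np.eye(len(X))
    kstar = inv_multiquadric(Xnew, X, c)
    return kstar @ np.linalg.solve(K, y)

# Toy deterministic "simulator" (invented for illustration only).
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(30, 2))
y = np.sin(6 * X[:, 0]) + X[:, 1] ** 2
Xnew = rng.uniform(0, 1, size=(5, 2))
print(krige(X, y, Xnew))
```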

2.
The spatial random effects model is flexible in modeling spatial covariance functions and is computationally efficient for spatial prediction via fixed rank kriging (FRK). However, the model depends on a class of basis functions, which, if not selected properly, may result in unstable or undesirable results. Additionally, the maximum likelihood (ML) estimates of the model parameters are commonly computed using an expectation-maximization (EM) algorithm, which further limits its applicability when a large number of basis functions are required. In this research, we propose a class of basis functions extracted from thin-plate splines. The functions are ordered in terms of their degrees of smoothness with higher-order functions corresponding to larger-scale features and lower-order ones corresponding to smaller-scale details, leading to a parsimonious representation of a (nonstationary) spatial covariance function with the number of basis functions playing the role of spatial resolution. The proposed class of basis functions avoids the difficult knot-allocation or scale-selection problem. In addition, we show that ML estimates of the random effects covariance matrix can be expressed in simple closed forms, and hence the resulting FRK can accommodate a much larger number of basis functions without numerical difficulties. Finally, we propose to select the number of basis functions using Akaike’s information criterion, which also possesses a simple closed-form expression. The whole procedure, involving no additional tuning parameter, is efficient to compute, easy to program, automatic to implement, and applicable to massive amounts of spatial data even when they are sparsely and irregularly located. Proofs of the theorems and an R package autoFRK are provided in supplementary materials available online.
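As a rough, simplified illustration of the fixed rank kriging idea only (not the autoFRK implementation), the sketch below builds a few thin-plate-spline-type radial basis functions and predicts under the low-rank random-effects structure y = Fη + ε. The basis centres, the plug-in random-effects covariance M, and the noise variance are all assumptions, and the n × n solve shown here ignores the low-rank algebra that makes real FRK fast.

```python
import numpy as np

def tps_basis(locs, centres):
    # 2-D thin-plate-spline radial functions psi(r) = r^2 log r, with psi(0) = 0.
    r = np.linalg.norm(locs[:, None, :] - centres[None, :, :], axis=-1)
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(r > 0, r ** 2 * np.log(r), 0.0)

rng = np.random.default_rng(1)
locs = rng.uniform(0, 1, size=(200, 2))             # observation locations (toy)
centres = rng.uniform(0, 1, size=(15, 2))           # basis "knots" (assumed)
F = tps_basis(locs, centres)                        # n x K basis matrix

# Simulated data from the spatial random effects model y = F eta + eps.
eta = rng.multivariate_normal(np.zeros(15), np.eye(15))
y = F @ eta + 0.1 * rng.standard_normal(200)

# Low-rank kriging predictor at new sites, with M and sigma2 treated as known here.
M, sigma2 = np.eye(15), 0.1 ** 2
Sigma = F @ M @ F.T + sigma2 * np.eye(200)          # n x n covariance (low rank + nugget)
new = rng.uniform(0, 1, size=(10, 2))
Fnew = tps_basis(new, centres)
pred = Fnew @ M @ F.T @ np.linalg.solve(Sigma, y)   # E[y(new) | data]
print(pred)
```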

3.
The construction of decision-theoretical Bayesian designs for realistically complex nonlinear models is computationally challenging, as it requires the optimization of analytically intractable expected utility functions over high-dimensional design spaces. We provide the most general solution to date for this problem through a novel approximate coordinate exchange algorithm. This methodology uses a Gaussian process emulator to approximate the expected utility as a function of a single design coordinate in a series of conditional optimization steps. It has flexibility to address problems for any choice of utility function and for a wide range of statistical models with different numbers of variables, numbers of runs and randomization restrictions. In contrast to existing approaches to Bayesian design, the method can find multi-variable designs in large numbers of runs without resorting to asymptotic approximations to the posterior distribution or expected utility. The methodology is demonstrated on a variety of challenging examples of practical importance, including design for pharmacokinetic models and design for mixed models with discrete data. For many of these models, Bayesian designs are not currently available. Comparisons are made to results from the literature, and to designs obtained from asymptotic approximations. Supplementary materials for this article are available online.
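The sketch below illustrates the flavour of an approximate coordinate exchange loop in the simplest possible setting: one design variable per run, a toy stand-in for the Monte Carlo expected-utility approximation, and a GP emulator (scikit-learn's GaussianProcessRegressor) fitted over each coordinate in turn. The utility function, candidate grid, and all settings are invented and do not come from the paper.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def expected_utility(design, rng, n_mc=100):
    # Stand-in for a noisy Monte Carlo approximation of an expected utility;
    # the "true" utility is invented purely so the loop has something to optimise.
    noise = 0.05 * rng.standard_normal(n_mc).mean()
    return -np.sum((design - 0.3) ** 2) + noise

def approximate_coordinate_exchange(n_runs=6, n_sweeps=3, n_cand=20, seed=0):
    rng = np.random.default_rng(seed)
    design = rng.uniform(-1, 1, size=n_runs)           # one design variable per run
    for _ in range(n_sweeps):
        for i in range(n_runs):                        # optimise one coordinate at a time
            cand = np.linspace(-1, 1, n_cand)
            utils = []
            for c in cand:                             # noisy utility evaluations
                trial = design.copy()
                trial[i] = c
                utils.append(expected_utility(trial, rng))
            gp = GaussianProcessRegressor().fit(cand.reshape(-1, 1), np.array(utils))
            fine = np.linspace(-1, 1, 200).reshape(-1, 1)
            design[i] = fine[np.argmax(gp.predict(fine)), 0]   # emulator-based 1-D maximiser
    return design

print(approximate_coordinate_exchange())               # coordinates should drift toward 0.3
```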

4.
Metamodels based on responses from designed (numerical) experiments may form efficient approximations to functions in structural analysis. They can improve the efficiency of engineering optimization substantially by uncoupling computationally expensive analysis models and (iterative) optimization procedures. In this paper we focus on two strategies for building metamodels, namely response surface methods (RSM) and kriging. We discuss key concepts for both approaches, present strategies for model training, and indicate ways to enhance these metamodeling approaches by including design sensitivity data. The latter may be advantageous in situations where information on design sensitivities is readily available, as is the case with, e.g., finite element models. Furthermore, we illustrate the use of RSM and kriging in a numerical model study and conclude with some remarks on their practical value.
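A minimal response-surface (RSM) metamodel of the kind discussed can be fitted by ordinary least squares on a full second-order model; the design points and the "expensive" response below are invented stand-ins for a numerical experiment.

```python
import numpy as np

def quadratic_features(X):
    # Full second-order model in two variables: 1, x1, x2, x1^2, x2^2, x1*x2.
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 ** 2, x2 ** 2, x1 * x2])

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, size=(25, 2))                  # designed experiment (toy)
y = 2 + X[:, 0] - 0.5 * X[:, 1] + X[:, 0] * X[:, 1]   # "expensive" response (invented)

beta, *_ = np.linalg.lstsq(quadratic_features(X), y, rcond=None)
print("fitted coefficients:", np.round(beta, 3))
print("prediction at (0.2, -0.4):", quadratic_features(np.array([[0.2, -0.4]])) @ beta)
```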

5.
The purpose of model calibration is to make the model predictions closer to reality. The classical Kennedy–O’Hagan approach is widely used for model calibration, which can account for the inadequacy of the computer model while simultaneously estimating the unknown calibration parameters. In many applications, the phenomenon of censoring occurs when the exact outcome of the physical experiment is not observed, but is only known to fall within a certain region. In such cases, the Kennedy–O’Hagan approach cannot be used directly, and we propose a method to incorporate the censoring information when performing model calibration. The method is applied to study the compression phenomenon of liquid inside a bottle. The results show significant improvement over the traditional calibration methods, especially when the number of censored observations is large. Supplementary materials for this article are available online.
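The abstract does not give the formulation, but the core idea of incorporating censoring can be sketched with a censored Gaussian likelihood: exactly observed outcomes contribute density terms and censored outcomes contribute probability (CDF) terms. The toy simulator, detection limit, noise level, and the omission of the Kennedy–O’Hagan discrepancy term are all simplifying assumptions here.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize_scalar

def simulator(x, theta):
    # Hypothetical cheap computer model with one calibration parameter theta.
    return theta * np.sin(x)

def censored_neg_loglik(theta, x, y, censored, limit, sigma=0.2):
    mu = simulator(x, theta)
    ll_obs = norm.logpdf(y[~censored], loc=mu[~censored], scale=sigma)
    # A censored outcome is only known to lie below the detection limit.
    ll_cens = norm.logcdf(limit, loc=mu[censored], scale=sigma)
    return -(ll_obs.sum() + ll_cens.sum())

rng = np.random.default_rng(3)
x = np.linspace(0, np.pi, 40)
y_true = 1.5 * np.sin(x) + 0.2 * rng.standard_normal(40)
limit = 0.5
censored = y_true < limit
y = np.where(censored, limit, y_true)                 # censored values recorded at the limit

fit = minimize_scalar(censored_neg_loglik, bounds=(0.1, 5.0),
                      args=(x, y, censored, limit), method="bounded")
print("calibrated theta:", fit.x)
```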

6.
The Gaussian process (GP) model provides a powerful methodology for calibrating a computer model in the presence of model uncertainties. However, if the data contain systematic experimental errors, then the GP model can lead to an unnecessarily complex adjustment of the computer model. In this work, we introduce an adjustment procedure that brings the computer model closer to the data by making minimal changes to it. This is achieved by applying a lasso-based variable selection on the systematic experimental error terms while fitting the GP model. Two real examples and simulations are presented to demonstrate the advantages of the proposed approach. This article has supplementary material available online.
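A conceptual sketch of just the variable-selection step: regress the field-minus-model residuals on candidate systematic-error indicator columns with a lasso penalty, so that only the error terms needed to reconcile model and data receive nonzero coefficients. The joint GP fitting used in the paper is not reproduced, and the batches, data, and penalty value are invented.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(4)
n = 60
model_output = np.sin(np.linspace(0, 3, n))            # computer model at the field settings
block = np.repeat([0, 1, 2], n // 3)                   # three experimental batches (assumed)
Z = np.column_stack([(block == b).astype(float) for b in range(3)])  # candidate error terms

# Field data = model + a systematic offset in batch 2 only + noise (toy truth).
field = model_output + Z @ np.array([0.0, 0.6, 0.0]) + 0.05 * rng.standard_normal(n)

resid = field - model_output
fit = Lasso(alpha=0.05).fit(Z, resid)
print("selected systematic error effects:", np.round(fit.coef_, 3))
```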

7.
In the past two decades, more and more quality and reliability activities have been moving into the design of product and process. The design and analysis of computer experiments, as a new frontier of the design of experiments, has become increasingly popular among modern companies for optimizing product and process conditions and producing high-quality yet low-cost products and processes. This article focuses on constructing cheap metamodels as alternatives to expensive computer simulators and proposes a new metamodeling method based on the Gaussian stochastic process model, or Gaussian kriging. Rather than a constant mean as in ordinary kriging or a fixed mean function as in universal kriging, the new method captures the overall trend of the performance characteristics of products and processes through a more accurate mean, by efficiently incorporating sparseness-prior-based Bayesian inference into kriging. Meanwhile, the mean model is able to adaptively exclude the unimportant effects that deteriorate prediction performance. The results of empirical applications demonstrate that, compared with several benchmark methods in the literature, the proposed Bayesian method is not only much more effective in approximation but also very efficient in implementation, and hence more appropriate than the widely used ordinary kriging for real-world empirical applications.

8.
Risk indicators can provide useful input to risk management processes and are given increased attention in the Norwegian petroleum industry. Examples include indicators expressing the proportion of test failures of safety and barrier systems. Such indicators give valuable information about the performance of the systems and provide a basis for trend evaluations. Early warning of a possible deterioration is essential due to the importance of the systems in focus, but what should be the basis for the warning criterion? This paper presents and discusses several Bayesian approaches for the establishment of a warning criterion to disclose significant deterioration. The Norwegian petroleum industry is the starting point for this paper, but the study is relevant for other application areas as well.
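One simple Bayesian warning rule of the kind discussed: place a Beta prior on the test-failure probability, update it with the observed failures, and warn when the posterior probability that the failure probability exceeds a target level passes a threshold. The prior, target level, and threshold below are assumptions chosen only for illustration.

```python
from scipy.stats import beta

def warn(failures, tests, a0=1.0, b0=9.0, target=0.10, threshold=0.95):
    # Posterior for the failure probability p under a Beta(a0, b0) prior.
    post = beta(a0 + failures, b0 + tests - failures)
    prob_deteriorated = post.sf(target)          # P(p > target | data)
    return prob_deteriorated, prob_deteriorated > threshold

print(warn(failures=3, tests=12))   # moderate posterior mass above the target level
print(warn(failures=6, tests=12))   # most posterior mass above the target -> warning
```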

9.
A new kriging predictor is proposed that gives better performance than the existing predictor when the constant mean assumption in the kriging model is unreasonable. Moreover, it appears to be robust to misspecification of the correlation parameters. The advantages of the new predictor are demonstrated using some examples from the computer experiment literature.

10.
Response surface methodologies can reveal important features of complex computer code models. Here, we suggest experimental designs and interpolation methods for extracting nonlinear response surfaces whose roughness varies substantially over the input domain. A sequential design algorithm for cuboid domains is initiated by selecting an extended corner/centre point design for the entire domain, then updated by decomposing this domain into disjoint cuboids and taking the corners and centre of these cuboids as new design points. A roughness criterion is used to control the domain decomposition so that the design becomes space-filling and the coverage is particularly good in the parts of the input domain where the response surface is strongly nonlinear. Finally, the model output at untried inputs is predicted by carefully selecting a local neighbourhood of each new point in the input space and fitting a full quadratic polynomial to the data points in that neighbourhood. Test runs showed that our sequential design algorithm automatically adapts to the nonlinear features of the model output. Moreover, our technique is particularly useful for extracting nonlinear response surfaces from computer code models with two to seven input variables. A simple modification of the outlined algorithm enables adequate handling of non-cuboid input domains.
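The prediction step described at the end, fitting a full quadratic polynomial to a local neighbourhood of each new input, can be sketched as follows; the design, the neighbourhood size k, and the test function are invented, and the sequential domain-decomposition part of the algorithm is not shown.

```python
import numpy as np

def quad_features(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

def local_quadratic_predict(X, y, xnew, k=12):
    # Fit a full quadratic to the k nearest design points and evaluate it at xnew.
    d = np.linalg.norm(X - xnew, axis=1)
    idx = np.argsort(d)[:k]
    beta, *_ = np.linalg.lstsq(quad_features(X[idx]), y[idx], rcond=None)
    return quad_features(xnew[None, :]) @ beta

rng = np.random.default_rng(5)
X = rng.uniform(0, 1, size=(80, 2))                   # stand-in for the adaptive design
y = np.exp(-3 * X[:, 0]) * np.cos(8 * X[:, 1])        # rough-in-places toy response
print(local_quadratic_predict(X, y, np.array([0.3, 0.7])))
```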

11.
The design of an experiment can always be considered at least implicitly Bayesian, with prior knowledge used informally to aid decisions such as the variables to be studied and the choice of a plausible relationship between the explanatory variables and measured responses. Bayesian methods allow uncertainty in these decisions to be incorporated into design selection through prior distributions that encapsulate information available from scientific knowledge or previous experimentation. Further, a design may be explicitly tailored to the aim of the experiment through a decision-theoretic approach using an appropriate loss function. We review the area of decision-theoretic Bayesian design, with particular emphasis on recent advances in computational methods. For many problems arising in industry and science, experiments result in a discrete response that is well described by a member of the class of generalized linear models. Bayesian design for such nonlinear models is often seen as impractical as the expected loss is analytically intractable and numerical approximations are usually computationally expensive. We describe how Gaussian process emulation, commonly used in computer experiments, can play an important role in facilitating Bayesian design for realistic problems. A main focus is the combination of Gaussian process regression to approximate the expected loss with cyclic descent (coordinate exchange) optimization algorithms to allow optimal designs to be found for previously infeasible problems. We also present the first optimal design results for statistical models formed from dimensional analysis, a methodology widely employed in the engineering and physical sciences to produce parsimonious and interpretable models. Using the famous paper helicopter experiment, we show the potential for the combination of Bayesian design, generalized linear models, and dimensional analysis to produce small but informative experiments.

12.
We calibrate a stochastic computer simulation model of “moderate” computational expense. The simulator is an imperfect representation of reality, and we recognize this discrepancy to ensure a reliable calibration. The calibration model combines a Gaussian process emulator of the likelihood surface with importance sampling. Changing the discrepancy specification changes only the importance weights, which lets us investigate sensitivity to different discrepancy specifications at little computational cost. We present a case study of a natural history model that has been used to characterize UK bowel cancer incidence. Datasets and computer code are provided as supplementary material.
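The cheap sensitivity analysis rests on a standard self-normalised importance-sampling identity: if only the discrepancy specification changes, the parameter draws can be kept and only the weights recomputed. A minimal sketch, with an invented log-likelihood standing in for the emulated likelihood surface:

```python
import numpy as np

rng = np.random.default_rng(6)
theta = rng.normal(size=(1000, 2))                    # draws from the proposal (e.g. prior)

def log_lik(theta, disc_sd):
    # Stand-in for the emulated log-likelihood under a given discrepancy spec (invented).
    return -0.5 * np.sum((theta - 0.5) ** 2, axis=1) / (0.3 ** 2 + disc_sd ** 2)

def posterior_mean(disc_sd):
    logw = log_lik(theta, disc_sd)
    w = np.exp(logw - logw.max())                     # self-normalised importance weights
    w /= w.sum()
    return w @ theta                                  # reweighted posterior mean

for disc_sd in [0.0, 0.2, 0.5]:                       # vary the discrepancy specification
    print(disc_sd, posterior_mean(disc_sd))           # same draws, only the weights change
```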

13.
Monte Carlo simulation plays a significant role in mechanical and structural analysis due to its versatility and accuracy. The classical spectral representation method is based on direct decomposition of the power spectral density (PSD) or evolutionary power spectral density (EPSD) matrix through Cholesky decomposition. This direct decomposition of a complex matrix usually results in long computation times and large memory requirements. In this study, a new formulation of the Cholesky decomposition for the EPSD/PSD matrix and a corresponding simulation scheme are presented. The key idea of this approach is to separate the phase from the complex EPSD/PSD matrix. The resulting real modulus matrix markedly speeds up the decomposition compared with direct Cholesky decomposition of the complex EPSD/PSD matrix, and in the proposed simulation scheme the separated phase can be easily reassembled. The modulus of the EPSD/PSD matrix can be further decomposed into the modulus of the coherence matrix (the lagged coherence matrix), which describes the basic coherence structure of the stochastic process. The lagged coherence matrix is independent of time and thus remarkably improves the efficiency of the Cholesky decomposition. The application of the proposed schemes to Gaussian stochastic simulation is presented. First, a previous closed-form wind speed simulation algorithm for equally spaced locations is extended to a more general situation. Second, the proposed approach facilitates the use of interpolation techniques in stochastic simulation; their application to wind field simulation is studied as an example.
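For orientation, here is a heavily simplified spectral-representation sketch for a bivariate Gaussian process: a real-valued PSD matrix is Cholesky-factorised at each frequency and cosines with independent random phases are superposed. The PSD model, frequency grid, and constants follow one common convention and are assumptions; the paper's phase-separated, interpolation-accelerated scheme is not reproduced.

```python
import numpy as np

def simulate_bivariate(psd, freqs, t, rng):
    # Spectral representation: Cholesky-factorise the (here real) PSD matrix at every
    # frequency and superpose cosines with independent random phases per component.
    dw = freqs[1] - freqs[0]
    f = np.zeros((2, len(t)))
    phi = rng.uniform(0, 2 * np.pi, size=(2, len(freqs)))   # random phase angles
    for k, w in enumerate(freqs):
        H = np.linalg.cholesky(psd(w))                       # lower-triangular factor
        for j in range(2):
            for m in range(j + 1):
                f[j] += 2 * H[j, m] * np.sqrt(dw) * np.cos(w * t + phi[m, k])
    return f

def psd(w, a=1.0):
    # Invented smooth 2x2 PSD matrix, positive definite at every frequency.
    s = a / (1 + w ** 2)
    return np.array([[s, 0.6 * s], [0.6 * s, s]])

rng = np.random.default_rng(7)
freqs = np.linspace(0.01, 4.0, 400)
t = np.linspace(0, 60, 600)
sample = simulate_bivariate(psd, freqs, t, rng)
print(sample.shape, sample.std(axis=1))
```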

14.
A control chart can effectively reflect whether a manufacturing process is currently in control. The calculation of control limits has traditionally relied on a frequentist approach, which requires a large sample size for accurate estimation. A conjugate Bayesian approach is introduced to correct the error in control limits calculated with the traditional frequentist approach in multi-batch, low-volume production. Bartlett's test, an analysis-of-variance test, and a standardisation treatment are used to construct a proper prior distribution in order to calculate the Bayes estimators of the process distribution parameters for the control limits. The case study indicates that this conjugate Bayesian approach performs better than the traditional frequentist approach when the sample size is small.
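One common conjugate setup for this kind of small-sample correction is a Normal-Inverse-Gamma prior on the process mean and variance, updated in closed form and then plugged into X-bar limits. The hyperparameter values and subgroup size below are assumptions, not the paper's choices.

```python
import numpy as np

def bayes_xbar_limits(y, subgroup_n, m0=0.0, k0=1.0, a0=2.0, b0=1.0):
    # Conjugate Normal-Inverse-Gamma update for (mu, sigma^2) from a small sample y.
    n, ybar = len(y), np.mean(y)
    kn = k0 + n
    mn = (k0 * m0 + n * ybar) / kn
    an = a0 + n / 2
    bn = b0 + 0.5 * np.sum((y - ybar) ** 2) + k0 * n * (ybar - m0) ** 2 / (2 * kn)
    sigma2_hat = bn / (an - 1)                       # posterior mean of sigma^2
    half_width = 3 * np.sqrt(sigma2_hat / subgroup_n)
    return mn - half_width, mn, mn + half_width      # LCL, centre line, UCL

rng = np.random.default_rng(8)
small_sample = 10 + 0.5 * rng.standard_normal(15)    # low-volume production data (toy)
print(bayes_xbar_limits(small_sample, subgroup_n=5, m0=10.0, k0=2.0, a0=3.0, b0=0.5))
```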

15.
A Bayesian analogue of the Shewhart X-bar chart is defined and compared with cumulative sum charts. The comparison identifies types of production process where the Bayesian chart has better expected performance than the cumulative sum chart. Implementing the Bayesian chart requires more detailed knowledge of the process structure than is required by the best-known types of charts, but acquiring this information can yield tangible benefits.

16.
The failure pattern of repairable mechanical equipment subject to deterioration phenomena sometimes shows a finite bound for the increasing failure intensity. A non-homogeneous Poisson process with bounded increasing failure intensity is illustrated and its characteristics are discussed. A Bayesian procedure, based on prior information on model-free quantities, is developed so that technical information on the failure process can be incorporated into the inferential procedure and the inference accuracy improved. Posterior estimation of the model-free quantities and of other quantities of interest (such as the optimal replacement interval) is provided, together with predictions of the waiting time to the next failure and of the number of failures in a future time interval. Finally, numerical examples illustrate the proposed inferential procedure.
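The abstract does not state the intensity form or the priors; as a sketch, take a bounded increasing intensity λ(t) = a(1 − e^(−bt)), whose limit a is the finite bound, and compute a grid posterior for (a, b) from a toy failure history under flat priors. The failure times, grids, and priors are all invented.

```python
import numpy as np

def log_lik_nhpp(a, b, times, T):
    # NHPP log-likelihood with bounded increasing intensity lambda(t) = a * (1 - exp(-b t)).
    lam = a * (1 - np.exp(-b * times))
    cum = a * (T - (1 - np.exp(-b * T)) / b)          # integral of lambda over (0, T]
    return np.sum(np.log(lam)) - cum

# Toy failure history of a deteriorating repairable unit (invented).
times = np.array([5.2, 9.8, 13.1, 16.0, 18.4, 20.1, 22.7, 24.0])
T = 25.0

a_grid = np.linspace(0.05, 2.0, 200)
b_grid = np.linspace(0.01, 1.0, 200)
logpost = np.array([[log_lik_nhpp(a, b, times, T) for b in b_grid] for a in a_grid])
post = np.exp(logpost - logpost.max())
post /= post.sum()                                    # flat priors -> normalised posterior

a_mean = (post.sum(axis=1) * a_grid).sum()
b_mean = (post.sum(axis=0) * b_grid).sum()
print("posterior means:", a_mean, b_mean)             # inputs to e.g. a replacement-interval rule
```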

17.
Statistical intervals, properly calculated from sample data, are likely to be substantially more informative to decision makers than a point estimate alone, and are often of greater interest to practitioners and management than statistical significance or hypothesis tests. Wolfinger (1998, J Qual Technol 36:162–170) presented a simulation-based approach for determining Bayesian tolerance intervals in a balanced one-way random effects model. In this note the theory and results of Wolfinger are extended to the balanced two-factor nested random effects model. An example illustrates the flexibility and unique features of the Bayesian simulation method for the construction of tolerance intervals.

18.
This study presents multi-level analyses for single- and multi-vehicle crashes on a mountainous freeway. Data from a 15-mile mountainous freeway section on I-70 were investigated. Both aggregate and disaggregate models for the two crash conditions were developed. Five years of crash data were used in the aggregate investigation, while the disaggregate models utilized one year of crash data along with real-time traffic and weather data. For the aggregate analyses, safety performance functions were developed for the purpose of revealing the contributing factors for each crash type. Two methodologies, a Bayesian bivariate Poisson-lognormal model and a Bayesian hierarchical Poisson model with correlated random effects, were estimated to simultaneously analyze the two crash conditions with consideration of possible correlations. In addition to factors related to geometric characteristics, two exposure parameters (annual average daily traffic and segment length) were included. Two different sets of significant explanatory and exposure variables were identified for the single-vehicle (SV) and multi-vehicle (MV) crashes. It was found that the Bayesian bivariate Poisson-lognormal model is superior to the Bayesian hierarchical Poisson model, the former having a substantially lower DIC and more significant variables. In addition to the aggregate analyses, microscopic real-time crash risk evaluation models were developed for the two crash conditions. Multi-level Bayesian logistic regression models were estimated, with random parameters accounting for seasonal variations and crash-unit-level diversity, and segment-level random effects capturing unobserved heterogeneity caused by the geometric characteristics. The model results indicate that the effects of the selected variables on crash occurrence vary across seasons and crash units, and that geometric characteristic variables contribute to the segment-level variation: the more unobserved heterogeneity is accounted for, the better the classification ability. Potential applications of the modeling results from both analysis approaches are discussed.

19.
Computer model calibration is the process of determining input parameter settings to a computational model that are consistent with physical observations. This is often quite challenging due to the computational demands of running the model. In this article, we use the ensemble Kalman filter (EnKF) for computer model calibration. The EnKF has proven effective in quantifying uncertainty in data assimilation problems such as weather forecasting and ocean modeling. We find that the EnKF can be directly adapted to Bayesian computer model calibration. It is motivated by the mean and covariance relationship between the model inputs and outputs, producing an approximate posterior ensemble of the calibration parameters. While this approach may not fully capture effects due to nonlinearities in the computer model response, its computational efficiency makes it a viable choice for exploratory analyses, design problems, or problems with large numbers of model runs, inputs, and outputs.
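A minimal sketch of the ensemble-Kalman-style parameter update described: draw an ensemble of calibration parameters, run the (here, toy) model for each member, and shift the ensemble using the sample cross-covariance between parameters and outputs together with perturbed observations. The simulator, observation, and noise level are invented.

```python
import numpy as np

def model(theta):
    # Hypothetical cheap computer model: two outputs driven by one calibration parameter.
    return np.array([np.sin(theta), theta ** 2])

rng = np.random.default_rng(9)
n_ens, obs_sd = 200, 0.1
y_obs = np.array([0.84, 1.05])                        # "physical observations" (invented)

theta = rng.normal(1.5, 1.0, size=n_ens)              # prior ensemble of the parameter
Y = np.array([model(t) for t in theta])               # ensemble of model outputs (n_ens x 2)

# Sample cross-covariance and output covariance drive the Kalman-style update.
C_ty = np.cov(theta, Y.T)[0, 1:]                      # cov(theta, each output)
C_yy = np.cov(Y.T) + obs_sd ** 2 * np.eye(2)
gain = C_ty @ np.linalg.inv(C_yy)

perturbed_obs = y_obs + obs_sd * rng.standard_normal((n_ens, 2))
theta_post = theta + (perturbed_obs - Y) @ gain        # approximate posterior ensemble

print("prior mean:", theta.mean(), " posterior mean:", theta_post.mean())
```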

20.
Urban expressway systems have been developed rapidly in recent years in China; it has become one key part of the city roadway networks as carrying large traffic volume and providing high traveling speed. Along with the increase of traffic volume, traffic safety has become a major issue for Chinese urban expressways due to the frequent crash occurrence and the non-recurrent congestions caused by them. For the purpose of unveiling crash occurrence mechanisms and further developing Active Traffic Management (ATM) control strategies to improve traffic safety, this study developed disaggregate crash risk analysis models with loop detector traffic data and historical crash data. Bayesian random effects logistic regression models were utilized as it can account for the unobserved heterogeneity among crashes. However, previous crash risk analysis studies formulated random effects distributions in a parametric approach, which assigned them to follow normal distributions. Due to the limited information known about random effects distributions, subjective parametric setting may be incorrect. In order to construct more flexible and robust random effects to capture the unobserved heterogeneity, Bayesian semi-parametric inference technique was introduced to crash risk analysis in this study. Models with both inference techniques were developed for total crashes; semi-parametric models were proved to provide substantial better model goodness-of-fit, while the two models shared consistent coefficient estimations. Later on, Bayesian semi-parametric random effects logistic regression models were developed for weekday peak hour crashes, weekday non-peak hour crashes, and weekend non-peak hour crashes to investigate different crash occurrence scenarios. Significant factors that affect crash risk have been revealed and crash mechanisms have been concluded.  相似文献   
