Similar Literature
1.
Computer experiments with qualitative and quantitative factors occur frequently in various applications in science and engineering. The analysis of such experiments is not yet fully resolved. In this work, we propose an additive Gaussian process model for computer experiments with qualitative and quantitative factors. The proposed method considers an additive correlation structure across the qualitative factors and assumes that, within each additive term, the correlation function for the qualitative factor and the correlation function of the quantitative factors are multiplicative. It retains the flexibility of an unrestricted correlation structure for qualitative factors by using the hypersphere decomposition, offering greater flexibility for modeling the complex systems studied in computer experiments. The merits of the proposed method are illustrated by several numerical examples and a real data application. Supplementary materials for this article are available online.
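
To make the correlation structure concrete, here is a minimal sketch in Python (a toy illustration, not the authors' code; the three-level factors, the Gaussian correlation for the quantitative inputs, and the equal-weight scaling are assumptions): one qualitative-factor correlation matrix is built via the hypersphere decomposition, and the factor terms are combined additively, each multiplied by its own quantitative correlation.

```python
import numpy as np

def gaussian_corr(x1, x2, theta):
    """Gaussian correlation between quantitative input vectors x1 and x2."""
    return np.exp(-np.sum(theta * (np.asarray(x1) - np.asarray(x2)) ** 2))

def hypersphere_corr(angles):
    """Unrestricted 3x3 correlation matrix T = L L^T for a three-level qualitative
    factor, parameterized by hypersphere angles so T is always positive definite."""
    a0, a1, a2 = angles
    L = np.array([[1.0, 0.0, 0.0],
                  [np.cos(a0), np.sin(a0), 0.0],
                  [np.cos(a1), np.sin(a1) * np.cos(a2), np.sin(a1) * np.sin(a2)]])
    return L @ L.T

def additive_corr(x1, z1, x2, z2, T_list, theta_list):
    """Additive structure: sum over qualitative factors of
    (qualitative correlation) x (quantitative correlation)."""
    terms = [T[z1[j], z2[j]] * gaussian_corr(x1, x2, theta_list[j])
             for j, T in enumerate(T_list)]
    return sum(terms) / len(terms)   # scaled so the value is at most one

# toy usage: two three-level qualitative factors and one quantitative input
T1 = hypersphere_corr([0.7, 1.1, 0.4])
T2 = hypersphere_corr([1.2, 0.5, 0.9])
k = additive_corr([0.2], [0, 1], [0.6], [2, 1],
                  [T1, T2], [np.array([4.0]), np.array([1.5])])
```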

2.
Sequential experiment design strategies have been proposed for efficiently augmenting initial designs to solve many problems of interest to computer experimenters, including optimization, contour and threshold estimation, and global prediction. We focus on batch sequential design strategies for achieving maturity in global prediction of discrepancy inferred from computer model calibration. Predictive maturity focuses on adding field experiments to efficiently improve discrepancy inference. Several design criteria are extended to allow batch augmentation, including integrated and maximum mean square error, maximum entropy, and two expected improvement criteria. In addition, batch versions of maximin distance and weighted distance criteria are developed. Two batch optimization algorithms are considered: modified Fedorov exchange and a binning methodology motivated by optimizing augmented fractional factorial skeleton designs.

3.
The Gaussian process (GP) is a popular method for emulating deterministic computer simulation models. Its natural extension to computer models with multivariate outputs employs a multivariate Gaussian process (MGP) framework. Nevertheless, with a significant increase in the number of design points and the number of model parameters, building an MGP model is a very challenging task. Under a general MGP model framework with nonseparable covariance functions, we propose an efficient meta-modeling approach featuring a pairwise model building scheme. The proposed method has excellent scalability even for a large number of output levels. Some properties of the proposed method have been investigated, and its performance has been demonstrated through several numerical examples. Supplementary materials for this article are available online.

4.
For deterministic computer simulations, Gaussian process models are a standard procedure for fitting data. These models can be used only when the study design avoids having replicated points. This characteristic is also desirable for one-dimensional projections of the design, since it may happen that one of the design factors has a strongly nonlinear effect on the response. Latin hypercube designs have uniform one-dimensional projections but are not efficient for fitting low-order polynomials when there is a small error variance. D-optimal designs are very efficient for polynomial fitting but have substantial replication in projections. We propose a new class of designs that bridge the gap between D-optimal designs and D-optimal Latin hypercube designs. These designs guarantee a minimum distance between points in any one-dimensional projection, allowing for the fit of either polynomial or Gaussian process models. Subject to this constraint, they are D-optimal for a prespecified model.
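
As a rough illustration of the constraint (a sketch under assumed details — two factors, a full quadratic model, and a spacing threshold delta — not the authors' construction algorithm), one can score candidate designs by D-optimality while rejecting any design whose one-dimensional projections contain points closer than delta:

```python
import numpy as np

def min_projection_gap(D):
    """Smallest gap between points in any one-dimensional projection of design D (n x d)."""
    gaps = []
    for col in D.T:
        s = np.sort(col)
        gaps.append(np.min(np.diff(s)))
    return min(gaps)

def d_criterion(D):
    """log-determinant of the information matrix for a full quadratic model in two factors."""
    x1, x2 = D[:, 0], D[:, 1]
    X = np.column_stack([np.ones(len(D)), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])
    return np.linalg.slogdet(X.T @ X)[1]

def feasible_score(D, delta=0.05):
    """Return -inf for designs violating the projection constraint, else the D-criterion."""
    return d_criterion(D) if min_projection_gap(D) >= delta else -np.inf
```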

5.
Computer experiments have received a great deal of attention in many fields of science and technology. Most literature assumes that all the input variables are quantitative. However, researchers often encounter computer experiments involving both qualitative and quantitative variables (BQQV). In this article, a new interface between design and analysis for computer experiments with BQQV is proposed. The new designs are a kind of sliced Latin hypercube design with points clustered in the design region, and they possess good uniformity within each slice. For computer experiments with BQQV, such designs help to measure the similarities among responses of different level-combinations in the qualitative variables. An adaptive analysis strategy intended for the proposed designs is developed. The proposed strategy allows us to automatically extract information from useful auxiliary responses to increase the precision of prediction for the target response. The interface between the proposed design and the analysis strategy is demonstrated to be effective via simulation and a real-life example from the food engineering literature. Supplementary materials for this article are available online.

6.
Technometrics, 2013, 55(4): 527-541
Computer simulation is often used to study complex physical and engineering processes. Although a computer simulator often can be viewed as an inexpensive way to gain insight into a system, it still can be computationally costly. Much of the recent work on the design and analysis of computer experiments has focused on scenarios where the goal is response surface fitting or process optimization. In this article we develop a sequential methodology for estimating a contour from a complex computer code. The approach uses a stochastic process model as a surrogate for the computer simulator. The surrogate model and associated uncertainty are key components in a new criterion used to identify the computer trials aimed specifically at improving the contour estimate. The proposed approach is applied to exploration of a contour for a network queuing system. Issues related to practical implementation of the proposed approach are also addressed.
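
To make the idea of a contour-oriented criterion concrete, here is a generic hedged sketch (this band-probability score is an illustrative stand-in, not necessarily the criterion developed in the article): candidates are ranked by the GP posterior probability that the simulator output falls near the target contour level.

```python
import numpy as np
from scipy.stats import norm

def contour_score(mean, sd, a, eps=0.05):
    """Posterior probability that the simulator output falls in [a - eps, a + eps],
    given the surrogate's predictive mean and standard deviation at a candidate input."""
    return norm.cdf((a + eps - mean) / sd) - norm.cdf((a - eps - mean) / sd)

# candidates with the highest score would be the next simulator runs to evaluate
```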

7.
Sliced Latin hypercube designs (SLHDs) have important applications in designing computer experiments with continuous and categorical factors. However, a randomly generated SLHD can be poor in terms of space-filling, and based on the existing construction method that generates the SLHD column by column using sliced permutation matrices, it is also difficult to search for the optimal SLHD. In this article, we develop a new construction approach that first generates a small Latin hypercube design in each slice and then arranges these small designs together to form the SLHD. The new approach is intuitive and can be easily adapted to generate orthogonal SLHDs and orthogonal array-based SLHDs. More importantly, it enables us to develop general algorithms that can search for the optimal SLHD efficiently.
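
The slice-by-slice idea can be sketched as follows (an illustrative simplification, not the paper's exact algorithm or its optimality search): generate a small LHD per slice, then expand each small level into distinct full-scale levels so that the combined design is itself a Latin hypercube.

```python
import numpy as np

rng = np.random.default_rng(0)

def small_lhd(m, d):
    """Random m-run Latin hypercube with levels 0..m-1 in each of d columns."""
    return np.column_stack([rng.permutation(m) for _ in range(d)])

def sliced_lhd(t, m, d):
    """Stack t small LHDs into a (t*m)-run design whose union is itself a Latin hypercube:
    within each column, small level l of slice k is mapped to a distinct full level in
    {l*t, ..., l*t + t - 1}."""
    slices = [small_lhd(m, d) for _ in range(t)]
    full = np.empty((t * m, d), dtype=int)
    for j in range(d):
        assign = np.vstack([rng.permutation(t) for _ in range(m)])  # row l: slice order
        for k, S in enumerate(slices):
            for i in range(m):
                l = S[i, j]
                full[k * m + i, j] = l * t + assign[l, k]
    return full, slices

design, small_designs = sliced_lhd(t=3, m=4, d=2)   # 12 runs in 3 slices of 4 runs each
```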

8.
In the past two decades, more and more quality and reliability activities have been moving into the design of product and process. The design and analysis of computer experiments, as a new frontier of the design of experiments, has become increasingly popular among modern companies for optimizing product and process conditions and producing high-quality yet low-cost products and processes. This article mainly focuses on the issue of constructing cheap metamodels as alternatives to the expensive computer simulators and proposes a new metamodeling method on the basis of the Gaussian stochastic process model or Gaussian Kriging. Rather than a constant mean as in ordinary Kriging or a fixed mean function as in universal Kriging, the new method captures the overall trend of the performance characteristics of products and processes through a more accurate mean, by efficiently incorporating a scheme of sparseness prior-based Bayesian inference into Kriging. Meanwhile, the mean model is able to adaptively exclude the unimportant effects that deteriorate the prediction performance. The results of an experiment on empirical applications demonstrate that, compared with several benchmark methods in the literature, the proposed Bayesian method is not only much more effective in approximation but also very efficient in implementation, hence more appropriate than the widely used ordinary Kriging for empirical applications in the real world. Copyright © 2011 John Wiley & Sons, Ltd.

9.
Computer models of physical systems are often written based on known theory or “first principles” of a system, reflecting substantial knowledge of each component or subsystem, but also the need to use a numerical approach to mimic the more complex behavior of the entire system of interest. However, in some cases, there is insufficient known theory to encode all necessary aspects of the system, and empirical studies are required to generate approximate functional forms. We consider the question of how a physical experiment might be designed to approximate one module or subroutine of a computer model that can otherwise be written from first principles. The concept of preposterior analysis is used to suggest an approach to generating a kind of I-optimal design for this purpose, when the remainder of the computer model is a composition of nonlinear functions that can be directly evaluated as part of the design process. Extensions are then described for situations in which one or more known components must themselves be approximated by metamodels due to the large number of evaluations needed, and for computer models that have iterative structure. A simple “toy” model is used to demonstrate the ideas. Online supplementary material accompanies this article.

10.
A single-index model (SIM) provides for parsimonious multidimensional nonlinear regression by combining parametric (linear) projection with univariate nonparametric (nonlinear) regression models. We show that a particular Gaussian process (GP) formulation is simple to work with and ideal as an emulator for some types of computer experiment as it can outperform the canonical separable GP regression model commonly used in this setting. Our contribution focuses on drastically simplifying, reinterpreting, and then generalizing a recently proposed fully Bayesian GP-SIM combination. Favorable performance is illustrated on synthetic data and a real-data computer experiment. Two R packages, both released on CRAN, have been augmented to facilitate inference under our proposed model(s).

11.
This article is motivated by a computer experiment conducted for optimizing residual stresses in the machining of metals. Although kriging is widely used in the analysis of computer experiments, it cannot be easily applied to model the residual stresses because they are obtained as a profile. The high dimensionality caused by this functional response introduces severe computational challenges in kriging. It is well known that if the functional data are observed on a regular grid, the computations can be simplified using an application of Kronecker products. However, the case of an irregular grid is quite complex. In this article, we develop a Gibbs sampling-based expectation maximization algorithm, which converts the irregularly spaced data into a regular grid so that the Kronecker product-based approach can be employed for efficiently fitting a kriging model to the functional data. Supplementary materials are available online.
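
The Kronecker shortcut mentioned above can be illustrated with a short sketch (the notation R_t, R_x for the grid's two correlation factors is an assumption, and this is not the article's code): on a regular grid the correlation matrix factors as a Kronecker product, and the large linear solve reduces to two small ones.

```python
import numpy as np

def kron_solve(R_t, R_x, Y):
    """Solve (R_t kron R_x) vec(Z) = vec(Y) without forming the Kronecker product,
    using (A kron B) vec(Z) = vec(B Z A^T), so Z = R_x^{-1} Y R_t^{-1} for symmetric R_t."""
    Z = np.linalg.solve(R_x, Y)          # left-multiply by R_x^{-1}
    return np.linalg.solve(R_t, Z.T).T   # right-multiply by R_t^{-1}

# quick check against the explicit Kronecker solve on a small grid
rng = np.random.default_rng(0)
A = 0.3 * rng.random((3, 3))
B = 0.3 * rng.random((4, 4))
R_t = np.eye(3) + (A + A.T) / 2
R_x = np.eye(4) + (B + B.T) / 2
Y = rng.random((4, 3))
direct = np.linalg.solve(np.kron(R_t, R_x), Y.flatten(order="F")).reshape(4, 3, order="F")
assert np.allclose(kron_solve(R_t, R_x, Y), direct)
```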

12.
Tuning and calibration are processes for improving the representativeness of a computer simulation code to a physical phenomenon. This article introduces a statistical methodology for simultaneously determining tuning and calibration parameters in settings where data are available from a computer code and the associated physical experiment. Tuning parameters are set by minimizing a discrepancy measure, while the distribution of the calibration parameters is determined based on a hierarchical Bayesian model. The proposed Bayesian model views the output as a realization of a Gaussian stochastic process with hyper-priors. Draws from the resulting posterior distribution are obtained by Markov chain Monte Carlo simulation. Our methodology is compared with an alternative approach in examples and is illustrated in a biomechanical engineering application. Supplemental materials, including the software and a user manual, are available online and can be requested from the first author.

13.
14.
Profile monitoring is often conducted when the product quality is characterized by profiles. Although existing methods almost exclusively deal with univariate profiles, observations of multivariate profile data are increasingly encountered in practice. These data are seldom analyzed in the area of statistical process control due to a lack of effective modeling tools. In this article, we propose to analyze them using the multivariate Gaussian process model, which offers a natural way to accommodate both within-profile and between-profile correlations. To mitigate the prohibitively high computational cost of building such models, a pairwise estimation strategy is adopted. Asymptotic normality of the parameter estimates from this approach has been established. Comprehensive simulation studies are conducted. In the case study, the method is demonstrated using transmittance profiles from low-emittance glass. Supplementary materials for this article are available online.

15.
Global sensitivity analysis of complex numerical models can be performed by calculating variance-based importance measures of the input variables, such as the Sobol indices. However, these techniques, which require a large number of model evaluations, are often impractical for time-expensive computer codes. A well-known and widely used remedy is to replace the computer code with a metamodel, which predicts the model responses with negligible computation time and makes the estimation of Sobol indices straightforward. In this paper, we focus on the Gaussian process model, which gives analytical expressions for the Sobol indices. Two approaches are studied to compute the Sobol indices: the first is based on the predictor of the Gaussian process model and the second on the global stochastic process model. Comparisons between the two estimates, made on analytical examples, show the superiority of the second approach in terms of convergence and robustness. Moreover, the second approach can account for the modeling error of the Gaussian process model by directly providing confidence intervals on the Sobol indices. These techniques are finally applied to a real case of hydrogeological modeling.
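
For the first approach (plugging the GP predictor into a Monte Carlo estimator), a minimal pick-freeze sketch looks as follows; the estimator and the unit-hypercube input distribution are assumptions for illustration, not the paper's exact scheme.

```python
import numpy as np

rng = np.random.default_rng(1)

def first_order_sobol(predict, d, n=10000):
    """First-order Sobol indices of predict: R^d -> R on [0, 1]^d via pick-freeze:
    S_i ~ Cov(f(A), f(AB_i)) / Var(f), with AB_i equal to B except column i from A."""
    A = rng.random((n, d))
    B = rng.random((n, d))
    yA = predict(A)
    var_y = np.var(yA)
    S = np.empty(d)
    for i in range(d):
        ABi = B.copy()
        ABi[:, i] = A[:, i]               # freeze input i at its value in A
        S[i] = np.cov(yA, predict(ABi))[0, 1] / var_y
    return S

# in practice `predict` would be the posterior mean of a fitted Gaussian process model
```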

16.
A mixture experiment is characterized by having two or more inputs that are specified as a percentage contribution to a total amount of material. In such situations, the input variables are correlated because they must sum to one. Consequently, additional care must be taken when fitting statistical models or visualizing the effect of one or more inputs on the response. In this article, we consider the use of a Gaussian process to model the output from a computer simulator taking a mixture input. We introduce a procedure to perform global sensitivity analysis of the code output providing main effects and revealing interactions. The resulting methodology is illustrated using a function with analytically tractable results for comparison, a chemical compositional simulator, and a physical experiment. Supplementary materials providing assistance with implementing this methodology are available online.
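
A toy sketch of the emulation step (the simulator, the Dirichlet design, and the scikit-learn kernel choices are assumptions, not the article's setup) fits a GP to a three-component mixture input whose proportions always sum to one:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(2)

def toy_mixture_code(W):
    """Toy deterministic simulator whose inputs are mixture proportions (rows sum to 1)."""
    return np.sin(np.pi * W[:, 0]) + W[:, 1] ** 2 + 0.5 * W[:, 0] * W[:, 2]

W_train = rng.dirichlet(alpha=np.ones(3), size=40)   # 40 design points on the simplex
y_train = toy_mixture_code(W_train)

gp = GaussianProcessRegressor(kernel=RBF(length_scale=[0.3, 0.3, 0.3]), normalize_y=True)
gp.fit(W_train, y_train)

W_test = rng.dirichlet(alpha=np.ones(3), size=5)
mean, sd = gp.predict(W_test, return_std=True)       # predictions respect the sum-to-one inputs
```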

17.
Computer experiments based on mathematical models are powerful tools for understanding physical processes. This article addresses the problem of kriging-based optimization for deterministic computer experiments with tunable accuracy. Our approach is to use multi-fidelity computer experiments with increasing accuracy levels and a nonstationary Gaussian process model. We propose an optimization scheme that sequentially adds new computer runs by following two criteria. The first criterion, called EQI, scores candidate inputs at a given level of accuracy, and the second criterion, called EQIE, scores candidate combinations of inputs and accuracy. In simulation results and a real example using finite element analysis, our method outperforms the expected improvement (EI) criterion designed for single-accuracy experiments. Supplementary materials for this article are available online.

18.
The calibration of computer models using physical experimental data has received considerable interest in the last decade. Recently, multiple works have addressed the functional calibration of computer models, where the calibration parameters are functions of the observable inputs rather than taking a set of fixed values as traditionally treated in the literature. While much of the recent work on functional calibration has focused on estimation, sequential design for functional calibration remains an open question. Addressing this sequential design issue is the focus of this article. We investigate different sequential design approaches and show that the simple separate design approach has merit in practical use when designing for functional calibration. Analysis is carried out on multiple simulated and real-world examples.

19.
Robust parameter design with computer experiments is becoming increasingly important for product design. Existing methodologies for this problem are mostly for finding optimal control factor settings. However, in some cases, the objective of the experimenter may be to understand how the noise and control factors contribute to variation in the response. The functional analysis of variance (ANOVA) and variance decompositions of the response, in addition to the mean and variance models, help achieve this objective. Estimation of these quantities is not easy, and few methods are able to quantify the estimation uncertainty. In this article, we show that the use of an orthonormal polynomial model of the simulator leads to simple formulas for the functional ANOVA and variance decompositions, as well as the mean and variance models. We show that estimation uncertainty can be taken into account in a simple way by first fitting a Gaussian process model to the experimental data and then approximating it with the orthonormal polynomial model. This leads to a joint normal distribution for the polynomial coefficients that quantifies estimation uncertainty. Supplementary materials for this article are available online.
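
The orthonormal-polynomial idea can be illustrated with a small sketch (assuming two independent inputs uniform on [-1, 1] and a tensor Legendre basis; this is not the authors' implementation): once an emulator is approximated in an orthonormal basis, each squared coefficient is a variance contribution in the functional ANOVA.

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(3)

def orthonormal_legendre(x, k):
    """Degree-k Legendre polynomial rescaled to unit variance under U(-1, 1)."""
    c = np.zeros(k + 1)
    c[k] = 1.0
    return legendre.legval(x, c) * np.sqrt(2 * k + 1)

def anova_from_surrogate(predict, degree=3, n=2000):
    """Fit a tensor Legendre basis to emulator predictions for two inputs and return
    the variance contribution of each basis term (its squared coefficient)."""
    X = rng.uniform(-1.0, 1.0, size=(n, 2))
    degs = [(a, b) for a in range(degree + 1) for b in range(degree + 1)]
    Phi = np.column_stack([orthonormal_legendre(X[:, 0], a) * orthonormal_legendre(X[:, 1], b)
                           for a, b in degs])
    coef, *_ = np.linalg.lstsq(Phi, predict(X), rcond=None)
    contrib = dict(zip(degs, coef ** 2))
    contrib.pop((0, 0))                    # the constant term carries no variance
    return contrib                         # sums (approximately) to the total variance

# example: anova_from_surrogate(lambda X: X[:, 0] + X[:, 0] * X[:, 1])
```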

20.
We calibrate a stochastic computer simulation model of “moderate” computational expense. The simulator is an imperfect representation of reality, and we recognize this discrepancy to ensure a reliable calibration. The calibration model combines a Gaussian process emulator of the likelihood surface with importance sampling. Changing the discrepancy specification changes only the importance weights, which lets us investigate sensitivity to different discrepancy specifications at little computational cost. We present a case study of a natural history model that has been used to characterize UK bowel cancer incidence. Datasets and computer code are provided as supplementary material.
