20 similar references found (search time: 15 ms)
1.
Sequential experiment design strategies have been proposed for efficiently augmenting initial designs to solve many problems of interest to computer experimenters, including optimization, contour and threshold estimation, and global prediction. We focus on batch sequential design strategies for achieving maturity in global prediction of the discrepancy inferred from computer model calibration. Predictive maturity here means adding field experiments so as to efficiently improve discrepancy inference. Several design criteria are extended to allow batch augmentation, including integrated and maximum mean square error, maximum entropy, and two expected improvement criteria. In addition, batch versions of maximin distance and weighted distance criteria are developed. Two batch optimization algorithms are considered: modified Fedorov exchange and a binning methodology motivated by optimizing augmented fractional factorial skeleton designs.
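As a concrete illustration of one of the distance-based batch criteria above, here is a minimal sketch of greedy maximin-distance augmentation: follow-up points are drawn one at a time from a candidate pool, each maximizing its minimum distance to the design so far. The candidate pool, pool size, and unit-cube scaling are assumptions of the sketch, not details from the paper.

```python
# Hedged sketch: greedy batch augmentation under a maximin-distance
# criterion (one simple instance of the batch criteria discussed above).
import numpy as np
from scipy.stats import qmc
from scipy.spatial.distance import cdist

def augment_maximin(X_init, n_batch, n_cand=2000, seed=0):
    """Greedily pick n_batch points from a random candidate pool, each
    maximizing its minimum distance to all points chosen so far."""
    d = X_init.shape[1]
    cand = qmc.LatinHypercube(d=d, seed=seed).random(n_cand)
    design = X_init.copy()
    batch = []
    for _ in range(n_batch):
        dmin = cdist(cand, design).min(axis=1)  # distance to nearest design point
        k = int(np.argmax(dmin))                # most isolated candidate
        batch.append(cand[k])
        design = np.vstack([design, cand[k]])
    return np.array(batch)

X0 = qmc.LatinHypercube(d=2, seed=1).random(10)  # initial design
print(augment_maximin(X0, n_batch=5))            # one follow-up batch
```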
2.
Karen Kafadar, Technometrics, 2013, 55(1): 2-4
This article addresses the issue of kriging-based optimization of stochastic simulators. Many of these simulators depend on factors that tune the level of precision of the response, the gain in accuracy coming at a price in computational time. The contribution of this work is twofold: first, we propose a quantile-based criterion for the sequential design of experiments, in the fashion of the classical expected improvement criterion, which allows an elegant treatment of heterogeneous response precisions. Second, we present a procedure for the allocation of the computational time given to each measurement, allowing a better distribution of the computational effort and increased efficiency. Finally, the optimization method is applied to an original application in nuclear criticality safety. This article has supplementary material available online. The proposed criterion is available in the R package DiceOptim.
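The quantile-based criterion generalizes the classical expected improvement (EI) criterion; the proposed criterion itself ships in the R package DiceOptim, so the sketch below shows only the classical EI it builds on, in Python with a scikit-learn surrogate. The toy simulator and kernel settings are assumptions.

```python
# Hedged sketch of the classical expected improvement (EI) criterion that
# the quantile-based criterion above generalizes; not the paper's method.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def expected_improvement(gp, X_cand, y_best):
    mu, sd = gp.predict(X_cand, return_std=True)
    sd = np.maximum(sd, 1e-12)
    z = (y_best - mu) / sd                  # improvement in standard units
    return (y_best - mu) * norm.cdf(z) + sd * norm.pdf(z)

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (8, 1))
y = np.sin(6 * X[:, 0]) + 0.1 * rng.standard_normal(8)  # noisy toy response
gp = GaussianProcessRegressor(ConstantKernel() * RBF(0.2), alpha=0.01).fit(X, y)
X_cand = np.linspace(0, 1, 200)[:, None]
ei = expected_improvement(gp, X_cand, y.min())
print("next run at x =", X_cand[np.argmax(ei), 0])
```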
3.
Space-filling designs are popular choices for computer experiments. A sliced design is a design that can be partitioned into several subdesigns. We propose a new type of sliced space-filling design called sliced rotated sphere packing designs. Their full designs and subdesigns are rotated sphere packing designs. They are constructed by rescaling, rotating, translating, and extracting the points from a sliced lattice. We provide two fast algorithms to generate such designs. Furthermore, we propose a strategy to use sliced rotated sphere packing designs adaptively. Under this strategy, initial runs are uniformly distributed in the design space, follow-up runs are added by incorporating information gained from initial runs, and the combined design is space-filling for any local region. Examples are given to illustrate its potential application.
4.
Edward V. Thomas, Technometrics, 2013, 55(4): 464-465
Tuning and calibration are processes for improving the representativeness of a computer simulation code to a physical phenomenon. This article introduces a statistical methodology for simultaneously determining tuning and calibration parameters in settings where data are available from a computer code and the associated physical experiment. Tuning parameters are set by minimizing a discrepancy measure, while the distribution of the calibration parameters is determined from a hierarchical Bayesian model. The proposed Bayesian model views the output as a realization of a Gaussian stochastic process with hyper-priors. Draws from the resulting posterior distribution are obtained by Markov chain Monte Carlo simulation. Our methodology is compared with an alternative approach in examples and is illustrated in a biomechanical engineering application. Supplemental materials, including the software and a user manual, are available online and can be requested from the first author.
5.
Robert B. Gramacy, Genetha A. Gray, Sébastien Le Digabel, Herbert K. H. Lee, Pritam Ranjan, Garth Wells, Technometrics, 2016, 58(1): 1-11
Constrained blackbox optimization is a difficult problem, with most approaches coming from the mathematical programming literature. The statistical literature is sparse, especially in addressing problems with nontrivial constraints. This situation is unfortunate because statistical methods have many attractive properties: global scope, handling noisy objectives, sensitivity analysis, and so forth. To narrow that gap, we propose a combination of response surface modeling, expected improvement, and the augmented Lagrangian numerical optimization framework. This hybrid approach allows the statistical model to think globally and the augmented Lagrangian to act locally. We focus on problems where the constraints are the primary bottleneck, requiring expensive simulation to evaluate and substantial modeling effort to map out. In that context, our hybridization presents a simple yet effective solution that allows existing objective-oriented statistical approaches, like those based on Gaussian process surrogates and expected improvement heuristics, to be applied to the constrained setting with minor modification. This work is motivated by a challenging, real-data benchmark problem from hydrology where, even with a simple linear objective function, learning a nontrivial valid region complicates the search for a global minimum. Supplementary materials for this article are available online.
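A minimal sketch of the hybrid's mechanics, under simplifying assumptions: here a single GP is fit to the augmented-Lagrangian composite and runs are chosen by EI, whereas the paper models the objective and each constraint separately. The toy objective, constraint, and update schedule are illustrative only.

```python
# Hedged sketch of the expected-improvement / augmented-Lagrangian hybrid:
# a GP surrogate is fit to the AL composite, runs are picked by EI, and the
# multiplier and penalty are updated between outer iterations.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

f = lambda x: x[..., 0] + x[..., 1]                  # toy linear objective
c = lambda x: 1.0 - x[..., 0] ** 2 - x[..., 1] ** 2  # feasible when c(x) <= 0

def al(x, lam, rho):                                 # augmented Lagrangian
    cx = c(x)
    return f(x) + lam * cx + np.maximum(cx, 0) ** 2 / (2 * rho)

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (10, 2))
lam, rho = 0.0, 1.0
cand = rng.uniform(0, 1, (500, 2))
for outer in range(5):
    y = al(X, lam, rho)
    gp = GaussianProcessRegressor(RBF(0.3), alpha=1e-8, normalize_y=True).fit(X, y)
    mu, sd = gp.predict(cand, return_std=True)
    z = (y.min() - mu) / np.maximum(sd, 1e-12)
    x_new = cand[np.argmax((y.min() - mu) * norm.cdf(z) + sd * norm.pdf(z))]
    X = np.vstack([X, x_new])
    lam = max(0.0, lam + c(x_new) / rho)             # multiplier update
    if c(x_new) > 0:
        rho /= 2                                     # tighten when infeasible

feas = X[c(X) <= 0]
print("best feasible value:", f(feas).min() if len(feas) else "none found yet")
```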
6.
This article considers computer experiments where levels for continuous factors are selected in sequential order, with the level selected for one factor directly affecting the range of possible levels for the nested factor, and so on for a finite number of factors. In addition, we assume that the nested relationships between the factors have no closed-form solution. We propose an approach for constructing a multilayer nested factor design, or multi-NFD for short. This space-filling design approach takes advantage of the maximin criterion and can be analyzed using a standard Gaussian process model. While the multi-NFD approach can be adapted for future computer experiments involving factor relationships of this type, we present results from a particular aerospace computer simulation study.
7.
Computer experiments have received a great deal of attention in many fields of science and technology. Most literature assumes that all the input variables are quantitative. However, researchers often encounter computer experiments involving both qualitative and quantitative variables (BQQV). In this article, a new interface between design and analysis for computer experiments with BQQV is proposed. The new designs are a kind of sliced Latin hypercube design with points clustered in the design region, possessing good uniformity for each slice. For computer experiments with BQQV, such designs help to measure the similarities among responses of different level-combinations in the qualitative variables. An adaptive analysis strategy intended for the proposed designs is developed. The proposed strategy allows us to automatically extract information from useful auxiliary responses to increase the precision of prediction for the target response. The interface between the proposed design and the analysis strategy is demonstrated to be effective via simulation and a real-life example from the food engineering literature. Supplementary materials for this article are available online.
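A sketch of one standard sliced Latin hypercube construction (in the spirit of Qian's sliced LHDs), not the clustered variant proposed in the article: each coarse level within a slice is spread over distinct fine levels, so every slice collapses to a small LHD while the full design is an LHD on all q*m levels.

```python
# Hedged sketch of a basic sliced Latin hypercube design, one slice per
# level-combination of the qualitative variables. Not the article's
# clustered variant.
import numpy as np

def sliced_lhd(q, m, d, seed=0):
    """q slices of m runs each in d continuous dimensions, on (0, 1)^d."""
    rng = np.random.default_rng(seed)
    X = np.empty((q, m, d))
    for j in range(d):
        for s in range(q):                   # coarse levels within each slice
            X[s, :, j] = rng.permutation(m)
        for l in range(m):                   # spread each coarse level over
            fine = rng.permutation(q)        # q distinct fine-level cells
            for s in range(q):
                X[s, X[s, :, j] == l, j] = (l * q + fine[s] + rng.random()) / (q * m)
    return X                                 # X[s] is the s-th slice

D = sliced_lhd(q=3, m=4, d=2)
print(np.sort(D.reshape(-1, 2), axis=0))     # full design fills all 12 strata
```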
8.
A model-based method for organizing tasks in product development
Steven D. Eppinger, Daniel E. Whitney, Robert P. Smith, David A. Gebala, Research in Engineering Design, 1994, 6(1): 1-13
This research is aimed at structuring complex design projects in order to develop better products more quickly. We use a matrix representation to capture both the sequence of and the technical relationships among the many design tasks to be performed. These relationships define the technical structure of a project, which is then analyzed in order to find alternative sequences and/or definitions of the tasks. Such improved design procedures offer opportunities to speed development by streamlining inter-task coordination. After using this technique to model design processes in several organizations, we have developed a design management strategy which focuses attention on the essential information transfer requirements of a technical project.
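A minimal sketch of the matrix representation described above, assuming a binary design structure matrix (DSM): mutually reachable tasks form coupled (iterative) blocks, and blocks are ordered so that information flows forward. The reachability-based block ordering is one simple choice, not necessarily the authors' algorithm.

```python
# Hedged sketch of the design structure matrix (DSM) idea: tasks as rows and
# columns of a binary dependency matrix, coupled blocks found as groups of
# mutually reachable tasks, blocks then ordered so information flows forward.
import numpy as np

def dsm_partition(D):
    """D[i, j] = 1 if task i needs input from task j. Returns blocks of
    coupled tasks in an executable order (iterative blocks stay together)."""
    n = D.shape[0]
    R = (D | np.eye(n, dtype=int)) > 0
    for _ in range(n):                       # transitive closure (reachability)
        R = R | ((R.astype(int) @ R.astype(int)) > 0)
    coupled = R & R.T                        # mutually reachable => coupled
    blocks, seen = [], set()
    for i in range(n):
        if i not in seen:
            blk = [j for j in range(n) if coupled[i, j]]
            seen.update(blk)
            blocks.append(blk)
    # a block depending on more tasks must come later in the sequence
    blocks.sort(key=lambda b: int(R[b[0]].sum()))
    return blocks

# task 1 needs 0 and 2; task 2 needs 1 (so 1 and 2 are coupled); 3 is free
D = np.array([[0, 0, 0, 0], [1, 0, 1, 0], [0, 1, 0, 0], [0, 0, 0, 0]])
print(dsm_partition(D))                      # [[0], [3], [1, 2]]
```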
9.
This article is motivated by a computer experiment conducted for optimizing residual stresses in the machining of metals. Although kriging is widely used in the analysis of computer experiments, it cannot be easily applied to model the residual stresses because they are obtained as a profile. The high dimensionality caused by this functional response introduces severe computational challenges in kriging. It is well known that if the functional data are observed on a regular grid, the computations can be simplified using an application of Kronecker products. However, the case of irregular grid is quite complex. In this article, we develop a Gibbs sampling-based expectation maximization algorithm, which converts the irregularly spaced data into a regular grid so that the Kronecker product-based approach can be employed for efficiently fitting a kriging model to the functional data. Supplementary materials are available online.
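A sketch of the regular-grid Kronecker identity the article exploits: when every run is observed on a common grid, the full covariance factors as kron(Kx, Kt), so GLS-type solves need only two small eigendecompositions. Kernel choices and nuggets below are assumptions; the article's Gibbs/EM treatment of irregular grids is not shown.

```python
# Hedged sketch of the regular-grid Kronecker trick: with every run observed
# on the same grid, the nm x nm covariance factors as kron(Kx, Kt), and
# solves cost O(n^3 + m^3) instead of O((nm)^3).
import numpy as np

def gauss_cov(a, b, ls):
    """Squared-exponential covariance between two 1-D point sets."""
    return np.exp(-((a[:, None] - b[None, :]) / ls) ** 2)

rng = np.random.default_rng(0)
n, m = 15, 40
x = rng.uniform(0, 1, n)                     # input sites
t = np.linspace(0, 1, m)                     # common (regular) profile grid
Y = np.sin(4 * np.outer(x, t)) + 0.01 * rng.standard_normal((n, m))

Kx = gauss_cov(x, x, 0.3) + 1e-8 * np.eye(n)
Kt = gauss_cov(t, t, 0.1) + 1e-8 * np.eye(m)
wx, Ux = np.linalg.eigh(Kx)                  # two small eigendecompositions;
wt, Ut = np.linalg.eigh(Kt)                  # kron(Kx, Kt) is never formed
A = Ux @ ((Ux.T @ Y @ Ut) / np.outer(wx, wt)) @ Ut.T  # K^{-1} vec(Y), as a matrix

x_new = np.array([0.45])
pred = gauss_cov(x_new, x, 0.3) @ A @ Kt     # posterior mean profile at x_new
print(pred.shape, pred[0, :3])               # (1, 40) plus first few values
```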
10.
11.
Technometrics, 2013, 55(4): 527-541
Computer simulation often is used to study complex physical and engineering processes. Although a computer simulator often can be viewed as an inexpensive way to gain insight into a system, it still can be computationally costly. Much of the recent work on the design and analysis of computer experiments has focused on scenarios where the goal is to fit a response surface or to optimize a process. In this article we develop a sequential methodology for estimating a contour from a complex computer code. The approach uses a stochastic process model as a surrogate for the computer simulator. The surrogate model and associated uncertainty are key components in a new criterion used to identify the computer trials aimed specifically at improving the contour estimate. The proposed approach is applied to exploration of a contour for a network queuing system. Issues related to practical implementation of the proposed approach also are addressed.
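A hedged sketch of a contour-oriented infill rule in the same spirit, not the paper's exact criterion: candidates score highly when the surrogate mean is near the target level a and the predictive uncertainty is still large.

```python
# Hedged sketch of a simplified contour-oriented criterion (not the paper's
# exact formulation): favor candidates both close to the target level a
# under the surrogate and still uncertain.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

simulator = lambda x: np.sin(5 * x[:, 0]) + x[:, 1]  # stand-in for the code
a = 0.8                                              # contour level of interest

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, (12, 2))
gp = GaussianProcessRegressor(RBF(0.3), alpha=1e-6, normalize_y=True)
for step in range(10):                               # sequential refinement
    gp.fit(X, simulator(X))
    cand = rng.uniform(0, 1, (1000, 2))
    mu, sd = gp.predict(cand, return_std=True)
    score = sd * norm.pdf((a - mu) / np.maximum(sd, 1e-12))  # near contour, uncertain
    X = np.vstack([X, cand[np.argmax(score)]])
print("latest runs, concentrated near the a =", a, "contour:")
print(X[-5:])
```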
12.
A mixture experiment is characterized by having two or more inputs that are specified as a percentage contribution to a total amount of material. In such situations, the input variables are correlated because they must sum to one. Consequently, additional care must be taken when fitting statistical models or visualizing the effect of one or more inputs on the response. In this article, we consider the use of a Gaussian process to model the output from a computer simulator taking a mixture input. We introduce a procedure to perform global sensitivity analysis of the code output providing main effects and revealing interactions. The resulting methodology is illustrated using a function with analytically tractable results for comparison, a chemical compositional simulator, and a physical experiment. Supplementary materials providing assistance with implementing this methodology are available online.
13.
Matthias Hwai Yong Tan, Technometrics, 2017, 59(1): 1-10
In deterministic computer experiments, it is often known that the output is a monotonic function of some of the inputs. In these cases, a monotonic metamodel will tend to give more accurate and interpretable predictions with less prediction uncertainty than a nonmonotonic metamodel. The widely used Gaussian process (GP) models are not monotonic. A recent article in Biometrika offers a modification that projects GP sample paths onto the cone of monotonic functions. However, their approach does not account for the fact that the GP model is more informative about the true function at locations near design points than at locations far away. Moreover, a grid-based method is used, which is memory intensive and gives predictions only at grid points. This article proposes the weighted projection approach that more effectively uses information in the GP model together with two computational implementations. The first is isotonic regression on a grid while the second is projection onto a cone of monotone splines, which alleviates problems faced by a grid-based approach. Simulations show that the monotone B-spline metamodel gives particularly good results. Supplementary materials for this article are available online.
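A one-dimensional sketch of the grid-based projection idea: draw GP posterior sample paths and project each onto the monotone cone by isotonic regression. The article's weighted projection and monotone-spline cone are refinements not reproduced here.

```python
# Hedged 1-D sketch of the grid-based idea: draw GP posterior sample paths
# and project each onto the cone of monotone functions via isotonic
# regression. The article's weighted projection is not shown.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (7, 1))
y = X[:, 0] ** 2 + 0.05 * rng.standard_normal(7)  # monotone truth, noisy obs

gp = GaussianProcessRegressor(RBF(0.2), alpha=0.05**2, normalize_y=True).fit(X, y)
grid = np.linspace(0, 1, 100)
paths = gp.sample_y(grid[:, None], n_samples=50, random_state=1)  # (100, 50)

iso = IsotonicRegression(increasing=True)
mono = np.column_stack([iso.fit_transform(grid, paths[:, k]) for k in range(50)])
print("monotone posterior mean (first values):", mono.mean(axis=1)[:5])
```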
14.
Ying Xiong, Wei Chen, Daniel Apley, Xuru Ding, International Journal for Numerical Methods in Engineering, 2007, 71(6): 733-756
Metamodels are widely used to facilitate the analysis and optimization of engineering systems that involve computationally expensive simulations. Kriging is a metamodelling technique that is well known for its ability to build surrogate models of responses with non-linear behaviour. However, the assumption of a stationary covariance structure underlying Kriging does not hold in situations where the level of smoothness of a response varies significantly. Although non-stationary Gaussian process models have been studied for years in the statistics and geostatistics communities, this has largely been for physical experimental data in relatively low dimensions. In this paper, the non-stationary covariance structure is incorporated into Kriging modelling for computer simulations. To represent the non-stationary covariance structure, we adopt a non-linear mapping approach based on parameterized density functions. To avoid over-parameterization in the high-dimensional problems typical of engineering design, we propose a modified version of the non-linear mapping approach, with a sparser, yet flexible, parameterization. The effectiveness of the proposed method is demonstrated through both mathematical and engineering examples. The robustness of the method is verified by testing multiple functions under various sampling settings. We also demonstrate that our method is effective in quantifying the prediction uncertainty associated with the use of metamodels.
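A sketch of the nonlinear-mapping flavor of nonstationary Kriging, under strong simplifications: each input is warped through a parameterized beta CDF and an ordinary stationary GP is fit in the warped space. The fixed warp parameters below are illustrative, not estimated, and the parameterization differs from the paper's density-based one.

```python
# Hedged sketch of nonstationarity via a monotone input warp: two parameters
# per dimension keeps the parameterization sparse, echoing the article's
# concern in high dimension. Warp parameters here are fixed, not estimated.
import numpy as np
from scipy.stats import beta
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def warp(X, a, b):
    """Map each column of X in [0, 1] through a beta(a_j, b_j) CDF."""
    return np.column_stack([beta.cdf(X[:, j], a[j], b[j]) for j in range(X.shape[1])])

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (40, 1))
y = np.sin(30 * X[:, 0] ** 4)                  # wiggly near 1, flat near 0

a, b = np.array([4.0]), np.array([1.0])        # beta(4,1) CDF = x^4: stretches near 1
gp = GaussianProcessRegressor(RBF(0.1), alpha=1e-8, normalize_y=True)
gp.fit(warp(X, a, b), y)

grid = np.linspace(0, 1, 9)[:, None]
print(gp.predict(warp(grid, a, b)))            # stationary GP, nonstationary in x
```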
15.
Metamodel-based methods are an attractive reliability analysis technique because they substitute a metamodel for the actual limit state function at a predefined accuracy. Adaptive Kriging (AK) is a popular metamodel in reliability analysis owing to its flexibility and efficiency. AK combined with importance sampling (IS), abbreviated AK–IS, can greatly reduce the size of the candidate sampling pool used when updating the Kriging model, which makes AK-based reliability methods more suitable for estimating small failure probabilities. In this paper, an error-based stopping criterion for updating the Kriging model in the AK–IS method is constructed, and two methods are derived for estimating the maximum relative error between the failure probability estimated by the current Kriging model and that of the true limit state function. By controlling the maximum relative error, the accuracy of the estimate can be adjusted flexibly. Results in three case studies show that the AK–IS method with the error-based stopping criterion achieves the predefined accuracy level while enhancing the efficiency of updating the Kriging model.
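A sketch of the adaptive-Kriging loop on a fixed candidate pool with the classical U-function stopping rule (min U >= 2); the article's contribution, an error-based criterion bounding the relative error of the failure-probability estimate, is precisely what would replace this rule and is not reproduced here. The toy limit state and plain Monte Carlo pool are assumptions.

```python
# Hedged sketch of the adaptive-Kriging reliability loop with the classical
# U-function stopping rule, not the article's error-based criterion.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

g = lambda x: 3.0 - x[:, 0] - x[:, 1]       # toy limit state: failure if g <= 0
rng = np.random.default_rng(0)
pool = rng.standard_normal((5000, 2))       # stand-in for the IS candidate pool

idx = list(rng.choice(len(pool), 12, replace=False))
gp = GaussianProcessRegressor(RBF(1.0), alpha=1e-8, normalize_y=True)
while True:
    X = pool[idx]
    gp.fit(X, g(X))
    mu, sd = gp.predict(pool, return_std=True)
    U = np.abs(mu) / np.maximum(sd, 1e-12)  # sign-misclassification margin
    k = int(np.argmin(U))
    if U[k] >= 2.0:                         # stop: sign of g settled everywhere
        break
    idx.append(k)                           # evaluate the most ambiguous point
print("estimated failure probability:", (mu <= 0).mean(), "with", len(idx), "runs")
```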
16.
Based on a Kriging surrogate model, a multi-point infill criterion is proposed that simultaneously considers the predicted response values and their uncertainty, and a sequential approximate optimization method is developed on the basis of this criterion. The multi-point infill criterion adds a new set of samples using the initial sample information and the predicted characteristics of the objective function, so as to adaptively improve the accuracy of the surrogate model during the optimization iterations. In each iteration, the method adds several spatially independent new sample points according to the multi-point infill criterion, which makes it suitable for simultaneous computation on multiple machines or for parallel computing, thereby improving computational efficiency. The optimization method is compared with the expected improvement criterion method on two classical mathematical test functions; the results show that the proposed method effectively improves the global quality of the optimal solution. The method is then applied to the molding process optimization of a box-shaped injection-molded part, and the optimization results likewise demonstrate its effectiveness.
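A sketch of one common way to produce several spatially separated infill points per iteration for parallel evaluation: the "kriging believer" heuristic, in which each selected point is temporarily assigned its predicted value before the criterion is re-maximized. This is not necessarily the article's exact multi-point criterion.

```python
# Hedged sketch of multi-point infill via the "kriging believer" heuristic:
# each chosen point is temporarily "believed" at its predicted value, which
# pushes the next EI maximizer away, yielding spatially separated points.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def ei(mu, sd, y_best):
    z = (y_best - mu) / np.maximum(sd, 1e-12)
    return (y_best - mu) * norm.cdf(z) + sd * norm.pdf(z)

def infill_batch(X, y, q, seed=0):
    rng = np.random.default_rng(seed)
    cand = rng.uniform(0, 1, (1000, X.shape[1]))
    Xb, yb, batch = X.copy(), y.copy(), []
    for _ in range(q):
        gp = GaussianProcessRegressor(RBF(0.2), alpha=1e-6,
                                      normalize_y=True).fit(Xb, yb)
        mu, sd = gp.predict(cand, return_std=True)
        k = int(np.argmax(ei(mu, sd, yb.min())))
        batch.append(cand[k])
        Xb = np.vstack([Xb, cand[k]])
        yb = np.append(yb, mu[k])            # "believe" the prediction
    return np.array(batch)                   # q points to run in parallel

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, (10, 2))
y = ((X - 0.5) ** 2).sum(axis=1)
print(infill_batch(X, y, q=4))
```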
17.
We investigate the merits of replication, and provide methods for optimal design (including replicates), with the goal of obtaining globally accurate emulation of noisy computer simulation experiments. We first show that replication can be beneficial from both design and computational perspectives, in the context of Gaussian process surrogate modeling. We then develop a lookahead-based sequential design scheme that can determine if a new run should be at an existing input location (i.e., replicate) or at a new one (explore). When paired with a newly developed heteroscedastic Gaussian process model, our dynamic design scheme facilitates learning of signal and noise relationships which can vary throughout the input space. We show that it does so efficiently, on both computational and statistical grounds. In addition to illustrative synthetic examples, we demonstrate performance on two challenging real-data simulation experiments, from inventory management and epidemiology. Supplementary materials for the article are available online.
18.
Max D. Morris, Technometrics, 2013, 55(1): 42-50
Computer models of dynamic systems produce outputs that are functions of time; models that solve systems of differential equations often have this character. In many cases, time series output can be usefully reduced via principal components to simplify analysis. Time-indexed inputs, such as the functions that describe time-varying boundary conditions, are also common with such models. However, inputs that are functions of time often do not have one or a few “characteristic shapes” that are more common with output functions, and so principal component representation has less potential for reducing the dimension of input functions. In this article, Gaussian process surrogates are described for models with inputs and outputs that are both functions of time. The focus is on construction of an appropriate covariance structure for such surrogates, some experimental design issues, and an application to a model of marrow cell dynamics.
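A sketch of the principal-component reduction of time-series output mentioned above: project the output curves onto a few components and emulate each score with an independent GP. The article's covariance construction for time-varying inputs is the harder part and is not reproduced; the toy simulator below is an assumption.

```python
# Hedged sketch of the principal-component route for time-series *output*:
# project the curves onto a few components and fit one GP per score. The
# article's covariance structure for functional *inputs* is not shown.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
n, m = 30, 60
X = rng.uniform(0, 1, (n, 2))                          # scalar inputs
t = np.linspace(0, 1, m)
Y = np.sin(2 * np.pi * np.outer(X[:, 0], t)) * X[:, 1:2]  # one curve per run

pca = PCA(n_components=3).fit(Y)
scores = pca.transform(Y)                              # (n, 3) reduced output
gps = [GaussianProcessRegressor(RBF(0.3), alpha=1e-8, normalize_y=True)
       .fit(X, scores[:, k]) for k in range(3)]

x_new = np.array([[0.4, 0.7]])
s_new = np.array([[gpk.predict(x_new)[0] for gpk in gps]])
curve = pca.inverse_transform(s_new)                   # emulated output curve
print(curve.shape)                                     # (1, 60)
```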
19.
Thomas Mc Neill, John S. Gero, James Warren, Research in Engineering Design, 1998, 10(3): 129-140
20.
Mohamed Amine Bouhlel, Nathalie Bartoli, Rommel G. Regis, Abdelkader Otsmane, Joseph Morlier, Engineering Optimization, 2018, 50(12): 2038-2053
In many engineering optimization problems, the number of function evaluations is often very limited because of the computational cost to run one high-fidelity numerical simulation. Using a classic optimization algorithm, such as a derivative-based algorithm or an evolutionary algorithm, directly on a computational model is not suitable in this case. A common approach to addressing this challenge is to use black-box surrogate modelling techniques. The most popular surrogate-based optimization algorithm is the efficient global optimization (EGO) algorithm, which is an iterative sampling algorithm that adds one (or many) point(s) per iteration. This algorithm is often based on an infill sampling criterion, called expected improvement, which represents a trade-off between promising and uncertain areas. Many studies have shown the efficiency of EGO, particularly when the number of input variables is relatively low. However, its performance on high-dimensional problems is still poor since the Kriging models used are time-consuming to build. To deal with this issue, this article introduces a surrogate-based optimization method that is suited to high-dimensional problems. The method first uses the ‘locating the regional extreme’ criterion, which incorporates minimizing the surrogate model while also maximizing the expected improvement criterion. Then, it replaces the Kriging models by the KPLS(+K) models (Kriging combined with the partial least squares method), which are more suitable for high-dimensional problems. Finally, the proposed approach is validated by a comparison with alternative methods existing in the literature on some analytical functions and on 12-dimensional and 50-dimensional instances of the benchmark automotive problem ‘MOPTA08’.
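A sketch of the dimension-reduction idea behind KPLS, simplified: extract a few supervised directions by partial least squares and build the Kriging model on the latent scores. KPLS proper instead uses the PLS weights to structure anisotropic length-scales in the original space; the toy function below is an assumption.

```python
# Hedged sketch of the KPLS idea, simplified: PLS finds a few supervised
# directions and the Kriging model is built in the reduced space. KPLS
# proper keeps the model in the full space, using PLS weights to shape
# the anisotropic length-scales.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
d, n = 50, 80                                  # high-dimensional, few runs
X = rng.uniform(-1, 1, (n, d))
y = X[:, 0] + 2 * X[:, 1] ** 2 + 0.01 * rng.standard_normal(n)  # low effective dim

pls = PLSRegression(n_components=3).fit(X, y)
Z = pls.transform(X)                           # n x 3 latent scores
gp = GaussianProcessRegressor(RBF(np.ones(3)), alpha=1e-6,
                              normalize_y=True).fit(Z, y)

X_test = rng.uniform(-1, 1, (200, d))
mu = gp.predict(pls.transform(X_test))
truth = X_test[:, 0] + 2 * X_test[:, 1] ** 2
print("RMSE in 50-D with 80 runs:", np.sqrt(((mu - truth) ** 2).mean()))
```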