Similar Articles
11 similar articles found (search time: 0 ms)
1.
Profile monitoring is often conducted when product quality is characterized by profiles. Although existing methods deal almost exclusively with univariate profiles, multivariate profile data are increasingly encountered in practice. These data are seldom analyzed in statistical process control because of a lack of effective modeling tools. In this article, we propose to analyze them using the multivariate Gaussian process model, which offers a natural way to accommodate both within-profile and between-profile correlations. To mitigate the prohibitively high computational cost of building such models, a pairwise estimation strategy is adopted, and asymptotic normality of the resulting parameter estimates is established. Comprehensive simulation studies are conducted. In the case study, the method is demonstrated on transmittance profiles from low-emittance glass. Supplementary materials for this article are available online.
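The article's pairwise estimation strategy is beyond a short sketch, but the kind of covariance it targets can be illustrated with a separable (Kronecker) model that combines within-profile and between-profile correlation. Everything below (3 response channels, 20 profile points, the cross-channel correlation matrix, the length-scale) is a hypothetical illustration, not the paper's model:

```python
import numpy as np

def sq_exp(x, length):
    """Squared-exponential correlation matrix over 1-D locations x."""
    d = x[:, None] - x[None, :]
    return np.exp(-(d / length) ** 2)

# Hypothetical setup: 3 response channels observed at 20 points along a profile.
t = np.linspace(0.0, 1.0, 20)
R_within = sq_exp(t, length=0.1)           # correlation along the profile
R_between = np.array([[1.0, 0.6, 0.3],     # assumed cross-channel correlation
                      [0.6, 1.0, 0.5],
                      [0.3, 0.5, 1.0]])

# Separable covariance: between-channel (Kronecker product) within-profile,
# plus a small nugget for numerical stability.
K = np.kron(R_between, R_within) + 1e-8 * np.eye(60)

np.linalg.cholesky(K)   # succeeds only if K is a valid (positive definite) covariance
print(K.shape)
```

The Kronecker product of two positive-definite correlation matrices is itself positive definite, which is what makes this a convenient way to couple the two correlation structures.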

2.
Computer models of dynamic systems produce outputs that are functions of time; models that solve systems of differential equations often have this character. In many cases, time series output can be usefully reduced via principal components to simplify analysis. Time-indexed inputs, such as the functions that describe time-varying boundary conditions, are also common with such models. However, inputs that are functions of time often do not have one or a few "characteristic shapes" that are more common with output functions, and so principal component representation has less potential for reducing the dimension of input functions. In this article, Gaussian process surrogates are described for models with inputs and outputs that are both functions of time. The focus is on construction of an appropriate covariance structure for such surrogates, some experimental design issues, and an application to a model of marrow cell dynamics.
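As a minimal illustration of the output-reduction step mentioned above (not the article's surrogate itself), principal components of time series outputs can be computed via the SVD of the centered run matrix. The toy simulator here, its two-parameter output family, and the run count are all made up:

```python
import numpy as np
rng = np.random.default_rng(1)

# Hypothetical: 50 simulator runs, each output a 100-point time series
# generated from two underlying "characteristic shapes".
t = np.linspace(0.0, 1.0, 100)
Y = np.array([a * np.sin(2 * np.pi * t) + b * t
              for a, b in rng.normal(size=(50, 2))])

# Principal components of the centered outputs via the SVD.
Yc = Y - Y.mean(axis=0)
U, s, Vt = np.linalg.svd(Yc, full_matrices=False)
var_explained = (s ** 2) / np.sum(s ** 2)

# With only two underlying shapes, two components capture essentially
# all of the run-to-run variation.
print(round(var_explained[:2].sum(), 4))
```

A surrogate can then be built on the low-dimensional component scores (`U[:, :2] * s[:2]`) rather than on the full 100-point curves.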

3.
Technometrics, 2013, 55(4): 527-541
Computer simulation often is used to study complex physical and engineering processes. Although a computer simulator often can be viewed as an inexpensive way to gain insight into a system, it still can be computationally costly. Much of the recent work on the design and analysis of computer experiments has focused on scenarios where the goal is to fit a response surface or to optimize a process. In this article we develop a sequential methodology for estimating a contour from a complex computer code. The approach uses a stochastic process model as a surrogate for the computer simulator. The surrogate model and associated uncertainty are key components in a new criterion used to identify the computer trials aimed specifically at improving the contour estimate. The proposed approach is applied to exploration of a contour for a network queuing system. Issues related to practical implementation of the proposed approach also are addressed.

4.
Data center thermal management has become increasingly important because of massive computational demand in information technology. To advance the understanding of the thermal environment in a data center, complex computer models are extensively used to simulate temperature distribution maps. However, due to management policies and time constraints, it is not practical to execute such models in real time. In this article, we propose a novel statistical modeling method that performs real-time simulation by dynamically fusing a base steady-state solution of a computer model with real-time thermal sensor data. The proposed method uses a Kalman filter and the stochastic gradient descent method as computational tools to achieve real-time updating of the base temperature map. We evaluate the performance of the proposed method through a simulation study and demonstrate its merits in a data center thermal management application. Supplementary materials for this article are available online.
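A minimal sketch of the measurement-update step such a fusion scheme relies on, using the standard Kalman filter equations. The 5-zone "map", prior covariance, sensor placement, and noise level are all hypothetical:

```python
import numpy as np

# Hypothetical 1-D "temperature map" of 5 zones, from a steady-state solution.
x = np.array([20.0, 22.0, 25.0, 24.0, 21.0])   # base map (prior mean)
P = np.eye(5) * 4.0                            # assumed prior covariance

# Two sensors read zones 1 and 3.
H = np.zeros((2, 5)); H[0, 1] = 1.0; H[1, 3] = 1.0
R = np.eye(2) * 0.25                           # assumed sensor noise
z = np.array([23.5, 22.0])                     # real-time readings

# Standard Kalman measurement update: x <- x + K (z - H x).
S = H @ P @ H.T + R
K = P @ H.T @ np.linalg.inv(S)
x_new = x + K @ (z - H @ x)
P_new = (np.eye(5) - K @ H) @ P

print(np.round(x_new, 2))
```

Only the sensed zones move here because the assumed prior covariance is diagonal; with spatially correlated prior covariance (as a temperature map would have), a reading in one zone would also adjust its neighbors.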

5.

Expensive black box systems arise in many engineering applications but can be difficult to optimize because their output functions may be complex, multi-modal, and difficult to understand. The task becomes even more challenging when the optimization is subject to multiple constraints and no derivative information is available. In this article, we combine response surface modeling and filter methods to solve problems of this nature. In employing a filter algorithm for constrained optimization, we establish a novel probabilistic metric for guiding the filter. Overall, this hybridization of statistical modeling and nonlinear programming efficiently uses both global and local search to converge quickly to a global solution of the constrained optimization problem. To demonstrate the effectiveness of the proposed methods, we perform numerical tests on a synthetic test problem, a problem from the literature, and a real-world hydrology computer experiment optimization problem.
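One common probabilistic ingredient in surrogate-assisted constrained optimization — not necessarily the article's filter metric — is the probability of feasibility under a Gaussian surrogate of the constraint. The candidate predictions below are illustrative:

```python
import math

def prob_feasible(mu_c, sd_c):
    """P(c(x) <= 0) when the constraint surrogate predicts c(x) ~ N(mu_c, sd_c^2)."""
    return 0.5 * (1.0 + math.erf(-mu_c / (sd_c * math.sqrt(2.0))))

# Three candidates: predicted constraint value and its standard error (assumed).
candidates = [(-2.0, 0.5), (0.1, 1.0), (3.0, 0.2)]
probs = [prob_feasible(m, s) for m, s in candidates]
print([round(p, 3) for p in probs])
```

Such a probability can weight a predicted objective improvement so that the search favors points that are both promising and likely feasible.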

6.
Gaussian processes have become a standard framework for modeling deterministic computer simulations and producing predictions of the response surface. This article investigates a new covariance function that is shown to offer superior prediction compared to the more common covariances for computer simulations of real physical systems. This is demonstrated on a range of realistic examples. A simple, closed-form expression for the covariance is derived as a limiting form of a Brownian-like covariance model as it is extended to a hypothetical higher-dimensional input domain, and so we term it a lifted Brownian covariance. This covariance has connections with the multiquadric kernel. Through analysis of the kriging model, this article offers some theoretical comparisons between the proposed covariance model and existing covariance models. The major emphasis of the theory is explaining why the proposed covariance is superior to its traditional counterparts for many computer simulations of real physical systems.  Supplementary materials for this article are available online.

7.
We calibrate a stochastic computer simulation model of "moderate" computational expense. The simulator is an imperfect representation of reality, and we account for this discrepancy to ensure a reliable calibration. The calibration model combines a Gaussian process emulator of the likelihood surface with importance sampling. Changing the discrepancy specification changes only the importance weights, which lets us investigate sensitivity to different discrepancy specifications at little computational cost. We present a case study of a natural history model that has been used to characterize UK bowel cancer incidence. Datasets and computer code are provided as supplementary material.
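The computational trick highlighted above — a changed discrepancy specification only reweights existing draws — can be sketched with self-normalized importance sampling. The Gaussian proposal and the two stand-in targets below are illustrative toys, not the paper's cancer model:

```python
import numpy as np
rng = np.random.default_rng(0)

# Draws from a fixed proposal q = N(0, 2^2); these are never regenerated.
theta = rng.normal(0.0, 2.0, size=5000)
log_q = -0.5 * (theta / 2.0) ** 2 - np.log(2.0 * np.sqrt(2.0 * np.pi))

def posterior_mean(disc_sd):
    """Reweight the SAME draws under a different discrepancy specification.
    The target N(1, disc_sd^2) stands in for likelihood x prior; only the
    weights change, so each new specification is nearly free."""
    log_p = -0.5 * ((theta - 1.0) / disc_sd) ** 2 \
            - np.log(disc_sd * np.sqrt(2.0 * np.pi))
    w = np.exp(log_p - log_q)          # importance weights
    return np.sum(w * theta) / np.sum(w)

print(round(posterior_mean(0.5), 2), round(posterior_mean(1.0), 2))
```

Both calls reuse the identical 5000 draws; varying `disc_sd` amounts to a sensitivity analysis at the cost of a reweighting.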

8.
Computer models of physical systems are often written based on known theory or “first principles” of a system, reflecting substantial knowledge of each component or subsystem, but also the need to use a numerical approach to mimic the more complex behavior of the entire system of interest. However, in some cases, there is insufficient known theory to encode all necessary aspects of the system, and empirical studies are required to generate approximate functional forms. We consider the question of how a physical experiment might be designed to approximate one module or subroutine of a computer model that can otherwise be written from first principles. The concept of preposterior analysis is used to suggest an approach to generating a kind of I-optimal design for this purpose, when the remainder of the computer model is a composition of nonlinear functions that can be directly evaluated as part of the design process. Extensions are then described for situations in which one or more known components must themselves be approximated by metamodels due to the large number of evaluations needed, and for computer models that have iterative structure. A simple “toy” model is used to demonstrate the ideas. Online supplementary material accompanies this article.

9.
The construction of decision-theoretical Bayesian designs for realistically complex nonlinear models is computationally challenging, as it requires the optimization of analytically intractable expected utility functions over high-dimensional design spaces. We provide the most general solution to date for this problem through a novel approximate coordinate exchange algorithm. This methodology uses a Gaussian process emulator to approximate the expected utility as a function of a single design coordinate in a series of conditional optimization steps. It has the flexibility to address problems for any choice of utility function and for a wide range of statistical models with different numbers of variables, numbers of runs, and randomization restrictions. In contrast to existing approaches to Bayesian design, the method can find multi-variable designs in large numbers of runs without resorting to asymptotic approximations to the posterior distribution or expected utility. The methodology is demonstrated on a variety of challenging examples of practical importance, including design for pharmacokinetic models and design for mixed models with discrete data. For many of these models, Bayesian designs are not currently available. Comparisons are made to results from the literature, and to designs obtained from asymptotic approximations. Supplementary materials for this article are available online.
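A toy version of coordinate exchange, with a cheap exact utility standing in for the article's Gaussian process emulator of an intractable expected utility: optimize one design coordinate at a time over a grid, cycling until the design stabilizes. The spread-rewarding utility and all sizes below are made up:

```python
import numpy as np

def utility(d):
    """Toy utility rewarding spread-out design points in [0, 1]
    (sum of log pairwise distances, a maximin-flavored criterion)."""
    diffs = np.abs(d[:, None] - d[None, :]) + np.eye(len(d))
    return np.log(np.maximum(diffs, 1e-12)).sum()

d = np.array([0.40, 0.45, 0.50, 0.55])   # poor starting design: points bunched
grid = np.linspace(0.01, 0.99, 50)

for _ in range(3):                       # cycles of coordinate exchange
    for i in range(len(d)):
        cand = d.copy()
        scores = []
        for g in grid:                   # try every grid value for coordinate i
            cand[i] = g
            scores.append(utility(cand))
        d[i] = grid[int(np.argmax(scores))]   # keep the best exchange

print(np.round(np.sort(d), 2))
```

Each inner step is a cheap one-dimensional search; in the article that one-dimensional profile of the expected utility is what the emulator approximates, since the true utility requires Monte Carlo integration.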

10.
Computer experiments based on mathematical models are powerful tools for understanding physical processes. This article addresses the problem of kriging-based optimization for deterministic computer experiments with tunable accuracy. Our approach is to use multi-fidelity computer experiments with increasing accuracy levels and a nonstationary Gaussian process model. We propose an optimization scheme that sequentially adds new computer runs by following two criteria. The first criterion, called EQI, scores candidate inputs at a given level of accuracy, and the second criterion, called EQIE, scores candidate combinations of inputs and accuracy. Simulation results and a real example using finite element analysis show that our method outperforms the expected improvement (EI) criterion designed for single-accuracy experiments. Supplementary materials for this article are available online.
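The single-accuracy EI baseline the article compares against has a closed form under a Gaussian prediction; a minimal implementation:

```python
import math

def expected_improvement(mu, sd, f_min):
    """Classic EI for minimization when the surrogate predicts N(mu, sd^2)
    and f_min is the best observed value so far."""
    if sd <= 0.0:
        return max(f_min - mu, 0.0)     # deterministic prediction
    z = (f_min - mu) / sd
    Phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))     # normal CDF
    phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # normal pdf
    return (f_min - mu) * Phi + sd * phi

print(round(expected_improvement(0.0, 1.0, 0.0), 4))   # -> 0.3989, i.e. phi(0)
```

EI rewards both low predicted values and high uncertainty, which is the trade-off the article's EQI/EQIE criteria extend to multiple accuracy levels.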

11.
Multivariate Markov chain models have previously been proposed for studying dependent multiple categorical data sequences. For a given multivariate Markov chain model, an important problem is to study its joint stationary distribution. In this paper, we use two techniques to present some perturbation bounds for the joint stationary distribution vector of a multivariate Markov chain with s categorical sequences. Numerical examples demonstrate the stability of the model and the effectiveness of our perturbation bounds.
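A minimal illustration of the object being bounded: the stationary distribution of a (single, ordinary) Markov chain, computed by power iteration, and its stability under a small perturbation of the transition matrix. The 3-state matrices are made up:

```python
import numpy as np

# Hypothetical 3-state transition matrix (rows sum to 1).
P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.5, 0.2],
              [0.2, 0.4, 0.4]])

pi = np.full(3, 1.0 / 3.0)
for _ in range(500):          # power iteration: pi <- pi P converges to pi P = pi
    pi = pi @ P

# Perturb one row slightly; the stationary distribution moves only slightly,
# which is the kind of stability the paper's bounds quantify.
P2 = P.copy()
P2[0] = [0.68, 0.22, 0.10]
pi2 = np.full(3, 1.0 / 3.0)
for _ in range(500):
    pi2 = pi2 @ P2

print(np.round(pi, 3), np.round(np.abs(pi2 - pi).max(), 4))
```

The paper's setting is the joint stationary vector across s coupled categorical sequences, but the perturbation question is the same: how far can the stationary vector move when the transition structure is slightly changed.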
