Similar Articles
20 similar articles found
1.
To identify robust settings of the control factors, it is very important to understand how they interact with the noise factors. In this article, we propose space-filling designs for computer experiments that are more capable of accurately estimating the control-by-noise interactions. Moreover, existing space-filling designs focus on uniformly distributing the points in the design space, which is not suitable for noise factors because they usually follow nonuniform distributions such as the normal distribution. This would suggest placing more points in the regions with high probability mass. However, noise factors also tend to have a smooth relationship with the response, and therefore placing more points toward the tails of the distribution is also useful for accurately estimating that relationship. These two opposing effects make the experimental design methodology a challenging problem. We propose optimal and computationally efficient solutions to this problem and demonstrate their advantages using simulated examples and a real industry example involving a manufacturing packing line. Supplementary materials for the article are available online.
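The tension described above between probability mass and tail coverage can be seen by pushing a one-dimensional space-filling design through the noise distribution's inverse CDF. The sketch below is a generic illustration of that transformation, not the authors' proposed construction; the helper names are hypothetical.

```python
import numpy as np
from statistics import NormalDist

def lhd_1d(n, seed=0):
    """Midpoint Latin hypercube sample of [0, 1]: one point per stratum."""
    rng = np.random.default_rng(seed)
    return rng.permutation((np.arange(n) + 0.5) / n)

# Uniform strata on [0, 1] ...
u = lhd_1d(8)
# ... mapped to a standard-normal noise factor by the inverse CDF.
# Points concentrate near 0 (high probability mass) and thin out in
# the tails, which is exactly the trade-off discussed above.
z = np.array([NormalDist().inv_cdf(p) for p in u])
```

Equal-probability strata give dense coverage near the mode and sparse coverage in the tails; the designs proposed in the article aim to balance this against the need for tail points to estimate a smooth response.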

2.
Multivariate testing is a popular method to improve websites, mobile apps, and email campaigns. A unique aspect of testing in the online space is that it needs to be conducted across multiple platforms, such as a desktop and a smartphone. The existing experimental design literature does not offer precise guidance for such a multi-platform context. In this article, we introduce a multi-platform design framework that allows us to measure the effect of the design factors for each platform and the interaction effect of the design factors with platforms. Substantively, the resulting designs are of great importance for testing digital campaigns across platforms. We illustrate this in an empirical email application to maximize engagement for a digital magazine. We introduce a novel “sliced effect hierarchy principle” and develop design criteria to generate factorial designs for multi-platform experiments. To help construct such designs, we prove a theorem that connects the proposed designs to the well-known minimum aberration designs. We find that experimental versions made for one platform should be similar to those made for other platforms. From the standpoint of real-world application, such homogeneous subdesigns are cheaper to implement. To assist practitioners, we provide an algorithm to construct the designs that we propose.

3.
Sequential experiments composed of initial experiments and follow-up experiments are widely adopted for economical computer emulations. Many kinds of Latin hypercube designs with good space-filling properties have been proposed for designing the initial computer experiments. However, little work based on Latin hypercubes has focused on the design of the follow-up experiments. Although some constructions of nested Latin hypercube designs can be adapted to sequential designs, the size of the follow-up experiments needs to be a multiple of that of the initial experiments. In this article, a general method for constructing sequential designs of flexible size is proposed, which allows the combined designs to have good one-dimensional space-filling properties. Moreover, the sampling properties and a type of central limit theorem are derived for these designs. Several improvements of these designs are made to achieve better space-filling properties. Simulations are carried out to verify the theoretical results. Supplementary materials for this article are available online.

4.
Foldover is a commonly used follow-up strategy in experimental design. All existing foldover designs have been constructed by reversing the signs of columns of the initial design. We propose a new methodology that also allows the permutation of columns in the foldover. Focusing on resolution IV designs, we show that almost all of the resulting designs are better than existing results with respect to the minimum aberration criterion. While augmenting a design by a foldover with column permutations may result in a nonregular combined design, the proposed designs all have a resolution of 4.5 or higher, for which no two-factor interaction is fully aliased with any other two-factor interaction.
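A minimal sketch of the idea, assuming a standard +1/−1 coding: the classical fold-over reverses the signs of every column, while the generalization described above permutes columns before reversing signs. The `foldover` helper and the small base design are illustrative only.

```python
import numpy as np

def foldover(design, perm=None):
    """Augment a two-level (+1/-1) design with its fold-over: signs of
    all columns reversed and, optionally, columns permuted first.
    `perm` is a list of column indices; identity when None."""
    d = np.asarray(design)
    if perm is None:
        perm = list(range(d.shape[1]))
    return np.vstack([d, -d[:, perm]])

# A 4-run, 3-factor regular design (defining relation C = AB).
base = np.array([[-1, -1,  1],
                 [ 1, -1, -1],
                 [-1,  1, -1],
                 [ 1,  1,  1]])

classic = foldover(base)                   # sign reversal only
permuted = foldover(base, perm=[1, 0, 2])  # swap columns A and B first
```

Different permutations yield different combined designs, which is what opens up the larger search space over which the paper optimizes aberration.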

5.
The benefits of sequential design of experiments have long been described for both model-based and space-filling designs. However, in our experience, too few practitioners take advantage of the opportunity afforded by this approach to maximize the learning from their experimentation. By obtaining data sequentially, it is possible to learn from the early stages to inform subsequent data collection, minimize wasted resources, and provide answers for a series of objectives for the overall experiment. This paper provides methods and algorithms to create augmented distance-based space-filling designs, using both uniform and non-uniform space-filling strategies, that can be constructed at each stage based on information learned in earlier stages. We illustrate the methods with several examples that involve different initial data, types of space-filling designs, and experimental goals.

6.
Robust parameter designs are widely used to produce products/processes that perform consistently well across various conditions known as noise factors. Recently, the robust parameter design method has been implemented in computer experiments. The conventional product array structure becomes unsuitable due to its extensive number of runs and its reliance on polynomial modeling. In this article, we propose a new framework, robust parameter design via stochastic approximation (RPD-SA), to efficiently optimize the robust parameter design criteria. It can be applied to general robust parameter design problems, but is particularly powerful in the context of computer experiments. It has the following four advantages: (1) fast convergence to the optimal product setting with fewer function evaluations; (2) incorporation of high-order effects of both design and noise factors; (3) adaptation to a constrained, irregular region of operability; (4) no requirement of a statistical analysis phase. In the numerical studies, we compare RPD-SA to Monte Carlo sampling with Newton–Raphson-type optimization. An “Airfoil” example is used to compare the performance of RPD-SA, conventional product array designs, and space-filling designs with the Gaussian process. The studies show that RPD-SA has preferable performance in terms of effectiveness, efficiency and reliability.
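As a toy illustration of the stochastic-approximation idea (a generic Robbins–Monro scheme, not the authors' RPD-SA algorithm), one can minimize an expected loss over a noise factor by drawing a single noise realization per iteration and stepping against a finite-difference gradient estimate. The response function below is hypothetical.

```python
import numpy as np

def robust_loss(x, z):
    """Toy response: control factor x interacts with noise z.
    The robust optimum of E_z[(x*z + x**2 - 1)**2] shifts away from
    the deterministic (z = 0) optimum at x = 1."""
    return (x * z + x**2 - 1.0) ** 2

def sa_minimize(x0, steps=2000, seed=0):
    """Robbins-Monro stochastic approximation: at each step draw one
    noise realization and move against a finite-difference gradient
    estimate of the sampled loss (common random numbers for both
    evaluations)."""
    rng = np.random.default_rng(seed)
    x, h = x0, 1e-3
    for k in range(1, steps + 1):
        z = rng.normal(0.0, 0.5)                  # one noise draw
        g = (robust_loss(x + h, z) - robust_loss(x - h, z)) / (2 * h)
        x -= (0.1 / k**0.6) * g                   # decaying step size
    return x

x_star = sa_minimize(x0=0.5)
```

Because the noise enters through an interaction with x, the minimizer of the expected loss sits below the deterministic optimum at x = 1, which is the essence of robust parameter design: no separate modeling phase is needed, only repeated function evaluations.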

7.
Sliced Latin hypercube designs (SLHDs) have important applications in designing computer experiments with continuous and categorical factors. However, a randomly generated SLHD can be poor in terms of space-filling, and under the existing construction method, which generates the SLHD column by column using sliced permutation matrices, it is also difficult to search for the optimal SLHD. In this article, we develop a new construction approach that first generates a small Latin hypercube design in each slice and then arranges them together to form the SLHD. The new approach is intuitive and can be easily adapted to generate orthogonal SLHDs and orthogonal array-based SLHDs. More importantly, it enables us to develop general algorithms that can search for the optimal SLHD efficiently.
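The sliced structure can be sketched per column as follows: split the n = m·t fine levels into m coarse blocks of t consecutive levels and give each slice exactly one level from each block, so every slice collapses to an m-run Latin hypercube while the stacked design is an n-run Latin hypercube. This is a simplified sketch consistent with the description above, not the paper's exact algorithm.

```python
import numpy as np

def sliced_lhd(m, t, dim, seed=0):
    """Sliced Latin hypercube: t slices of m runs in `dim` dimensions.
    Per column, the n = m*t levels are split into m coarse blocks of t
    consecutive levels; each slice receives exactly one level from each
    block, so every slice is an m-run LHD and the stacked design is an
    n-run LHD."""
    rng = np.random.default_rng(seed)
    n = m * t
    slices = np.empty((t, m, dim))
    for d in range(dim):
        block = np.arange(n).reshape(m, t)         # m blocks of t levels
        for i in range(m):
            rng.shuffle(block[i])                  # assign levels to slices
        for j in range(t):
            levels = rng.permutation(block[:, j])  # slice j's run order
            slices[j, :, d] = (levels + 0.5) / n   # stratum midpoints
    return slices

S = sliced_lhd(m=4, t=3, dim=2)
full = S.reshape(-1, 2)  # 12-run combined design
```

Optimizing space-filling then amounts to searching over the within-block assignments and run orders, which is what the paper's algorithms do efficiently.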

8.
In industrial experiments, restrictions on the execution of the experimental runs or the existence of one or more hard‐to‐change factors often leads to split‐plot experiments, where there are two types of experimental units and two independent randomizations. The resulting compound symmetric error structure, as well as the settings of whole‐plot and subplot factors, play important roles in the performance of split‐plot experiments. When the practitioner is interested in predicting the response, a response surface design for a second‐order model such as a central composite design (CCD) is often used. The prediction variance of second‐order designs under a split‐plot error structure is often of interest. In this paper, fraction of design space (FDS) plots are adapted to split‐plot designs. In addition to the global curve exploring the entire design space, sliced curves at various whole‐plot levels are presented to study prediction performance for subregions in the design space. The different sizes of the constrained subregions are accounted for by the proportional size of the sliced curves. The construction and use of the FDS plots are demonstrated through two examples of the restricted CCD in split‐plot schemes. We also consider the impact of the variance ratio on design performance. Copyright © 2006 John Wiley & Sons, Ltd.

9.
Computer experiments have received a great deal of attention in many fields of science and technology. Most literature assumes that all the input variables are quantitative. However, researchers often encounter computer experiments involving both qualitative and quantitative variables (BQQV). In this article, a new interface between design and analysis for computer experiments with BQQV is proposed. The new designs are a kind of sliced Latin hypercube design with points clustered in the design region, possessing good uniformity for each slice. For computer experiments with BQQV, such designs help to measure the similarities among responses of different level-combinations in the qualitative variables. An adaptive analysis strategy intended for the proposed designs is developed. The proposed strategy allows us to automatically extract information from useful auxiliary responses to increase the precision of prediction for the target response. The interface between the proposed design and the analysis strategy is demonstrated to be effective via simulation and a real-life example from the food engineering literature. Supplementary materials for this article are available online.

10.
Conventional space-filling experimental design provides uniform coverage of a hypercube design space. When constraints are imposed, the results may contain many infeasible points. Simply omitting these points leads to fewer feasible points than desired and a design of experiments that is not optimally distributed. In this research, an adaptive method is developed to create space-filling points in arbitrarily constrained spaces. First, a design space reconstruction method is developed to reduce the invalid exploration space and enhance the efficiency of experimental designs. Then, a synthetic criterion of uniformity and feasibility is proposed and optimized by the enhanced stochastic evolutionary method to obtain the initial sampling combination. Finally, an adaptive adjustment strategy of design levels is constructed to obtain the required number of feasible points. Various test cases with convex and non-convex, connected and non-connected design spaces are implemented to verify the efficacy of the proposed method.
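A simple non-adaptive baseline for the problem described above is to filter a large candidate pool through the constraint and then greedily build a maximin design from the feasible candidates. This sketch is for contrast with the adaptive method, not a reproduction of it; the constraint function is hypothetical.

```python
import numpy as np

def constrained_maximin(n, feasible, n_cand=2000, seed=0):
    """Greedy maximin design inside an arbitrary constraint: sample a
    large candidate pool in the unit square, keep only feasible points,
    then repeatedly add the candidate farthest (in squared Euclidean
    distance) from the current design."""
    rng = np.random.default_rng(seed)
    cand = rng.uniform(size=(n_cand, 2))
    cand = cand[np.array([feasible(p) for p in cand])]
    design = [cand[0]]
    for _ in range(n - 1):
        d2 = np.min(((cand[:, None, :] - np.array(design)) ** 2).sum(-1), axis=1)
        design.append(cand[np.argmax(d2)])
    return np.array(design)

# Non-convex feasible region: unit square minus a central disk.
ring = lambda p: (p[0] - 0.5) ** 2 + (p[1] - 0.5) ** 2 > 0.1
X = constrained_maximin(20, ring)
```

Because every selected point comes from the feasible pool, the design never needs post hoc deletion of infeasible runs; the cost is that coverage quality depends on the candidate density, which is the inefficiency the adaptive reconstruction method targets.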

11.
We propose an approach for constructing a new type of design, called a sliced orthogonal array-based Latin hypercube design. This approach exploits a slicing structure of orthogonal arrays with strength two and makes use of sliced random permutations. Such a design achieves one- and two-dimensional uniformity and can be divided into smaller Latin hypercube designs with one-dimensional uniformity. Sampling properties of the proposed designs are derived. Examples are given for illustrating the construction method and corroborating the derived theoretical results. Potential applications of the constructed designs include uncertainty quantification of computer models, computer models with qualitative and quantitative factors, cross-validation and efficient allocation of computing resources. Supplementary materials for this article are available online.

12.
Improving the quality of a product/process using a computer simulator is a much less expensive option than real physical testing. However, simulation using computationally intensive computer models can be time consuming and, therefore, optimizing directly on the computer simulator can be infeasible. Experimental design and statistical modeling techniques can be used to overcome this problem. This article reviews experimental designs known as space-filling designs that are suitable for computer simulations. In the article, special emphasis is given to a recently developed space-filling design called the maximum projection design. Its advantages are illustrated using a simulation conducted for optimizing a milling process.

13.
Classical D‐optimal design is used to create experimental designs for situations in which an underlying system model is known or assumed known. The D‐optimal strategy can also be used to add additional experimental runs to an existing design. This paper demonstrates a study of variable choices related to sequential D‐optimal design and how those choices influence the D‐efficiency of the resulting complete design. The variables studied are total sample size, initial experimental design size, step size, whether or not to include center points in the initial design, and complexity of the initial model assumption. The results indicate that increasing total sample size improves the D‐efficiency of the design, that less effort should be placed in the initial design, especially when the true underlying system model is not known, and that it is better to start by assuming a simpler model form rather than a complex one, provided that the experimenter can reach the true model form during the sequential experiments. Copyright © 2013 John Wiley & Sons, Ltd.
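D-efficiency itself is straightforward to compute for a given model matrix; a common convention is 100·|X′X|^(1/p)/n, under which an orthogonal two-level design scores 100 for a main-effects model. A minimal sketch, with an illustrative design:

```python
import numpy as np

def d_efficiency(X):
    """D-efficiency of a model matrix X (n runs, p terms), on the usual
    0-100 scale: 100 * |X'X|^(1/p) / n.  An orthogonal two-level design
    scores 100 for a main-effects model."""
    n, p = X.shape
    return 100.0 * np.linalg.det(X.T @ X) ** (1.0 / p) / n

# Intercept + two main effects for a 2^2 full factorial...
full = np.array([[1, -1, -1],
                 [1,  1, -1],
                 [1, -1,  1],
                 [1,  1,  1]])
# ...versus the same model fit to only the first three runs.
eff_full = d_efficiency(full)   # orthogonal design: 100
eff_frac = d_efficiency(full[:3])
```

Sequential D-optimal augmentation adds, at each step, the candidate run that most increases |X′X|, which is why the paper can trade off initial design size against later augmentation.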

14.
Traditional space-filling designs are a convenient way to explore throughout an input space of flexible dimension and have design points close to any region where future predictions might be of interest. In some applications, there may be a model connecting the input factors to the response(s), which provides an opportunity to consider the spacing not only in the input space but also in the response space. In this paper, we present an approach for leveraging current understanding of the relationship between inputs and responses to generate designs that allow the experimenter to flexibly balance the spacing in these two regions to find an appropriate design for the experimental goals. Applications where good spacing of the observed response values is desirable include calibration problems where the goal is to demonstrate the adequacy of the model across the range of the responses, sensitivity studies where the outputs from a submodel may be used as inputs for subsequent models, and inverse problems where the outputs of a process will be used in the inverse prediction for the unknown inputs. We use the multi-objective optimization method of Pareto fronts to generate multiple non-dominated designs with different emphases on the input and response space-filling criteria from which the experimenter can choose. The methods are illustrated through several examples and a chemical engineering case study.
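The Pareto-front machinery mentioned above reduces to non-dominated filtering of candidate designs scored on the two criteria. The sketch below uses hypothetical scores, with larger taken to be better in both coordinates.

```python
import numpy as np

def pareto_front(scores):
    """Indices of non-dominated rows, where larger is better in every
    column (here: input-space and response-space space-filling scores).
    A row is dominated if some other row is at least as good everywhere
    and strictly better somewhere."""
    keep = []
    for i, s in enumerate(scores):
        dominated = any((t >= s).all() and (t > s).any() for t in scores)
        if not dominated:
            keep.append(i)
    return keep

# Hypothetical (input-spacing, response-spacing) scores for 5 candidate designs.
scores = np.array([[0.9, 0.2],
                   [0.7, 0.6],
                   [0.4, 0.9],
                   [0.3, 0.3],   # dominated by (0.7, 0.6)
                   [0.6, 0.5]])  # dominated by (0.7, 0.6)
front = pareto_front(scores)
```

The experimenter then chooses among the surviving designs according to how much weight the study places on input versus response spacing.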

15.
We propose ‘low‐cost response surface methods’ (LCRSMs) that typically require half the experimental runs of standard response surface methods based on central composite and Box–Behnken designs, but yield comparable or lower modeling errors under realistic assumptions. In addition, the LCRSMs have substantially lower modeling errors and greater expected savings compared with alternatives with comparable numbers of runs, including small composite designs and computer‐generated designs based on popular criteria such as D‐optimality. The LCRSM procedures appear to be the first experimental design methods derived as the solution to a simulation optimization problem. Together with modern computers, simulation optimization offers unprecedented opportunities for applying clear, realistic multicriterion objectives and assumptions to produce useful experimental design methods. We compare the proposed LCRSMs with alternatives based on six criteria. We conclude that the proposed methods offer attractive alternatives when the experimenter is considering dropping factors to use standard response surface methods or would like to perform relatively few runs and stop with a second‐order model. Copyright © 2002 John Wiley & Sons, Ltd.

16.
This paper considers an experimentation strategy when resource constraints permit only a single design replicate per time interval and one or more design variables are hard to change. The experimental designs considered are two‐level full‐factorial or fractional‐factorial designs run as balanced split plots. These designs are common in practice and appropriate for fitting a main‐effects‐plus‐interactions model, while minimizing the number of times the whole‐plot treatment combination is changed. Depending on the postulated model, single replicates of these designs can result in the inability to estimate error at the whole‐plot level, suggesting that formal statistical hypothesis testing on the whole‐plot effects is not possible. We refer to these designs as balanced two‐level whole‐plot saturated split‐plot designs. In this paper, we show that, for these designs, it is appropriate to use ordinary least squares to analyze the subplot factor effects at the ‘intermittent’ stage of the experiments (i.e., after a single design replicate is run); however, formal inference on the whole‐plot effects may or may not be possible at this point. We exploit the sensitivity of ordinary least squares in detecting whole‐plot effects in a split‐plot design and propose a data‐based strategy for determining whether to run an additional replicate following the intermittent analysis or whether to simply reduce the model at the whole‐plot level to facilitate testing. The performance of the proposed strategy is assessed using Monte Carlo simulation. The method is then illustrated using wind tunnel test data obtained from a NASCAR Winston Cup Chevrolet Monte Carlo stock car. Copyright © 2012 John Wiley & Sons, Ltd.

17.
When fitting complex models, such as finite element or discrete event simulations, the experiment design should exhibit desirable properties of both projectivity and orthogonality. To reduce experimental effort, sequential design strategies allow experimenters to collect data only until some measure of prediction precision is reached. In this article, we present a batch sequential experiment design method that uses sliced full factorial-based Latin hypercube designs (sFFLHDs), which extend the concept of sliced orthogonal array-based Latin hypercube designs (OALHDs). At all stages of the sequential design, good univariate stratification is achieved. The structure of the sFFLHDs also tends to produce uniformity in higher dimensions, especially at certain stages of the design. We show that our batch sequential design approach has good sampling and fitting qualities through both empirical studies and theoretical arguments. Supplementary materials are available online.

18.
Response surface methodologies can reveal important features of complex computer code models. Here, we suggest experimental designs and interpolation methods for extracting nonlinear response surfaces whose roughness varies substantially over the input domain. A sequential design algorithm for cuboid domains is initiated by selecting an extended corner/centre point design for the entire domain, then updated by decomposing this domain into disjoint cuboids and taking the corners and centre of these cuboids as new design points. A roughness criterion is used to control the domain decomposition so that the design becomes space-filling and the coverage is particularly good in the parts of the input domain where the response surface is strongly nonlinear. Finally, the model output at untried inputs is predicted by carefully selecting a local neighbourhood of each new point in the input space and fitting a full quadratic polynomial to the data points in that neighbourhood. Test runs showed that our sequential design algorithm automatically adapts to the nonlinear features of the model output. Moreover, our technique is particularly useful for extracting nonlinear response surfaces from computer code models with two to seven input variables. A simple modification of the outlined algorithm enables adequate handling of non-cuboid input domains.

19.
The construction of decision-theoretical Bayesian designs for realistically complex nonlinear models is computationally challenging, as it requires the optimization of analytically intractable expected utility functions over high-dimensional design spaces. We provide the most general solution to date for this problem through a novel approximate coordinate exchange algorithm. This methodology uses a Gaussian process emulator to approximate the expected utility as a function of a single design coordinate in a series of conditional optimization steps. It has flexibility to address problems for any choice of utility function and for a wide range of statistical models with different numbers of variables, numbers of runs and randomization restrictions. In contrast to existing approaches to Bayesian design, the method can find multi-variable designs in large numbers of runs without resorting to asymptotic approximations to the posterior distribution or expected utility. The methodology is demonstrated on a variety of challenging examples of practical importance, including design for pharmacokinetic models and design for mixed models with discrete data. For many of these models, Bayesian designs are not currently available. Comparisons are made to results from the literature, and to designs obtained from asymptotic approximations. Supplementary materials for this article are available online.

20.
A challenge in engineering design is to choose suitable objectives and constraints from many quantities of interest, while ensuring an optimization is both meaningful and computationally tractable. We propose an optimization formulation that can take account of more quantities of interest than existing formulations, without reducing the tractability of the problem. This formulation searches for designs that are optimal with respect to a binary relation within the set of designs that are optimal with respect to another binary relation. We then propose a method of finding such designs in a single optimization by defining an overall ranking function to use in optimizers, reducing the cost required to solve this formulation. In a design under uncertainty problem, our method obtains the most robust design that is not stochastically dominated faster than a multiobjective optimization. In a car suspension design problem, our method obtains designs that are superior, according to a k-optimality condition, to those from previously suggested multiobjective approaches. In an airfoil design problem, our method obtains designs closer to the true lift/drag Pareto front using the same computational budget as a multiobjective optimization.
