Similar Documents
20 similar documents found; search time: 703 ms
1.
Mixed‐level designs are employed when factors with different numbers of levels are involved. Practitioners use mixed‐level fractional factorial designs because the total number of runs of the full factorial increases rapidly as the number of factors and/or the number of factor levels increases. One important decision is which fractional design to choose. A new criterion, the general balance metric (GBM), is proposed to evaluate and compare mixed‐level fractional factorial designs. The GBM measures the degree of balance for both main effects and interaction effects. This criterion is tied to, and dominates, orthogonality criteria as well as traditional minimum aberration criteria. Furthermore, the proposal is easy to use and has practical interpretations. As part of the GBM, the concept of resolution is generalized and the confounding structure of mixed‐level fractional factorial designs is also revealed. Moreover, the metric can also be used for the purpose of design augmentation. Examples are provided to compare this approach with existing criteria. Copyright © 2008 John Wiley & Sons, Ltd.
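As a rough illustration of the "degree of balance" idea behind a metric like the GBM, the sketch below scores a mixed‐level design by how far the counts of level combinations deviate from perfect balance, for a single factor (main effect) or a factor pair (interaction). This is a hypothetical measure for illustration only, not the paper's actual GBM formula.

```python
# Hypothetical balance measure for a mixed-level design: sum of squared
# deviations of level-combination counts from the ideal equal count.
# This is NOT the paper's GBM definition, just a sketch of the idea.
from itertools import product
from collections import Counter

def imbalance(design, cols):
    """Imbalance of the projection of `design` onto the factors in `cols`:
    0 means every level combination occurs equally often."""
    counts = Counter(tuple(run[c] for c in cols) for run in design)
    levels = [sorted({run[c] for run in design}) for c in cols]
    n_combos = 1
    for lv in levels:
        n_combos *= len(lv)
    ideal = len(design) / n_combos
    # include level combinations that never occur (count 0)
    return sum((counts.get(combo, 0) - ideal) ** 2
               for combo in product(*levels))

# A full 2x3 factorial is perfectly balanced for mains and the interaction.
full = [(a, b) for a in (0, 1) for b in (0, 1, 2)]
print(imbalance(full, (0,)))     # main effect of factor 0 -> 0.0
print(imbalance(full, (0, 1)))   # two-factor combinations -> 0.0
```

Dropping runs from the full factorial makes the interaction projection unbalanced, so the measure becomes positive, which is the sense in which a fraction can be compared against another.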

2.
Screening experiments are typically used when attempting to identify a few active factors in a larger pool of potentially significant factors. In general, two‐level regular factorial designs are used, but Plackett–Burman (PB) designs provide a useful alternative. Although PB designs are run‐efficient, they confound the main effects with fractions of strings of two‐factor interactions, making the analysis difficult. However, recent discoveries regarding the projective properties of PB designs suggest that if only a few factors are active, the original design can be reduced to a full factorial, with additional trials frequently forming attractive patterns. In this paper, we show that there is a close relationship between the partial confounding in certain PB designs and their projective properties. With the aid of examples, we demonstrate how this relationship may help experimenters better appreciate the use of PB designs. Copyright © 2006 John Wiley & Sons, Ltd.
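The projective property mentioned above can be checked directly: the 12-run PB design, built from the standard cyclic generator row plus a final all-minus row, projects onto any 3 of its 11 factors as a complete 2^3 factorial (with some replicated points). A minimal sketch:

```python
# The 12-run Plackett-Burman design, projected onto any 3 factors,
# contains a full 2^3 factorial (projectivity 3).
from itertools import combinations

gen = [+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1]  # standard PB12 first row
rows = [gen[i:] + gen[:i] for i in range(11)]        # 11 cyclic shifts
rows.append([-1] * 11)                               # final all-minus row

for cols in combinations(range(11), 3):
    combos = {tuple(r[c] for c in cols) for r in rows}
    assert len(combos) == 8   # all 2^3 sign combinations present
print("every 3-factor projection of PB12 is a full 2^3 factorial")
```

If only three factors turn out to be active, this is exactly why the original 12 runs can be re-read as a replicated-in-part full factorial in those factors.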

3.
Most two‐level fractional factorial designs used in practice involve independent or fully confounded effects (so‐called regular designs). For example, for 16 runs and 6 factors, the classical resolution IV design with defining relation I = ABCE = BCDF = ADEF has become the de facto gold standard. Recent work has indicated that non‐regular orthogonal designs could be preferable in some circumstances. What seems to inhibit wider usage of these non‐regular designs is a combination of inertia/status quo and perhaps a general resistance to, and suspicion of, designs that are computer generated to achieve 'XYZ' optimality. In this paper, each of the orthogonal non‐isomorphic two‐level, 16‐run designs with 6, 7, or 8 factors (both regular and non‐regular) is shown to have a classical‐type construction with a 2^4 or a replicated 2^3 starting point. Additional factor columns are defined either using the familiar one‐term column generators or generators using weighted sums of effects. Copyright © 2010 John Wiley & Sons, Ltd.
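The "gold standard" design above is easy to construct from its one-term column generators: starting from a full 2^4 in A–D, set E = ABC and F = BCD, which yields the defining relation I = ABCE = BCDF = ADEF. A minimal sketch:

```python
# The classical 16-run resolution IV 2^(6-2) design with
# I = ABCE = BCDF = ADEF, built from generators E = ABC, F = BCD.
from itertools import product

runs = []
for a, b, c, d in product((-1, 1), repeat=4):
    e = a * b * c          # generator word ABCE = I
    f = b * c * d          # generator word BCDF = I
    runs.append((a, b, c, d, e, f))

# All six factor columns are pairwise orthogonal, and the implied
# third word ADEF = I follows from multiplying the two generators.
for i in range(6):
    for j in range(i + 1, 6):
        assert sum(r[i] * r[j] for r in runs) == 0
assert all(r[0] * r[3] * r[4] * r[5] == 1 for r in runs)   # ADEF = I
print("16 runs; defining relation I = ABCE = BCDF = ADEF verified")
```

The non-regular 16-run alternatives discussed in the paper replace these one-term generators with weighted sums of effects; that construction is not reproduced here.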

4.
Nonregular designs are a preferable alternative to regular resolution IV designs because they avoid confounding two-factor interactions. As a result, nonregular designs can estimate and identify a few active two-factor interactions. However, due to the sometimes complex alias structure of nonregular designs, standard screening strategies can fail to identify all active effects. In this paper, we explore a specific no-confounding six-factor 16-run nonregular design with orthogonal main effects. By utilizing our knowledge of the alias structure, we can inform the model selection process. Our aliased informed model selection (AIMS) strategy is a design-specific approach that we compare to three generic model selection methods: stepwise regression, Lasso, and the Dantzig selector. The AIMS approach substantially increases the power to detect active main effects and two-factor interactions versus the aforementioned generic methodologies.

5.
This paper considers an experimentation strategy when resource constraints permit only a single design replicate per time interval and one or more design variables are hard to change. The experimental designs considered are two‐level full‐factorial or fractional‐factorial designs run as balanced split plots. These designs are common in practice and appropriate for fitting a main‐effects‐plus‐interactions model, while minimizing the number of times the whole‐plot treatment combination is changed. Depending on the postulated model, single replicates of these designs can result in the inability to estimate error at the whole‐plot level, suggesting that formal statistical hypothesis testing on the whole‐plot effects is not possible. We refer to these designs as balanced two‐level whole‐plot saturated split‐plot designs. In this paper, we show that, for these designs, it is appropriate to use ordinary least squares to analyze the subplot factor effects at the ‘intermittent’ stage of the experiments (i.e., after a single design replicate is run); however, formal inference on the whole‐plot effects may or may not be possible at this point. We exploit the sensitivity of ordinary least squares in detecting whole‐plot effects in a split‐plot design and propose a data‐based strategy for determining whether to run an additional replicate following the intermittent analysis or whether to simply reduce the model at the whole‐plot level to facilitate testing. The performance of the proposed strategy is assessed using Monte Carlo simulation. The method is then illustrated using wind tunnel test data obtained from a NASCAR Winston Cup Chevrolet Monte Carlo stock car. Copyright © 2012 John Wiley & Sons, Ltd.

6.
In many experimental situations, practitioners are confronted with costly, time consuming, or hard‐to‐change (HTC) factors. These practical or economic restrictions on randomization can be accommodated with a split‐plot design structure that minimizes the manipulation of the HTC factors. Selecting a good design is a challenging task and requires knowledge of the opportunities and restrictions imposed by the experimental apparatus and an evaluation of statistical performance among competing designs. Building on the well‐established evaluation criteria for the completely randomized context, we emphasize the unique qualitative and quantitative evaluation criteria for split‐plot designs. An example from hypersonic propulsion research is used to demonstrate the consideration of multiple design evaluation criteria. Published in 2007 by John Wiley & Sons, Ltd.

7.
Two‐level factorial designs in blocks of size two are useful in a variety of experimental settings, including microarray experiments. Replication is typically used to allow estimation of the relevant effects, but when the number of factors is large this common practice can result in designs with a prohibitively large number of runs. One alternative is to use a design with fewer runs that allows estimation of both main effects and two‐factor interactions. Such designs are available in full factorial experiments, though they may still require a great many runs. In this article, we develop fractional factorial designs in blocks of size two when the number of factors is less than nine, using just half of the runs needed for the designs given by Kerr (J Qual. Tech. 2006; 38:309–318). Two approaches, the orthogonal array approach and the generator approach, are utilized to construct our designs. Analysis of the resulting experimental data from the suggested design is also given. Copyright © 2011 John Wiley & Sons, Ltd.

8.
Second‐order experimental designs are employed when an experimenter wishes to fit a second‐order model to account for response curvature over the region of interest. Partition designs are utilized when the output quality or performance characteristics of a product depend not only on the effect of the factors in the current process, but also on the effects of factors from preceding processes. Standard experimental design methods are often difficult to apply to several sequential processes. We present an approach to building second‐order response models for sequential processes with several design factors and multiple responses. The proposed design expands current experimental designs to incorporate two processes into one partitioned design. Potential advantages include a reduction in the time required to execute the experiment, a decrease in the number of experimental runs, and improved understanding of the process variables and their influence on the responses. Copyright © 2002 John Wiley & Sons, Ltd.

9.
When two-level fractional factorial designs are blocked, the application of the standard definition of resolution requires careful consideration. Sometimes linear contrasts that superficially appear to be estimates of higher-order interaction effects are in reality estimates of first-order effects. Experimenters may therefore inadvertently choose designs that are of lower resolution than intended or unknowingly confound important effects. In this note, I discuss this subtle problem and propose an additional rule to the usual definition of resolution that provides a conservative but more realistic assessment of the resolution. With this more realistic characterization, experimenters are provided with a warning about possible confounding. I also show that my amendment to the definition of resolution may be useful when characterizing designs in which several two-level contrasts are combined to accommodate factors with four or more levels.

10.
The D‐optimality criterion is often used in computer‐generated experimental designs when the response of interest is binary, such as when the attribute of interest can be categorized as pass or fail. The majority of methods in the generation of D‐optimal designs focus on logistic regression as the base model for relating a set of experimental factors with the binary response. Despite the advances in computational algorithms for calculating D‐optimal designs for the logistic regression model, very few have acknowledged the problem of separation, a phenomenon where the responses are perfectly separable by a hyperplane in the design space. Separation causes one or more parameters of the logistic regression model to be inestimable via maximum likelihood estimation. The objective of this paper is to investigate the tendency of computer‐generated, nonsequential D‐optimal designs to yield separation in small‐sample experimental data. Sets of local D‐optimal and Bayesian D‐optimal designs with different run (sample) sizes are generated for several “ground truth” logistic regression models. A Monte Carlo simulation methodology is then used to estimate the probability of separation for each design. Results of the simulation study confirm that separation occurs frequently in small‐sample data and that separation is more likely to occur when the ground truth model has interaction and quadratic terms. Finally, the paper illustrates that different designs with identical run sizes created from the same model can have significantly different chances of encountering separation.

11.
Orthogonal arrays (OAs) are widely used in design of experiments. Each OA has a specific number of rows that is fixed by the number of factors in the OA and the number of levels in each factor. In a practical application of an industrial experiment, however, because of various operational constraints it could happen that the number of runs of the experiment cannot be set exactly equal to the number of rows of an OA. In this case, a lean design can be used. A lean design is obtained by removing some specific rows and columns from the extended design matrix formed from an OA, so that the resulting sub‐matrix still allows efficient estimation of the effects of some of the factors. Tables for 2‐level lean designs are already available in the literature. In this paper, the authors investigate 3‐level lean designs and mixed‐level lean designs, and construct tables for such designs for convenient use. Copyright © 2009 John Wiley & Sons, Ltd.

12.
The sequential design approach to response surface exploration is often viewed as advantageous as it provides the opportunity to learn from each successive experiment with the ultimate goal of determining optimum operating conditions for the system or process under study. Recent literature has explored factor screening and response surface optimization using only one three‐level design to handle situations where conducting multiple experiments is prohibitive. The most straightforward and accessible analysis strategy for such designs is to first perform a main‐effects only analysis to screen important factors before projecting the design onto these factors to conduct response surface exploration. This article proposes the use of optimal designs with minimal aliasing (MA designs) and demonstrates that they are more effective at screening important factors than the existing designs recommended for single‐design response surface exploration. For comparison purposes, we construct 27‐run MA designs with up to 13 factors and demonstrate their utility using established design criterion and a simulation study. Copyright 2011 © John Wiley & Sons, Ltd.

13.
The average prediction variance for an I‐optimal design for a specified normal theory linear model decreases nonlinearly with respect to sample size. In this paper, we develop a prediction equation to explain the relationship between average prediction variance and sample size. We investigate methods for determining what sample size is efficient for a given experiment using the average prediction variance (APV) versus sample size curves. The sample size determination is studied assuming a variety of cost structures for the trials in each experiment. For example, in practice, the length of time before an experiment is complete may be considered an implicit cost of experimentation. We provide results for designs and models based on two to five factors. We also present a potential application of the methods using a military system experiment.
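The APV-versus-sample-size relationship is easy to reproduce in a toy case (this setup is illustrative, not the paper's): for a first-order model fitted on a replicated 2^2 factorial, the unscaled prediction variance at x is f(x)'(X'X)^{-1}f(x), and averaging it over the design region shows the 1/n decay, so doubling the replicates halves the APV.

```python
# Average prediction variance (APV) over [-1,1]^2 for a first-order
# model y = b0 + b1*x1 + b2*x2 on a replicated 2^2 factorial.
import numpy as np

def apv(replicates, grid=41):
    pts = [(a, b) for a in (-1, 1) for b in (-1, 1)] * replicates
    X = np.array([[1.0, a, b] for a, b in pts])
    XtX_inv = np.linalg.inv(X.T @ X)
    g = np.linspace(-1, 1, grid)
    # average of f(x)' (X'X)^{-1} f(x) over a grid on the square
    vals = [np.array([1.0, u, v]) @ XtX_inv @ np.array([1.0, u, v])
            for u in g for v in g]
    return float(np.mean(vals))

print(apv(1), apv(2))   # APV with 2 replicates is half the 1-replicate APV
```

The paper's contribution is a prediction equation for this decay under general I-optimal designs and cost structures, which the sketch does not attempt.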

14.
The output quality or performance characteristics of a product often depend not only on the effect of the factors in the current process but also on the effect of factors from preceding processes. Statistically‐designed experiments provide a systematic approach to study the effects of multiple factors on process performance by offering a structured set of analyses of data collected through a design matrix. One important limitation of experimental design methods is that they have not often been applied to multiple sequential processes. The objective is to create a first‐order experimental design for multiple sequential processes that possess several factors and multiple responses. The first‐order design expands the current experimental designs to incorporate two processes into one partitioned design. The designs are evaluated on the complexity of the alias structure and their orthogonality characteristics. The advantages include a decrease in the number of experimental design runs, a reduction in experiment execution time, and a better understanding of the overall process variables and their influence on each of the responses. Copyright © 2001 John Wiley & Sons, Ltd.

15.
Technometrics, 2013, 55(3): 418–431
This article concerns adaptive experimentation as a means for making improvements in design of engineering systems. A simple method for experimentation, called “adaptive one-factor-at-a-time,” is described. A mathematical model is proposed and theorems are proven concerning the expected value of the improvement provided and the probability that factor effects will be exploited. It is shown that adaptive one-factor-at-a-time provides a large fraction of the potential improvements if experimental error is not large compared with the main effects and that this degree of improvement is more than that provided by resolution III fractional factorial designs if interactions are not small compared with main effects. The theorems also establish that the method exploits two-factor interactions when they are large and exploits main effects if interactions are small. A case study on design of electric-powered aircraft supports these results.
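The adaptive one-factor-at-a-time procedure itself is short: toggle one factor, keep the change if the response improves, revert otherwise, and move to the next factor. The sketch below runs it on an invented two-level response model with main effects and two-factor interactions and no experimental error, in which case the retained setting can never get worse than the start.

```python
# Sketch of adaptive one-factor-at-a-time (aOFAT) on a made-up
# two-level response model with main effects and 2-factor interactions.
# No experimental error is simulated, so the kept response never drops.
import random

random.seed(1)
k = 5
mains = [random.gauss(0, 1) for _ in range(k)]
twofi = {(i, j): random.gauss(0, 0.5)
         for i in range(k) for j in range(i + 1, k)}

def y(x):
    val = sum(mains[i] * x[i] for i in range(k))
    val += sum(c * x[i] * x[j] for (i, j), c in twofi.items())
    return val

x = [random.choice((-1, 1)) for _ in range(k)]
best = y(x)
start = best
for i in range(k):          # toggle one factor at a time
    x[i] = -x[i]
    trial = y(x)
    if trial > best:
        best = trial        # keep the change
    else:
        x[i] = -x[i]        # revert it
print(start, best)          # best >= start when there is no noise
```

The article's theorems quantify how much of the potential improvement this k+1-run procedure captures on average once experimental error and interaction magnitudes are brought into the model.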

16.
We consider an all‐subsets regression method for models under effect heredity restrictions for experimental designs with complex aliasing, whose number of potential main effects and two‐factor interactions exceeds the number of runs. In this paper, we present an algorithm that systematically attempts to fit all such models. We illustrate the algorithm with two published experiments. Copyright © 2009 John Wiley & Sons, Ltd.
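The key combinatorial step in such an all-subsets search is enumerating only the models that obey heredity. The sketch below enumerates candidate models under strong effect heredity (a two-factor interaction may enter only if both parent main effects are present); it is a generic illustration of the restriction, not the paper's algorithm.

```python
# Enumerate all candidate models under strong effect heredity:
# an interaction (i, j) is allowed only if mains i and j are both in.
from itertools import combinations

def heredity_models(k):
    models = []
    factors = range(k)
    for m in range(k + 1):
        for mains in combinations(factors, m):
            pairs = list(combinations(mains, 2))   # eligible interactions
            for p in range(len(pairs) + 1):
                for ints in combinations(pairs, p):
                    models.append((mains, ints))
    return models

print(len(heredity_models(3)))   # 18 models for 3 factors
```

Even with the heredity restriction the model count grows quickly with the number of factors, which is why a systematic fitting algorithm, rather than brute force over all subsets of effects, is the subject of the paper.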

17.
Foldover is a commonly used follow-up strategy in experimental designs. All existing foldover designs were constructed by reversing the sign of columns of the initial design. We propose a new methodology that also allows the permutation of columns in the foldover. Focusing on resolution IV designs, we show that almost all of the resulting designs improve on existing results with respect to the minimum aberration criterion. While augmenting a design by a foldover with column permutations may result in a nonregular combined design, the proposed designs all have a resolution of 4.5 or higher, for which no two-factor interaction is fully aliased with any other two-factor interaction.
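The classical sign-reversing foldover that this paper generalizes can be sketched in a few lines: appending the fully sign-reversed copy of a resolution III design makes every odd-order word (such as ABC below) sum to zero in the combined design, de-aliasing main effects from two-factor interactions.

```python
# Classical full foldover of the 2^(3-1) design with C = AB:
# the combined 8-run design breaks the C = AB aliasing.
from itertools import product

init = [(a, b, a * b) for a, b in product((-1, 1), repeat=2)]  # C = AB
fold = [tuple(-v for v in run) for run in init]                # all signs reversed
combined = init + fold

# In the initial fraction A*B*C = +1 on every run, i.e. C is aliased
# with AB; in the combined design the word ABC sums to zero.
assert all(a * b * c == 1 for a, b, c in init)
assert sum(a * b * c for a, b, c in combined) == 0
print("foldover de-aliases main effects from two-factor interactions")
```

The paper's contribution is to permute columns before reversing signs, which can push the combined design beyond resolution IV to "resolution 4.5"; that construction is not reproduced in this sketch.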

18.
Comparisons between different designs have traditionally focused on balancing the quality of estimation or prediction relative to the overall size of the design. For split‐plot designs with two levels of randomization, the total number of observations may not accurately summarize the true cost of the experiment, because different costs are likely associated with setting up the whole and subplot levels. In this paper, we present several flexible measures for design assessment based on D‐, G‐ and V‐optimality criteria that take into account potentially different cost structures for the split‐plot designs. The new measures are illustrated with two examples: a 2^3 factorial experiment for first‐order models, where all possible designs are considered, and selective designs for a three‐factor second‐order model. Copyright © 2006 John Wiley & Sons, Ltd.

19.
Alphabetic optimality criteria, such as the D, A, and I criteria, require specifying a model to select optimal designs. They are not model‐free, and the designs obtained by them may not be robust. Recently, many extensions of the D and A criteria have been proposed for selecting robust designs with high estimation efficiency. However, approaches for finding robust designs with high prediction efficiency are rarely studied in the literature. In this paper, we propose a compound criterion and apply the coordinate‐exchange 2‐phase local search algorithm to generate robust designs with high estimation, high prediction, or balanced estimation and prediction efficiency for projective submodels. Examples demonstrate that the designs obtained by our method have better projection efficiency than many existing designs.

20.
Saturated fractional factorial experimental designs and orthogonal main effect plans are extremely valuable tools in quality engineering. However, one problem with these designs is that there are no replicate runs to be used for estimating experimental error. This note develops an estimator of the experimental error based on the hypothesis that not all factor effects will be non-zero. A joint Bayesian prior distribution is presented for the experimental error variance of an effect, σ², and the probability that each effect is non-zero. From this prior distribution a posterior marginal distribution for σ² is derived along with a direct estimate of σ². This method is compared with the traditional methods of estimating σ² in unreplicated designs through a numerical example.
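The note's Bayesian estimator is not reproduced here. As a point of comparison, a widely used frequentist answer to the same problem, estimating error from an unreplicated design under the same "not all effects are non-zero" sparsity assumption, is Lenth's pseudo standard error (PSE), sketched below on made-up effect contrasts.

```python
# Lenth's pseudo standard error (PSE): a sparsity-based estimate of the
# standard error of effect contrasts in an unreplicated design.
# (This is a well-known alternative, NOT the note's Bayesian estimator.)
import statistics

def lenth_pse(contrasts):
    abs_c = [abs(c) for c in contrasts]
    s0 = 1.5 * statistics.median(abs_c)             # initial robust scale
    trimmed = [c for c in abs_c if c < 2.5 * s0]     # drop likely-active effects
    return 1.5 * statistics.median(trimmed)

# Mostly-null contrasts with two clearly active effects (invented data):
effects = [0.2, -0.3, 8.0, 0.1, -0.4, 6.5, 0.25]
print(lenth_pse(effects))   # 0.375
```

Both approaches exploit the same idea: the small contrasts carry information about error, and the few large ones are treated as real effects rather than noise.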
