Similar Articles
1.
This modest paper presents a considerably more concise tabulation of the popular design generators for fractional factorial experiments. The table is indexed by the information the experimenter starts with: the number of factors and the desired resolution for the experiment. Its format makes it easy to explore the effect of changing the experimental design by using a different number of factors or by changing the required resolution.
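
As a rough illustration of the kind of index the paper describes, the sketch below keys standard two-level design generators by the pair (number of factors, desired resolution). The generators shown are the usual textbook choices; the table layout, the look_up helper and its names are hypothetical, not taken from the paper.

```python
# Illustrative index of standard two-level fractional factorial design
# generators, keyed the way the paper suggests: by the number of
# factors and the desired resolution.  The generators listed are the
# standard textbook ones; the table itself is a hypothetical sketch.

GENERATORS = {
    # (factors, resolution): (runs, design generators)
    (4, "IV"):  (8,  ["D=ABC"]),
    (5, "III"): (8,  ["D=AB", "E=AC"]),
    (5, "V"):   (16, ["E=ABCD"]),
    (6, "IV"):  (16, ["E=ABC", "F=BCD"]),
    (7, "III"): (8,  ["D=AB", "E=AC", "F=BC", "G=ABC"]),
    (7, "IV"):  (16, ["E=ABC", "F=BCD", "G=ACD"]),
}

def look_up(n_factors, resolution):
    """Return (runs, generators) for the requested design, or None."""
    return GENERATORS.get((n_factors, resolution))

if __name__ == "__main__":
    runs, gens = look_up(7, "IV")
    print(f"2^(7-3) resolution IV design: {runs} runs, generators {gens}")
```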

8.
Design of experiments is a quality technology for achieving product excellence, that is, high quality at low cost. It is a tool to optimize product and process designs, accelerate the development cycle, reduce development costs, improve the transition of products from R & D to manufacturing and troubleshoot manufacturing problems effectively. It has been used successfully, but sporadically, in the United States. More recently, it has been identified as a major technological reason for Japan's success in producing high-quality products at low cost. In the United States, the need for increased competitiveness and the emphasis on quality improvement demand widespread use of design of experiments by engineers, scientists and quality professionals. In the past, such widespread use has been hampered by a lack of proper training and a lack of tools to implement design of experiments easily in industry. Three steps are essential, and are being taken, to change this situation dramatically. First, simple graphical methods to design and analyse experiments need to be developed, particularly for when the necessary microcomputer resources are not available. Secondly, engineers, scientists and quality professionals must have access to microcomputer-based software for the design and analysis of experiments. Availability of such software allows users to concentrate on the important scientific and engineering aspects of the problem by computerizing the necessary statistical expertise. Finally, since a majority of the current workforce is expected to be working in the year 2000, a massive training effort, based upon simple graphical methods and appropriate computer software, is necessary. The purpose of this paper is to describe a methodology, based upon a new graphical method called interaction graphs together with other previously known techniques, to simplify the correct design of practically important fractional factorial experiments. The essential problem in designing a fractional factorial experiment is first stated. The interaction graph for a 16-trial fractional factorial design is given to illustrate how the graphical procedure can be used to design a two-level fractional factorial experiment. Other previously known techniques are described for easily modifying two-level fractional factorial designs to create mixed multi-level designs. Interaction graphs for other practically useful fractional factorial designs are provided. A computer package called CADE (computer aided design of experiments), which automatically generates appropriate fractional factorial designs from user specifications of factors, levels and interactions and conducts complete analyses of the designed experiments, is briefly described. Finally, the graphical method is compared with other available methods for designing fractional factorial experiments.
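
To make the design-generation step concrete, here is a minimal sketch (in Python, not the CADE package) of how a 16-trial two-level fractional factorial design is built: write down the full 2^4 basic design and multiply parent columns to obtain the generated factors. The 2^(7-3) generators E=ABC, F=BCD, G=ACD are a standard resolution IV choice; the function name and layout are illustrative.

```python
# Build a 2^(k-p) fractional factorial: full factorial in the basic
# factors, with each additional factor defined as a product of parent
# columns (its design generator).
from itertools import product

def build_2k_p(basic_factors, generators):
    """basic_factors: list of names; generators: dict new -> list of parents."""
    runs = []
    for levels in product([-1, 1], repeat=len(basic_factors)):
        run = dict(zip(basic_factors, levels))
        for new, parents in generators.items():
            sign = 1
            for p in parents:
                sign *= run[p]
            run[new] = sign  # generated column is the product of its parents
        runs.append(run)
    return runs

design = build_2k_p(
    ["A", "B", "C", "D"],
    {"E": ["A", "B", "C"], "F": ["B", "C", "D"], "G": ["A", "C", "D"]},
)
print(len(design), "runs")   # 16
print(design[0])             # factor settings of the first run
```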

9.
This case study illustrates how statistical techniques help improve the quality of a manufacturing process. The study shows how well-designed experiments advance one's understanding of the underlying process, and it emphasizes the importance of experimentation for attaining knowledge. The investigation involves the production of a certain viscose fiber. The results of a simple fractional factorial experiment with ten factors in 32 runs point to useful strategies for improving fiber strength and fiber elongation. The study also investigates the relationship between the strength and the thickness of the fiber.

10.
This article provides an outline of theory and methods for the experimental determination of tolerance limits for mating components of assembled products. The emphasis is on novel combinatorial problems of pre- and post-fractionation of certain products of two-level factorial designs. The cost of experimentation is discussed and used as a guide to allocating experimental runs. Several design examples are provided. The article also includes a comprehensive example of the experimental determination of tolerances for the components of a throttle handle for small outboard motors.

12.
Design of experiments (DOE) is a powerful approach for discovering the set of process (or design) variables that are most important to a process and then determining at what levels these variables must be kept to optimize the response (or quality characteristic) of interest. This paper presents two catapult experiments that can easily be taught to engineers and managers in organizations as training in design of experiments. The results have been taken from a live catapult experiment performed by a group of engineers in a company during a training program on DOE. The first experiment was conducted to separate the key factors (or variables) from the trivial ones, and the second was carried out using the key factors to understand the nature of the interactions among them. The results were analysed using simple but powerful graphical tools for rapid and easy understanding by engineers with limited statistical competency. Copyright © 2002 John Wiley & Sons, Ltd.
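
The catapult data themselves are not reproduced in the abstract, so the responses below are invented for illustration. The sketch shows the standard two-level calculation such graphical analyses rest on: each effect is the mean response at a factor's high level minus the mean at its low level, and the same formula applied to a product column gives an interaction effect.

```python
# Effect estimation for a two-level factorial.  The catapult factors
# and distances below are hypothetical, chosen only to illustrate the
# arithmetic behind main-effects and interaction plots.
import numpy as np

# Full 2^3 factorial in three hypothetical catapult factors:
# arm length (A), stop position (B), ball type (C); coded -1/+1.
A = np.array([-1,  1, -1,  1, -1,  1, -1,  1])
B = np.array([-1, -1,  1,  1, -1, -1,  1,  1])
C = np.array([-1, -1, -1, -1,  1,  1,  1,  1])
y = np.array([78, 142, 95, 161, 81, 150, 99, 170.0])  # hypothetical distances

def effect(column, response):
    """Mean response at the high level minus mean at the low level."""
    return response[column == 1].mean() - response[column == -1].mean()

for name, col in [("A", A), ("B", B), ("C", C), ("AB", A * B)]:
    print(f"effect {name}: {effect(col, y):6.2f}")
```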

13.
Statisticians typically recommend completely randomized experimental designs, and the reasoning behind this advice is theoretically sound. Unfortunately, the engineers who typically run industrial experiments frequently fail to recognize restrictions on randomization, i.e., split-plot experiments, and are often unaware of the risks of analyzing split-plot experiments as if they were randomized. Similarly, issues concerning the inference space of the experiment are frequently not given adequate consideration. Conversely, statisticians are frequently unaware that a restriction on randomization does not necessarily translate into less information than a completely randomized design.

In this paper, we discuss a proactive methodology for identifying and incorporating information concerning restrictions on randomization and the inference space in industrial experiments. We also present the factor relationship diagram (FRD), a tool that helps engineers recognize restrictions on randomization and guides the development of questions that lead the experimenter to understand the sources of variation that may contribute to a lack of precision in a split-plot experiment, or to a lack of repeatability in an inference space different from that studied in the experiment. Examples that illustrate the use of the methodology and the FRD are included.
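
As a hedged illustration of the risk the authors describe, the simulation below (not from the paper; all parameters are invented) generates split-plot data with a whole-plot random effect and compares the actual sampling variability of a whole-plot factor effect with the naive standard error a completely randomized analysis would report.

```python
# Simulation: observations within a whole plot share a random effect,
# so they are correlated; a completely-randomized analysis ignores
# that correlation and understates the uncertainty of the
# hard-to-change (whole-plot) factor's effect.
import numpy as np

rng = np.random.default_rng(1)
n_wp, n_sub = 8, 4                    # 8 whole plots, 4 subplot runs each
A = np.repeat([-1, 1], n_wp // 2)     # whole-plot factor, one level per plot
sigma_wp, sigma_e = 2.0, 1.0          # whole-plot and subplot error SDs

naive_se, effects = [], []
for _ in range(2000):
    wp = rng.normal(0, sigma_wp, n_wp)                       # whole-plot errors
    y = (wp[:, None] + rng.normal(0, sigma_e, (n_wp, n_sub))).ravel()
    a = np.repeat(A, n_sub)
    effects.append(y[a == 1].mean() - y[a == -1].mean())
    # naive SE treats all 32 runs as independent replicates
    s2 = (y[a == 1].var(ddof=1) + y[a == -1].var(ddof=1)) / (len(y) // 2)
    naive_se.append(np.sqrt(s2))

print("actual sampling SD of A effect:", np.std(effects).round(2))
print("average naive (CRD) SE:        ", np.mean(naive_se).round(2))
```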

14.
The aim of the present study was to demonstrate the application of an automated high-throughput (HT) dissolution method as a useful screening tool for characterization of controlled release pellets in the formulation development phase. Five controlled release pellet formulations with drug substances exhibiting high or low solubility were chosen to investigate the correlation of the automated HT dissolution method with conventional dissolution testing. Overall, excellent correlations (R2 > 0.96) between the HT and the conventional dissolution method were obtained. In one case, an initially unsatisfactory correlation (R2 = 0.84) and poor method agreement (SD = 12.5) were improved by optimizing the HT dissolution method with a design of experiments approach. Compared with the initial HT dissolution settings, an increased amount of pellets (25% of the capsule filling mass), a lower temperature (22 °C) and no shaking resulted in significantly better correlation (R2 = 0.97) and method agreement (SD = 5.3). These results show that such optimization is valuable for the development of HT dissolution methods. In conclusion, the high correlation of dissolution profiles obtained from the conventional and the automated HT dissolution methods, combined with low within-sample and measurement-system variability, justifies the utilization of the automated HT dissolution method during the development phase of controlled release pellets.
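
The two quantities the abstract reports, R2 between the profiles and the SD of the point-wise differences as a measure of method agreement, can be computed as sketched below. The dissolution profiles are invented for illustration; only the metrics are taken from the text.

```python
# Correlation and method agreement between two dissolution profiles.
# The time points and %-released values are hypothetical.
import numpy as np

t = np.array([0.5, 1, 2, 4, 6, 8, 12])                     # h, hypothetical
conventional = np.array([12, 21, 38, 61, 76, 86, 95.0])    # % released
high_throughput = np.array([10, 19, 40, 63, 74, 88, 94.0])

r = np.corrcoef(conventional, high_throughput)[0, 1]
diff = high_throughput - conventional
print(f"R^2 = {r**2:.3f}")                          # profile correlation
print(f"SD of differences = {diff.std(ddof=1):.2f}")  # method agreement
```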

15.
Nanofluids have been introduced as new-generation fluids able to improve energy efficiency in heat exchangers. However, stability problems related to both agglomeration and sedimentation of nanoparticles have limited industrial-level scaling. A fractional factorial experimental 2^(k-1) design was applied in order to evaluate the effects of nanoparticle concentration, surfactant type and concentration, ultrasonic amplitude as well as ultrasonic time on the stability of alumina (Al2O3) nanofluids. Commercial alumina nanoparticles (particle diameter <50 nm) were dispersed in deionized water using ultrasonic probe dispersion equipment. Sodium dodecylbenzenesulfonate (SDBS) and cetyltrimethylammonium bromide (CTAB) were used as surfactants. The stability of the nanofluids in static mode was monitored by visual inspection and UV-visible spectroscopy. The results of the experimental design showed that the coupled effects between surfactant type and surfactant concentration and between ultrasonication tip amplitude and ultrasonication time had the most pronounced effects on nanofluid stability. The experimental conditions providing the best stability were 0.5 wt% of Al2O3, CTAB, critical micelle surfactant concentration, 30% ultrasonic amplitude and 30 min of ultrasonication.

16.
Replicating runs in designed experiments is good practice. The most important reason to replicate runs is to allow a model-independent estimate of the error variance: without the pure-error degrees of freedom provided by replicated runs, the error variance estimate will be biased if the fitted model is missing an active effect. This work provides a replication strategy for full-factorial designs having two to four factors; however, the approach is general and could be applied to any factorial experiment.
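
A minimal sketch of the point being made: replicated runs give a model-independent ("pure error") estimate of the error variance, pooled over the replicated design points. The 2^2 factorial data below are hypothetical.

```python
# Pure-error variance: pool the within-point sums of squares over the
# replicated design points; unreplicated points contribute nothing.
import numpy as np

# responses grouped by design point; two of the four points replicated
runs = {
    (-1, -1): [20.1, 21.3, 19.8],
    ( 1, -1): [34.0],
    (-1,  1): [25.2, 24.1],
    ( 1,  1): [41.5],
}

ss, df = 0.0, 0
for y in runs.values():
    y = np.asarray(y)
    if len(y) > 1:                       # only replicated points contribute
        ss += ((y - y.mean()) ** 2).sum()
        df += len(y) - 1
print(f"pure-error variance = {ss / df:.3f} on {df} df")
```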

17.
Robust design is an important method for improving product manufacturability and life, and for increasing manufacturing-process stability and yield. In 1980, Genichi Taguchi introduced to U.S. industry his approach to using statistically planned experiments in robust product and process design. Since then, the robust design problem and Taguchi's approach to solving it have received much attention from product designers, manufacturers, statisticians and quality professionals. Although most agree on the importance of the robust design problem, controversy over some of the specific methods used to solve it has made this an active research area. Although the answers are not all in yet, the importance of the problem has led to the development of a four-step methodology for implementing robust design: (1) formulate the problem by stating objectives and then listing and classifying product or process variables, (2) plan an experiment to study these variables, (3) identify improved settings of the controllable variables from the experiment's results and (4) confirm the improvement in a small follow-up experiment. This paper presents a methodology for the problem-formulation and experiment-planning steps. We give practical guidelines for making key decisions in these two steps, including the choice of response characteristics and the specification of interactions and test levels for variables. We describe how orthogonal arrays and interaction graphs can be used to simplify the process of planning an experiment. We also compare the experiment-planning strategies we recommend to those of Taguchi and to more traditional approaches.

18.
To assess the reliability of a complex system, many different types of data may be available. Full-system tests are the most direct measure of reliability, but may be prohibitively expensive or difficult to obtain. Other, less direct measures, such as component- or section-level tests, may be cheaper and more readily available. Using a single Bayesian analysis, multiple sources of data can be combined to give component and system reliability estimates. Resource allocation seeks to predict which new data would most improve the precision of the system reliability estimate, in order to improve understanding as much as possible. In this paper, we consider a relatively simple system with different types of data from the components and the system. We present a methodology for assessing the relative improvement in system reliability estimation from additional data of the various types. Several metrics for comparing improvement, and a response-surface approach to modeling the relationship between improvement and the additional data, are presented. Copyright © 2008 John Wiley & Sons, Ltd.
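
A much-simplified sketch of the idea, under invented counts and a deliberately simple model: pass/fail data on the components of a two-component series system yield Beta posteriors, and Monte Carlo draws combine them into a posterior for system reliability. The paper's actual model, which also folds in system-level data, would require a joint analysis (e.g. MCMC).

```python
# Combine component-level pass/fail data into a posterior for the
# reliability of a series system via Monte Carlo over Beta posteriors.
# Counts are hypothetical; priors are uniform Beta(1, 1).
import numpy as np

rng = np.random.default_rng(7)
components = [(48, 50), (29, 30)]     # (successes, trials) per component

draws = np.ones(100_000)
for s, n in components:
    # series system: reliabilities multiply
    draws *= rng.beta(1 + s, 1 + n - s, size=draws.size)

print(f"posterior mean system reliability: {draws.mean():.3f}")
lo, hi = np.quantile(draws, [0.05, 0.95])
print(f"90% credible interval: ({lo:.3f}, {hi:.3f})")
```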

20.
This paper studies the application of the expected monetary value (EMV) criterion and the expected value of perfect information (EVPI) criterion of decision analysis to a ball-drawing problem in probability theory. By combining the EMV and EVPI criteria with the Bayes method, a complete decision-theoretic solution to the ball-drawing problem is given.
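
The paper's ball-drawing problem is not reproduced here; the payoff table below is invented just to show the two criteria being combined. EMV selects the action with the highest expected payoff under the prior, and EVPI is the expected payoff under perfect knowledge of the state minus the EMV.

```python
# EMV and EVPI for a small decision problem with a hypothetical
# prior and payoff table (rows: actions, columns: states).
import numpy as np

prior = np.array([0.6, 0.4])            # P(state), e.g. which urn was chosen
payoff = np.array([[10, -5],
                   [-2,  8.0]])

emv = (payoff @ prior).max()                     # best action under the prior
ev_perfect = (payoff.max(axis=0) * prior).sum()  # best action per known state
print(f"EMV  = {emv:.2f}")
print(f"EVPI = {ev_perfect - emv:.2f}")
```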
