Similar Documents
1.
Sequential tolerance control (STC) is a methodology that uses available measurement information at the completion of one manufacturing operation to position the set point for subsequent operations. It has been shown that STC can lead to inferior solutions when the manufacturing process distributions are skewed. This paper presents an adaptive sphere-fitting method (ASF-STC) that adjusts for such skewness. ASF-STC requires as inputs both the direction of skewness and the probability distribution parameters for each operation. Heuristic methods for estimating each of these inputs are presented. Through computational testing, ASF-STC is shown to offer significant improvements over STC when such skewness exists.

2.
The machining of complex parts typically involves a logical and chronological sequence of n operations on m machine tools. Because manufacturing datums cannot always match design datums, some of the design specifications imposed on the part are usually satisfied by distinct subsets of the n operations prescribed in the process plan. Conventional tolerance control specifies a fixed set point for each operation and permissible variation about this set point to ensure compliance with the specifications. Sequential tolerance control (STC) uses real-time measurement information at the completion of one stage to exploit the available space inside a dynamic feasible zone and reposition the set point for subsequent operations. This paper introduces an extension of STC that utilizes the variability of the operations to scale the problem data and further enhance the ability of STC to optimize the production of an acceptable part.

3.
Sequential tolerance control (STC) is an approach that uses real-time measurement information at the completion of a stage to exploit the available space inside a dynamic feasible zone and reposition the set points for the remaining operations. STC has been shown to produce significantly higher yields than conventional tolerance control given constant equipment precision. STC was developed under the premise that a measurement and set point adjustment would follow each operation. However, measuring after each operation may not be practical under certain conditions. This study develops techniques for determining when measurements and set point adjustments should take place so that the benefits of STC are realized without interrupting the process after every operation.

4.
The machining of complex parts typically involves a logical and chronological sequence of n operations on m machine tools. Because manufacturing datums cannot always match design datums, some of the design specifications imposed on the part are usually satisfied by distinct subsets of the n operations prescribed in the process plan. Conventional tolerance control specifies a fixed set point for each operation and permissible variation about this set point to ensure compliance with the specifications. This approach is inadequate for complex, low-volume, high-value-added parts such as those found in the aircraft, nuclear, or precision instrument manufacturing industries. This paper introduces the concept of Sequential Tolerance Control, an approach that uses real-time measurement information at the completion of stage j to exploit available space inside a dynamic feasible zone and reposition the set point for operations j + 1 to n. The procedure is repeated at appropriate locations along the n operations so as to optimize the production of an acceptable part.
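The mechanism described above lends itself to a small simulation. The following is a minimal one-dimensional sketch, not the paper's formulation: the final dimension is taken to be the sum of n operation outputs, the dynamic feasible zone reduces to recentering the remaining set points after each measurement, and all numerical values (n, tolerance, noise level) are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 1-D illustration: the final dimension is the sum of n
# operation outputs, and each operation realizes its set point plus
# zero-mean noise.  All numbers below are assumed for illustration.
n = 5
target, half_tol = 100.0, 0.10        # spec: target +/- half_tol
sigma = 0.04                           # per-operation std. dev.

def one_part(sequential: bool) -> float:
    realized = 0.0
    for j in range(n):
        remaining = n - j
        if sequential:
            # STC: measure what has been produced so far and recenter
            # the remaining set points inside the dynamic feasible zone.
            set_point = (target - realized) / remaining
        else:
            # Conventional control: fixed set points, no feedback.
            set_point = target / n
        realized += set_point + rng.normal(0.0, sigma)
    return realized

parts = 20_000
for seq in (False, True):
    y = np.array([one_part(seq) for _ in range(parts)])
    yield_pct = np.mean(np.abs(y - target) <= half_tol) * 100
    print(f"{'STC' if seq else 'conventional'}: yield = {yield_pct:.1f}%")
```

Because each measurement absorbs the deviations accumulated so far, only the last operation's noise reaches the final dimension in this toy model, which is where STC's yield advantage comes from.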

5.
Conventional process planning in manufacturing operations presents fixed process means and process tolerances for all operations and allows actual outputs to be distributed around these fixed values, as long as the final outputs fall within acceptable specifications. Some approaches attempt to maximize the process tolerances of all manufacturing operations for part production. Other approaches aim to minimize the tolerance cost or quality loss based on known functions. Most of them consider process mean and process tolerance as independent decision variables in process planning, with the condition that the resultant working dimensions are equal to the design target values of blueprint dimensions. These approaches assume that there is no process drift or deterioration. However, these conventional approaches are inappropriate for small-volume, high-value and precision processing, particularly of a complex part. Hence this study introduces an alternative approach to the tolerance-balancing problem that does not rely on specific objective functions; it determines process means and process tolerances simultaneously and adjusts them sequentially. Copyright © 2000 John Wiley & Sons, Ltd.

6.
This paper proposes a Bayesian method to set tolerance or specification limits on one or more responses and obtain optimal values for a set of controllable factors. The existence of such controllable factors (or parameters) that can be manipulated by the process engineer and that affect the responses is assumed. The dependence between the controllable factors and the responses is captured by a regression model fit from available experimental data. The proposed method finds the optimal setting of the control factors (parameter design) and the corresponding specification limits for the responses (tolerance control) in order to achieve a desired posterior probability of conformance of the responses to their specifications. Contrary to standard approaches in this area, the proposed Bayesian approach uses the complete posterior predictive distribution of the responses; thus, the tolerances and settings obtained implicitly consider both the mean and variance of the responses and the uncertainty in the regression model parameters.
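As a rough illustration of the idea, the sketch below fits a single-factor linear regression under a standard noninformative prior, draws from the full posterior predictive distribution, and searches a grid of factor settings for the highest probability of conformance. The model, data, and specification limits are invented for the example; the paper's actual formulation may differ.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical data: one controllable factor x, one response y.
x = np.linspace(-1, 1, 15)
y = 5.0 + 2.0 * x + rng.normal(0, 0.5, x.size)

X = np.column_stack([np.ones_like(x), x])
n, p = X.shape
XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y
resid = y - X @ beta_hat
s2 = resid @ resid / (n - p)

def prob_conformance(x_new, lsl, usl, draws=20_000):
    """P(LSL <= y_new <= USL | data) under a noninformative prior,
    averaging over posterior uncertainty in (beta, sigma^2)."""
    sig2 = stats.invgamma.rvs((n - p) / 2, scale=s2 * (n - p) / 2,
                              size=draws, random_state=rng)
    betas = beta_hat + (rng.standard_normal((draws, p))
                        @ np.linalg.cholesky(XtX_inv).T) * np.sqrt(sig2)[:, None]
    x_vec = np.array([1.0, x_new])
    y_new = betas @ x_vec + rng.normal(0, 1, draws) * np.sqrt(sig2)
    return np.mean((y_new >= lsl) & (y_new <= usl))

# Pick the factor setting with the highest posterior predictive conformance.
grid = np.linspace(-1, 1, 21)
best = max(grid, key=lambda xx: prob_conformance(xx, lsl=4.0, usl=6.0))
print("best setting:", best)
```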

7.
Sequential tolerance control (STC) is a tolerance control methodology used in discrete parts manufacturing. Recently, an adaptive sphere-fitting method for STC (ASF–STC) was developed to account for potential skewness in manufacturing operations' distributions, a factor not considered in conventional STC. ASF–STC offers significant improvements over conventional STC when such skewness exists. The direction of skewness of an operation's distribution is a necessary input to ASF–STC. Thus, a novel approach to determining the skewness of a distribution for small sample sizes is presented here. ASF–STC has an additional requirement of distribution information for each operation. The beta distribution is an ideal candidate here, as it is very flexible in shape. The literature on four-parameter beta estimation is very limited, and existing estimators perform poorly for small sample sizes. STC was designed for low-volume production; thus, estimation from small samples is necessary here. This study presents a heuristic, based on method-of-moments estimates, that estimates the four parameters of a beta distribution from a small sample. Several computational results are provided to compare this heuristic to the best-known procedure, with the heuristic found to perform better for the test problems considered. Copyright © 2002 John Wiley & Sons, Ltd.
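The paper's heuristic is not reproduced here, but a simplified stand-in conveys the flavor: estimate the support [a, b] from the sample range plus a slack (an assumption of this sketch), then recover the two shape parameters from the standard beta method-of-moments identities on the rescaled data.

```python
import numpy as np

def fit_beta4_mom(x, slack=0.05):
    """Simplified four-parameter beta fit by the method of moments.

    NOT the paper's heuristic: as a stand-in assumption, the support
    [a, b] is taken from the sample range plus a small slack, and the
    shape parameters then follow from the mean and variance of the
    rescaled data.
    """
    x = np.asarray(x, dtype=float)
    span = x.max() - x.min()
    a = x.min() - slack * span
    b = x.max() + slack * span
    u = (x - a) / (b - a)              # rescale to (0, 1)
    m, v = u.mean(), u.var(ddof=1)
    common = m * (1 - m) / v - 1       # standard beta MoM identity
    alpha, beta = m * common, (1 - m) * common
    return a, b, alpha, beta

rng = np.random.default_rng(1)
sample = rng.beta(2.0, 5.0, size=12) * 3.0 + 10.0   # true support [10, 13]
print(fit_beta4_mom(sample))
```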

8.
An acceptance control chart (ACC) is used to monitor certain processes in which the natural dispersion of the process is much less than the specified tolerance in design. Typical examples are processes subject to tool wear. This article presents an adaptive acceptance control chart (AACC). Its sample size can be adjusted during process control, so that the average number of measurements may be significantly reduced. The AACC can also minimize a linear combination of the type I and type II errors, based on the user's specifications. A computer program has been developed to aid the off-line design of the AACC.
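For orientation, the textbook fixed-sample design of a one-sided acceptance control chart can be written down directly; the adaptive sample-size logic of the AACC itself is not reproduced, and all numerical inputs below are illustrative.

```python
import numpy as np
from scipy.stats import norm

def acc_chart_design(usl, sigma, p1, p2, alpha, beta):
    """Design of a one-sided (upper) acceptance control chart.

    p1: fraction nonconforming still acceptable  -> type I risk alpha
    p2: fraction nonconforming to be rejected    -> type II risk beta
    Returns sample size n, APL, RPL, and the upper acceptance limit.
    This is the textbook construction, not the cited AACC.
    """
    z_p1, z_p2 = norm.ppf(1 - p1), norm.ppf(1 - p2)
    z_a, z_b = norm.ppf(1 - alpha), norm.ppf(1 - beta)
    apl = usl - z_p1 * sigma          # acceptable process level
    rpl = usl - z_p2 * sigma          # rejectable process level
    n = int(np.ceil(((z_a + z_b) / (z_p1 - z_p2)) ** 2))
    ucl = apl + z_a * sigma / np.sqrt(n)
    return n, apl, rpl, ucl

print(acc_chart_design(usl=10.0, sigma=0.5, p1=0.001, p2=0.01,
                       alpha=0.05, beta=0.10))
```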

9.
Monitoring multivariate quality variables or data streams remains an important and challenging problem in statistical process control (SPC). Although multivariate SPC has been extensively studied in the literature, the design of distribution-free control schemes remains challenging and has yet to be addressed well. This article develops a new nonparametric methodology for monitoring location parameters when only a small reference dataset is available. The key idea is to construct a series of conditionally distribution-free test statistics in the sense that their distributions are free of the underlying distribution given the empirical distribution functions. The conditional probability that the charting statistic exceeds the control limit at present, given that there is no alarm before the current time point, can be guaranteed to attain a specified false alarm rate. The success of the proposed method lies in the use of data-dependent control limits, which are determined based on the observations online rather than decided before monitoring. Our theoretical and numerical studies show that the proposed control chart is able to deliver satisfactory in-control run-length performance for any distributions with any dimension. It is also very efficient in detecting multivariate process shifts when the process distribution is heavy-tailed or skewed. Supplementary materials for this article are available online.
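A heavily simplified univariate sketch of the data-dependent-limit idea: for each incoming batch, a rank-sum statistic is computed on the pooled (reference plus batch) sample and compared against a limit obtained by permuting the pooled ranks, so the conditional false-alarm rate holds for any continuous distribution. The article's multivariate construction is considerably more elaborate.

```python
import numpy as np

rng = np.random.default_rng(7)

def monitor(reference, stream, k=5, alpha=0.005, perms=5000):
    """Univariate sketch of a conditionally distribution-free chart.

    For each batch, the rank-sum of the batch within the pooled sample
    is compared against a limit recomputed by permutation of the pooled
    ranks, so a false-alarm rate of alpha holds conditionally on the
    empirical distribution, whatever the underlying distribution.
    """
    alarms = []
    for t in range(0, len(stream) - k + 1, k):
        pool = np.concatenate([reference, stream[t:t + k]])
        ranks = np.argsort(np.argsort(pool)) + 1   # ties ignored for brevity
        w = ranks[-k:].sum()                        # batch rank-sum
        n_pool = len(pool)
        # Permutation null: sum of k ranks drawn without replacement.
        null = np.array([rng.choice(np.arange(1, n_pool + 1), k,
                                    replace=False).sum()
                         for _ in range(perms)])
        limit = np.quantile(null, 1 - alpha)        # data-dependent limit
        if w > limit:
            alarms.append(t // k)
    return alarms

reference = rng.standard_normal(100)                # in-control reference set
stream = np.concatenate([rng.standard_normal(50),
                         rng.standard_normal(50) + 1.5])  # upward shift
print("alarmed batches:", monitor(reference, stream))
```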

10.
The determination of tolerance allocations among design parameters is an integral phase of product/process design. Such allocations are often necessary to achieve desired levels of product performance. We extend our prior research on tolerance allocation by developing both parametric and nonparametric methods for a multivariate set of performance measures that are functions of a common set of design parameters. The parametric method is novel and assumes full information about the probability distribution of design parameter processes. The proposed nonparametric method assumes that only partial information is available and significantly extends prior research by considering a more contemporary and realistic model for manufacturer costs. For both methods we derive economically based models that represent the costs, both internal (supplier) and external (manufacturer), of tolerance allocation under several different process scenarios. These scenarios are based on the manner of disposition of nonconforming product. For the parametric methods we derive tolerance allocation solutions that jointly minimize the expected total cost of the supplier and manufacturer. For the nonparametric methods we derive solutions for tolerance allocation that jointly minimize the maximum expected total cost. An example in the fabrication of a rubber tread compound is used to: (i) demonstrate the implementation of our proposed methodologies for tolerance allocation; (ii) illustrate and compare the nonparametric and parametric methods; and (iii) assess the sensitivity of optimal tolerance allocations to changes in process model types, cost coefficient estimates, and manner of disposition of nonconforming product.
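A single-characteristic caricature of the economic trade-off: supplier (internal) cost falls as the tolerance loosens, while the manufacturer's (external) expected quality loss grows with it. Both cost forms below are assumptions for illustration, not the paper's models.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical single-characteristic version of economic tolerance
# allocation.  Supplier cost: a + b / t; manufacturer cost: Taguchi-style
# quadratic loss for a just-capable process with std. dev. = t / 3.
a, b = 2.0, 5.0
k = 12.0                 # quality-loss coefficient (assumed)

def expected_total_cost(t):
    supplier = a + b / t
    manufacturer = k * (t / 3.0) ** 2   # k * E[(Y - T)^2] when sigma = t/3
    return supplier + manufacturer

res = minimize_scalar(expected_total_cost, bounds=(1e-3, 10.0),
                      method="bounded")
t_closed = (9 * b / (2 * k)) ** (1 / 3)   # analytic minimizer for this form
print(f"numeric t* = {res.x:.4f}, closed-form t* = {t_closed:.4f}")
```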

11.
Monitoring disturbances in process dispersion using control charts is mostly based on the assumption that the quality characteristic follows a normal distribution, which is not the case in many real-life situations. This paper proposes a set of new dispersion charts based on the homogeneously weighted moving average (HWMA) scheme for efficient detection of shifts in the process standard deviation (σ). These charts are based on a variety of σ estimators and are investigated for normal as well as heavy-tailed symmetric and skewed distributions. The shift detection ability of the charts is evaluated using different run length characteristics, such as average run length (ARL), extra quadratic loss (EQL), and relative ARL measures. The performance of the proposed HWMA control charts is also compared with that of existing EWMA dispersion charts, using different design parameters. Furthermore, an illustrative example is presented to monitor the vapor pressure in a distillation process.
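A minimal sketch of one such chart, monitoring ln(S²) per subgroup under the HWMA recursion H_i = λT_i + (1−λ)T̄_{i−1} with the standard time-varying limits; λ and the limit multiplier L here are illustrative rather than tuned to a target ARL.

```python
import numpy as np

rng = np.random.default_rng(3)

def hwma_dispersion_chart(subgroups, lam=0.1, L=2.7):
    """HWMA chart on T = ln(S^2) per subgroup (a sketch).

    H_i = lam * T_i + (1 - lam) * mean(T_1..T_{i-1}), with limits
    mu0 +/- L * sqrt(lam^2 s0^2 + (1-lam)^2 s0^2 / (i-1)).
    The in-control moments (mu0, s0) of T are estimated by simulation.
    """
    n = subgroups.shape[1]
    sim = rng.standard_normal((100_000, n)).var(axis=1, ddof=1)
    t_sim = np.log(sim)
    mu0, s0 = t_sim.mean(), t_sim.std()

    t_stats = np.log(subgroups.var(axis=1, ddof=1))
    signals = []
    for i, t_i in enumerate(t_stats, start=1):
        prev_mean = mu0 if i == 1 else t_stats[:i - 1].mean()
        h = lam * t_i + (1 - lam) * prev_mean
        var_h = lam ** 2 * s0 ** 2 + (
            0.0 if i == 1 else (1 - lam) ** 2 * s0 ** 2 / (i - 1))
        if abs(h - mu0) > L * np.sqrt(var_h):
            signals.append(i)
    return signals

# 30 in-control subgroups, then the standard deviation shifts by 50%.
data = np.vstack([rng.standard_normal((30, 5)),
                  rng.standard_normal((30, 5)) * 1.5])
print("signals at subgroups:", hwma_dispersion_chart(data))
```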

12.
In mechanical design, tolerance assignment is a critical but complex task, since the designer must consider not only the cost associated with achieving a certain tolerance level but also the cost due to failure in the assembly task. These associated tolerance costs, as well as the failure rate, are fuzzy in nature. This paper presents an integrated approach to incorporate the manufacturing costs of certain tolerance specifications into the design stage for automatic tolerance assignment and design. Tolerance design is interdisciplinary in nature and is characterized by a highly uncertain environment. In recent years, fuzzy logic has emerged as a credible alternative for tolerance design. A fuzzy-based tolerance representation scheme is presented to model three-dimensional (3D) tolerances. With this representation, relative assembly tolerance constraints can be expressed. A fuzzy tolerance generation and assignment process for assembly is discussed. Fuzzy tolerance equations are generated for 3D assembly considerations. Manufacturing process information, along with uncertain cost information modelled in fuzzy terms, is added to the system to arrive at a cost-optimal tolerance assignment.
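The "fuzzy in nature" costs can be conveyed with ordinary triangular fuzzy numbers; this is only a sketch of that machinery, and the paper's 3D tolerance representation is not reproduced. The plan names and cost figures below are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class TFN:
    """Triangular fuzzy number (lo, mode, hi), a common encoding for
    uncertain tolerance and assembly-failure costs."""
    lo: float
    mode: float
    hi: float

    def __add__(self, other):
        # Standard fuzzy arithmetic: add component-wise.
        return TFN(self.lo + other.lo, self.mode + other.mode,
                   self.hi + other.hi)

    def centroid(self) -> float:
        """Defuzzify by centroid, for ranking alternatives."""
        return (self.lo + self.mode + self.hi) / 3.0

# Two hypothetical tolerance assignments: a tighter tolerance costs more
# to machine but makes assembly failure cheaper, and vice versa.
plan_a = TFN(8.0, 10.0, 13.0) + TFN(1.0, 2.0, 4.0)   # machining + failure
plan_b = TFN(4.0, 5.0, 7.0) + TFN(5.0, 8.0, 12.0)
best = min([("A", plan_a), ("B", plan_b)], key=lambda p: p[1].centroid())
print("preferred assignment:", best[0],
      "with defuzzified cost", round(best[1].centroid(), 2))
```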

13.
Operational dimensioning and tolerancing play an important role in process planning. They ensure that resultant part dimensions and tolerances do not exceed specified design values. Tolerance chart analysis is an effective technique for process planners to calculate mean values and tolerances of operational dimensions. However, a tolerance chart can be built only after all the initial engineering decisions have been made concerning the process plan; specifically, setups and setup datums must be identified. While many researchers have focused their attention on tolerance chart analysis, the selection of setups and setup datums (setup planning) has been overlooked, and no systematic approaches for setup planning can be found in the literature. This paper discusses the importance of setup planning to tolerance control in process planning. A graphical approach is then proposed to generate optimal setup plans based on design tolerance specifications.

14.
The simultaneous consideration of process mean and variance in the product design stage is regarded as one of Taguchi's most significant contributions. Among his quality improvement methods, parameter design has drawn particular attention from researchers. The ultimate objective of Taguchi's parameter design is to find control factor settings that achieve an on-target process mean with minimum variance. There is no doubt regarding the virtue of the minimum variance. However, considering the variety of economic aspects related to product specifications as well as the quality loss, the on-target process mean may not necessarily be economical. This paper investigates the parameter design problem from an economic point of view and proposes an alternative procedure to achieve the most economical process mean as well as the minimum variance by taking product specifications and an asymmetric quality loss into consideration. It is shown through an illustrative example that a significant cost saving can accrue from the proposed procedure. Copyright © 2000 John Wiley & Sons, Ltd.
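A small numerical version of the argument, with an assumed cost structure (asymmetric quadratic loss inside the specifications, a flat scrap cost outside), shows the economically optimal mean drifting away from the target when the loss is asymmetric:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Illustrative economic model (assumed, not the paper's): asymmetric
# quadratic loss inside the specs, flat scrap cost outside them, fixed
# process standard deviation.  Which process mean minimizes expected cost?
T, LSL, USL = 10.0, 9.0, 11.0
sigma = 0.4
k1, k2 = 4.0, 1.0          # undershoot is four times as costly (assumed)
scrap = 5.0

def expected_cost(mu, n=200_000, seed=0):
    # Fixed seed keeps the Monte Carlo objective deterministic.
    y = np.random.default_rng(seed).normal(mu, sigma, n)
    inside = (y >= LSL) & (y <= USL)
    loss = np.where(y < T, k1, k2) * (y - T) ** 2
    return np.mean(np.where(inside, loss, scrap))

res = minimize_scalar(expected_cost, bounds=(LSL, USL), method="bounded")
print(f"economic mean = {res.x:.3f} (target = {T}); "
      f"cost at optimum = {res.fun:.3f}, cost on target = {expected_cost(T):.3f}")
```

With k1 > k2, the optimal mean sits above the target: the process backs away from the costlier side of the loss.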

15.
Realistic circuit design requires that unavoidable tolerances on component parameters be taken into account, particularly in situations where a circuit is to be mass-produced. Since specifications are normally imposed on circuit performance, parameter tolerances can have the undesirable effect of reducing manufacturing yield (i.e. the percentage of circuits which meet specifications) to values below unity, thereby effectively increasing circuit cost. Approaches to electrical circuit design have been developed which incorporate aspects of parameter tolerance variations at the various stages of design, thus enabling tolerance effects to be assessed and minimized. There are two principal approaches: statistical and deterministic. The first uses probabilistic techniques to predict variations in circuit performance, whereas the second uses deterministic (i.e. non-stochastic) methods. Within each group, three types of problems are important: first, the maximization of yield; secondly, the minimization of circuit unit cost; and thirdly, the minimization of performance variability. This paper discusses some important advances in the statistical approach to tolerance design. Monte Carlo analysis is almost invariably an important component of the procedure: random fluctuations in parameter values are simulated according to some probability density function and inserted into a computer circuit simulation program which computes corresponding circuit performance variations. The procedure, also referred to as tolerance analysis, not only allows the designer to predict expected performance fluctuations but also provides information regarding the relative location of acceptable and non-acceptable circuits in component parameter space. The Monte Carlo method can handle without difficulty any number of component parameters and performance functions; moreover, statistical dependence among parameters is readily handled. The algorithm presented here is experimentally validated through the successful design of practical circuits and is applicable to both discrete and integrated circuits. Strategies which ensure computational efficiency of the methods are discussed and a cost/benefit analysis is carried out for a typical circuit.
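The Monte Carlo yield-estimation loop the abstract describes is simple to reproduce for a stand-in circuit; here, a hypothetical RC low-pass filter with toleranced R and C and an assumed specification window on the cutoff frequency:

```python
import numpy as np

rng = np.random.default_rng(11)

# Monte Carlo tolerance analysis for a hypothetical RC low-pass filter:
# performance f_c = 1 / (2*pi*R*C) must fall inside a spec window.
# Component distributions below are assumptions implied by the tolerances.
N = 100_000
R_nom, C_nom = 1.0e3, 100e-9                     # 1 kOhm, 100 nF
R = rng.normal(R_nom, 0.05 / 3 * R_nom, N)       # 5% tolerance as +/-3 sigma
C = rng.uniform(C_nom * 0.90, C_nom * 1.10, N)   # 10% uniform tolerance

f_c = 1.0 / (2 * np.pi * R * C)
f_lo, f_hi = 1.45e3, 1.75e3                      # spec window on cutoff (Hz)
passed = (f_c >= f_lo) & (f_c <= f_hi)
print(f"estimated yield: {passed.mean():.1%}")

# The failing samples also locate the bad region of parameter space,
# which is the information the designer uses for design centering.
print("mean R, C among failures:", R[~passed].mean(), C[~passed].mean())
```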

16.
In this paper we demonstrate both a design strategy and a set of analysis techniques for a designed experiment from an industrial process (cheese making) with multivariate responses (sensory data). The design strategy uses a two-level factorial design for the factors that can be controlled, and blocking on the raw material to cover other, non-designed variation in the raw material. We take FT-IR spectroscopy measurements of both the raw materials and the process at several points. The methods of analysis complement each other to give more understanding and better modelling. The 50–50 MANOVA method provides multivariate analysis of variance to test the significance of effects for the design variables. Ordinary PLS2 analysis gives an overview of the data and generates hypotheses about relations. Finally, the orthogonal LS–PLS method is extended to multivariate responses and used to identify the source of the observed block effect and to build models that can be used for statistical process control at several points in the process. In these models, the information at one point is corrected for information that has already been described elsewhere.
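Of the three analyses named, the PLS2 overview step is the easiest to sketch; the snippet below runs it on synthetic stand-in data (a two-level design with a raw-material block factor and three correlated responses). The 50–50 MANOVA and orthogonal LS–PLS steps are not reproduced.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(2)

# Synthetic stand-in: 16 runs, 3 two-level process factors plus a block
# indicator for two raw-material batches; 3 sensory-like responses.
runs = 16
X_design = rng.choice([-1.0, 1.0], size=(runs, 3))
block = np.repeat([0.0, 1.0], runs // 2)[:, None]
X = np.hstack([X_design, block])
# Responses driven by factor 1, factor 2, and the block effect:
Y = np.column_stack([
    2.0 * X[:, 0] + block.ravel() + rng.normal(0, 0.3, runs),
    1.5 * X[:, 1] - block.ravel() + rng.normal(0, 0.3, runs),
    0.5 * X[:, 0] + rng.normal(0, 0.3, runs),
])

# PLS2: one model for all responses at once, for an overview of relations.
pls = PLSRegression(n_components=2).fit(X, Y)
print("X loadings:\n", np.round(pls.x_loadings_, 2))
print("R^2 of 2-component model:", round(pls.score(X, Y), 2))
```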

17.
Recent Food and Drug Administration (FDA) validation guidelines and comments indicate that applying finished product content uniformity specifications to blend testing is unacceptable. The scenario the FDA has presented is one in which disorder increases as the process progresses, so that blend test specifications should be more restrictive (tighter) than finished product testing specifications. In other publications, it has been suggested that finished product assay limits be applied as a blend specification along with a lower relative standard deviation value than the current USP content uniformity limit (6.0%). This approach is questionable since assay results are applied to an aggregate finished product sample rather than individual doses. A method is presented in this paper for applying statistical tolerance limits (STLs) to blend data. This procedure provides a 95% confidence level that at least 90% of the values for the entire population are within the calculated limits. These statistical tolerance limits provide an acceptable criterion that is statistically tighter than the application of USP XXIII finished product content uniformity specifications. In addition, this method involves a decision process or multiple-level evaluation based on a statistical comparison of the variance and mean for the blend and finished product. In cases where the calculated STLs are unacceptable, the decision process allows for determining if the out-of-specification values from the first level of testing are due to a true blend failure or if the cause of the aberration is due to other phenomena, which could include sampling technique, thief design, and analytical testing problems.
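The 95%/90% criterion corresponds to a two-sided normal tolerance interval x̄ ± ks; Howe's widely used approximation for k makes the computation a one-liner. The blend potency numbers below are hypothetical.

```python
import numpy as np
from scipy import stats

def tolerance_limits(x, coverage=0.90, confidence=0.95):
    """Two-sided normal tolerance limits x_bar +/- k*s (Howe's approximation).

    With the defaults this matches the abstract's criterion: 95% confidence
    that at least 90% of the population lies inside the limits.  Assumes
    the blend data are approximately normal.
    """
    x = np.asarray(x, float)
    n = x.size
    nu = n - 1
    z = stats.norm.ppf((1 + coverage) / 2)
    chi2 = stats.chi2.ppf(1 - confidence, nu)       # lower-tail quantile
    k = z * np.sqrt(nu * (1 + 1 / n) / chi2)
    m, s = x.mean(), x.std(ddof=1)
    return m - k * s, m + k * s

# Hypothetical blend potency data (% of label claim):
blend = np.array([98.6, 101.2, 99.8, 100.4, 97.9,
                  100.9, 99.1, 100.2, 98.8, 101.0])
lo, hi = tolerance_limits(blend)
print(f"STL: [{lo:.1f}, {hi:.1f}] -> compare against the blend specification")
```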

18.
Process capability indices are considered to be one of the important quality measurement tools for the continuous improvement of quality and productivity. The most commonly used indices assume that process data are normally distributed. However, many studies have pointed out that such normality-based indices are very sensitive to non-normal processes. Therefore we propose a new process capability index applying the weighted variance control charting method for non-normal processes, to improve the measurement of process performance when the process data are non-normally distributed. The main idea of the weighted variance method is to divide a skewed or asymmetric distribution at its mean into two normal distributions, creating two new distributions which have the same mean but different standard deviations. In this paper we provide an example, a distribution generated from the Johnson family of distributions, to demonstrate how the weighted variance-based process capability indices perform in comparison with two other non-normal methods, namely the Clements and Johnson–Kotz–Pearn methods. This example shows that the weighted variance-based indices are more consistent than the other two methods in estimating process fallout for non-normal processes. Copyright © 1999 John Wiley & Sons, Ltd.
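The weighted variance construction itself is short enough to state directly; a sketch under the stated definition (upper and lower half-distributions sharing the mean), with an assumed right-skewed sample and invented specification limits:

```python
import numpy as np

def wv_capability(x, lsl, usl):
    """Weighted-variance capability indices for skewed data (a sketch).

    The skewed distribution is treated as two normals sharing the mean:
    the upper side uses sigma*sqrt(2*P) and the lower side
    sigma*sqrt(2*(1-P)), where P is the fraction of observations at or
    below the mean.
    """
    x = np.asarray(x, float)
    mu, sigma = x.mean(), x.std(ddof=1)
    p = np.mean(x <= mu)
    cpu = (usl - mu) / (3 * sigma * np.sqrt(2 * p))
    cpl = (mu - lsl) / (3 * sigma * np.sqrt(2 * (1 - p)))
    return cpu, cpl, min(cpu, cpl)

rng = np.random.default_rng(5)
data = rng.lognormal(mean=0.0, sigma=0.4, size=200)   # right-skewed sample
print("Cpu, Cpl, Cpk(WV):", wv_capability(data, lsl=0.3, usl=3.0))
```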

19.
Among a set of tools that form the core of statistical process control, statistical control charts are most commonly used for controlling, monitoring, and improving processes. The conventional control charts are based on the assumption that the distribution of the quality characteristic to be monitored follows the normal distribution. However, in real applications, many process distributions may follow a positively skewed distribution such as the lognormal distribution. In this study, we discuss the construction of several control charts for monitoring the mean of the lognormal distribution. A real example is used to demonstrate how these charts can be applied in practice. Copyright © 2015 John Wiley & Sons, Ltd.
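One chart of the kind discussed can be sketched by plotting the per-subgroup MLE of the lognormal mean, exp(ȳ + s²/2) on the log data, against probability limits simulated under assumed in-control parameters:

```python
import numpy as np

rng = np.random.default_rng(9)

# Sketch only: the in-control parameters, subgroup size, and shift size
# below are assumptions, not taken from the article.
mu0, sigma0, n = 0.0, 0.5, 5
alpha = 0.0027                      # Shewhart-like false-alarm rate

def theta_hat(subgroup):
    """Per-subgroup MLE of the lognormal mean exp(mu + sigma^2 / 2)."""
    y = np.log(subgroup)
    return np.exp(y.mean() + y.var(ddof=1) / 2)

# In-control distribution of the charting statistic, by simulation.
sim = np.array([theta_hat(rng.lognormal(mu0, sigma0, n))
                for _ in range(50_000)])
lcl, ucl = np.quantile(sim, [alpha / 2, 1 - alpha / 2])

# Monitor: 20 in-control subgroups, then mu shifts upward.
stream = [rng.lognormal(mu0, sigma0, n) for _ in range(20)] + \
         [rng.lognormal(mu0 + 0.6, sigma0, n) for _ in range(20)]
charted = [theta_hat(g) for g in stream]
signals = [i + 1 for i, t in enumerate(charted) if t < lcl or t > ucl]
print(f"limits = ({lcl:.3f}, {ucl:.3f}); signals at subgroups {signals}")
```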

