Similar Literature (20 results found)
1.
    
Multivariate process capability indices (MPCIs) have been proposed over the past three decades to measure multivariate process capability in real-world applications. From the practitioner's point of view, the intention of this paper is to examine the performance and distributional properties of probability-based MPCIs. Considering how capability indices are constructed in the multivariate setup, together with their computation and performance, we find that probability-based MPCIs are a proper generalization of the basic univariate process capability indices (PCIs). At the beginning of this decade, computing probability-based indices was a difficult and time-consuming task, but with modern computing their computation is simple and quick. Recent work on the performance of the MPCI NMCpm and the distributional properties of its estimator reasonably recommended this index for use in practical situations. We conducted a simulation study of the distributional properties of the natural estimators of probability-based MPCIs and of the recommended index's estimator. Although the natural estimators of probability-based indices are negatively biased, they perform better with respect to mean, relative bias, and mean square error. The probability-based MPCI MCpm outperforms NMCpm with respect to both performance and the quality of its estimator. Hence, for real-world practice, we recommend probability-based MPCIs as a multivariate analogue of the basic PCIs.
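As a hedged illustration of how a probability-based MPCI can be computed by simulation, the sketch below estimates the conforming proportion of a bivariate normal process and maps it onto the familiar Cp scale. The mapping (1/3)Φ⁻¹((1 + p)/2) is one common probability-based construction, not necessarily the NMCpm or MCpm definitions studied in the paper, and all process parameters and specification limits are assumed values.

```python
# Monte Carlo sketch of a probability-based multivariate capability index:
# estimate the conforming proportion p of a bivariate normal process and map it
# to the Cp scale via (1/3) * Phi^{-1}((1 + p) / 2). This mapping is a common
# probability-based construction, not necessarily the paper's MCpm / NMCpm.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2024)

mu = np.array([10.0, 5.0])                  # process mean (assumed)
Sigma = np.array([[0.04, 0.01],
                  [0.01, 0.09]])            # process covariance (assumed)
LSL = np.array([9.4, 4.1])                  # lower specification limits (assumed)
USL = np.array([10.6, 5.9])                 # upper specification limits (assumed)

# Estimate the probability that all characteristics conform simultaneously.
X = rng.multivariate_normal(mu, Sigma, size=200_000)
p_conform = np.mean(np.all((X >= LSL) & (X <= USL), axis=1))

# Map the conforming proportion onto the univariate Cp scale.
index = norm.ppf((1.0 + p_conform) / 2.0) / 3.0
print(f"conforming proportion: {p_conform:.5f}")
print(f"probability-based index: {index:.3f}")
```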

2.
    
Process capability indices evaluate the capability of a process to satisfy customers' requirements. This paper introduces a superstructure multivariate process incapability vector for multivariate normal processes and then compares it with four recently proposed multivariate process capability indices to show its better performance. In addition, the effects of two modification factors are investigated. Also, bootstrap confidence intervals for the first component of the proposed vector are obtained. Furthermore, real manufacturing data sets are presented to demonstrate the applicability of the proposed vector.
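A minimal sketch of the percentile-bootstrap machinery used for interval estimation of such a component is shown below. The statistic `incapability_stat` is a hypothetical placeholder, not the vector defined in the paper, and the data, target and tolerances are assumed.

```python
# Generic percentile-bootstrap sketch for the first component of a multivariate
# (in)capability statistic. `incapability_stat` is a hypothetical placeholder --
# substitute the first component of the vector defined in the paper.
import numpy as np

def incapability_stat(sample, target, tol):
    """Placeholder: mean squared standardized deviation of the sample mean from
    target across characteristics (NOT the paper's definition)."""
    d = (sample.mean(axis=0) - target) / tol
    return float(np.mean(d ** 2))

rng = np.random.default_rng(7)
target = np.array([10.0, 5.0])                       # target vector (assumed)
tol = np.array([0.6, 0.9])                           # spec half-widths (assumed)
data = rng.multivariate_normal(target + 0.05,
                               [[0.04, 0.01], [0.01, 0.09]], size=80)

B = 2000
boot = np.empty(B)
for b in range(B):
    idx = rng.integers(0, len(data), size=len(data))  # resample rows with replacement
    boot[b] = incapability_stat(data[idx], target, tol)

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"95% percentile bootstrap CI: ({lo:.4f}, {hi:.4f})")
```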

3.
When the distribution of a process characteristic is non-normal, Cp and Cpk calculated using conventional methods often lead to erroneous interpretation of the process's capability. Though various methods have been proposed for computing surrogate process capability indices (PCIs) under non-normality, the literature lacks a comprehensive evaluation and comparison of these methods. In particular, under mild and severe departures from normality, do these surrogate PCIs adequately capture process capability, and which method(s) best reflect the true capability under each of these circumstances? In this paper we review seven methods chosen for a comparison of their ability to handle non-normality in PCIs. For illustration, the comparison is done by simulating Weibull and lognormal data, and the results are presented using box plots. Simulation results show that the performance of a method depends on its ability to capture the tail behaviour of the underlying distribution. Finally, we give a practitioner's guide that suggests applicable methods for each defined range of skewness and kurtosis under mild and severe departures from normality. Copyright © 1999 John Wiley & Sons, Ltd.
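A small sketch of the kind of comparison described above: simulate skewed (Weibull) data and contrast the conventional normal-theory Cp with a percentile-based surrogate in the spirit of Clements' method. The Weibull parameters and specification limits are illustrative only.

```python
# Simulate skewed data and compare a normal-theory Cp with a percentile-based
# surrogate that replaces 6*sigma by the 0.135%-99.865% spread.
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(42)
LSL, USL = 0.0, 6.0                               # illustrative specification limits

data = weibull_min.rvs(c=1.2, scale=2.0, size=500, random_state=rng)

# Conventional (normal-theory) index.
cp_normal = (USL - LSL) / (6.0 * data.std(ddof=1))

# Percentile-based surrogate.
p_lo, p_hi = np.percentile(data, [0.135, 99.865])
cp_percentile = (USL - LSL) / (p_hi - p_lo)

print(f"normal-theory Cp:    {cp_normal:.3f}")
print(f"percentile-based Cp: {cp_percentile:.3f}")
```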

4.
    
For process capability indices (PCIs) of non‐normal processes, the natural tolerance is defined as the difference between the 99.865 percentile and the 0.135 percentile of the process characteristic. However, some regions with relatively low probability density may still be included in this natural tolerance, while some regions with relatively high probability density may be excluded for asymmetric distributions. To take into account the asymmetry of process distributions and the asymmetry of tolerances from the viewpoint of probability density, the highest density interval is utilized to define the natural tolerance, and a family of new PCIs based on the highest density interval is proposed to ensure that regions with high probability density are included in the natural tolerance. Some properties of the proposed PCIs and two algorithms to compute the highest density interval are given. A real example is given to show the application of the proposed method. Copyright © 2014 John Wiley & Sons, Ltd.
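A minimal sketch of the contrast described above, assuming a lognormal process model with illustrative parameters: the equal-tail (0.135%–99.865%) natural tolerance is compared with a 99.73% highest density interval, found here as the shortest interval carrying that probability. This is one simple way to compute an HDI for a unimodal distribution; the paper's own algorithms may differ.

```python
# Compare the equal-tail natural tolerance with an HDI-based natural tolerance
# for a skewed (lognormal) process model. The HDI is the shortest interval
# containing 99.73% of the probability.
import numpy as np
from scipy.stats import lognorm
from scipy.optimize import minimize_scalar

dist = lognorm(s=0.6, scale=2.0)          # illustrative skewed process distribution
coverage = 0.9973

# Conventional equal-tail natural tolerance.
lo_p, hi_p = dist.ppf([(1 - coverage) / 2, (1 + coverage) / 2])

# HDI: choose the lower tail probability q minimizing the interval length.
def interval_length(q):
    return dist.ppf(q + coverage) - dist.ppf(q)

res = minimize_scalar(interval_length, bounds=(1e-9, 1 - coverage - 1e-9),
                      method="bounded")
lo_h, hi_h = dist.ppf(res.x), dist.ppf(res.x + coverage)

print(f"equal-tail tolerance: ({lo_p:.3f}, {hi_p:.3f}), width {hi_p - lo_p:.3f}")
print(f"HDI tolerance:        ({lo_h:.3f}, {hi_h:.3f}), width {hi_h - lo_h:.3f}")
```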

5.
Multivariate Process Capability Index Analysis Based on Operation Cost
A single multivariate process capability index is deficient in reflecting an enterprise's capability to manufacture products with multiple quality characteristics: it cannot capture differences in the importance of individual operations caused by differences in operation costs. On this basis, analyzing the relationship between operation-level quality control and operation cost, a cost-adjusted multivariate process capability index is constructed to reflect the enterprise's cost control capability and product competitiveness. Finally, the modified multivariate process capability index is used, on the one hand, to select the enterprise's quality improvement or control scheme and, on the other hand, to assess the enterprise's actual quality control capability and differences in process cost loss in the manufacture of similar products.
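Purely as a hypothetical illustration of the cost-weighting principle, and not the construction in the paper, the sketch below aggregates per-operation capability values with weights proportional to operation cost, so that a low-capability but high-cost operation penalizes the overall index more; all numbers are invented.

```python
# Hypothetical cost-weighted aggregation of per-operation capability indices.
# This is NOT the paper's index, only a sketch of the weighting principle.
import numpy as np

cp_per_operation = np.array([1.45, 1.10, 1.80])      # capability per operation (assumed)
cost_per_operation = np.array([120.0, 300.0, 80.0])  # operation costs (assumed)

weights = cost_per_operation / cost_per_operation.sum()
weighted_index = float(np.sum(weights * cp_per_operation))
print(f"cost-weighted capability index: {weighted_index:.3f}")
```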

6.
    
《Quality Engineering》2012,24(2):190-202

The original Japanese process capability indices and Shewhart quality control charts (1939) were designed for use with independent, normally distributed data. When tracking inherently non-normal processes that tend to exhibit multiplicative rather than additive error variation, the options for statistical process monitoring and capability estimation are more limited. In particular, for zero-bound process variables such as flatness or parallelism, the normality of the process data is significantly distorted as the process improves and approaches its desired level of zero. In this article, we propose a process capability index estimation methodology for Cp and Cpk for the case of non-normal, zero-bound process data using the delta distribution, a variant of the lognormal distribution. This approach utilizes quantile estimates derived from a proposed modification of lognormal quality control charts (originally introduced by Morrison, 1958, and Ferrell, 1958), thus allowing statistical control to be tracked and achieved before index estimation. When process data are skewed, these process control and capability estimation techniques are superior to those that rely on normality assumptions; when the skewed data are also zero-bound, these techniques provide additional benefits over traditional quantile transform techniques.
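A hedged sketch of quantile-based Cp/Cpk estimation for a zero-bound, positively skewed characteristic using a lognormal fit (the article's delta distribution adds a point mass at zero, omitted here for brevity). The simulated data and specification limits are illustrative, and the quantile-based index form shown is the common Clements-type construction rather than necessarily the article's exact estimator.

```python
# Quantile-based Cp/Cpk for zero-bound, skewed data via a lognormal fit with the
# location fixed at zero. Data and spec limits are illustrative.
import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(3)
data = lognorm.rvs(s=0.5, scale=0.02, size=200, random_state=rng)  # e.g. flatness (mm)
LSL, USL = 0.0, 0.08

shape, _, scale = lognorm.fit(data, floc=0)     # zero-bound characteristic
fitted = lognorm(shape, loc=0, scale=scale)

q_lo, q_med, q_hi = fitted.ppf([0.00135, 0.5, 0.99865])

cp = (USL - LSL) / (q_hi - q_lo)
cpk = min((USL - q_med) / (q_hi - q_med), (q_med - LSL) / (q_med - q_lo))
print(f"quantile-based Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```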

7.
    
Much research effort has recently been focused on methods to deal with non-normal populations. While for weak non-normality the normal approximation is a useful choice (as in Shewhart control charts), moderate to strong skewness requires alternative approaches. In this short communication, we discuss the properties required of such approaches and revisit two new ones. The first approach, for attributes data, assumes that the mean, the variance and the skewness measure can be calculated. These are then incorporated in a modified normal approximation, which preserves these moments. The Shewhart chart is thus extended to skewed attribute distributions (e.g. the geometric distribution). The other approach, for variables data, fits a member of a four-parameter family of distributions. However, unlike similar approaches, sample estimates of at most the second degree are employed in the fitting procedure. This has been shown to result in a better representation of the underlying (unknown) distribution than methods based on four-moment matching. Some numerical comparisons are given. Copyright © 2004 John Wiley & Sons, Ltd.
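To indicate the flavor of a moment-based quantile modification, a standard second-order Cornish–Fisher correction that incorporates the mean, variance and skewness is shown below; this is a generic construction, not necessarily the authors' approximation.

```latex
x_{p} \;\approx\; \mu + \sigma\left[z_{p} + \frac{\gamma}{6}\left(z_{p}^{2}-1\right)\right],
\qquad z_{p}=\Phi^{-1}(p),\quad \gamma=\text{skewness}.
```

Control limits for a skewed attribute distribution can then be placed at p = 0.00135 and p = 0.99865 rather than at μ ± 3σ.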

8.
A normal distribution has a unique position in many engineering fields, and the standard normal distribution table has been widely used for more than a century. There are many situations, however, in which a truncated normal distribution needs to be considered. Although the theoretical foundations of the truncated normal distribution are well established, there has been little work on tabulating the characteristics associated with the truncated normal distribution, such as a cumulative probability, a truncated mean, and a truncated variance. In this article, we provide tables for a singly truncated normal distribution, which may be useful for quality practitioners.
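The tabulated quantities can also be computed directly; a short sketch using scipy's truncated normal, with an assumed mean, standard deviation and single left truncation point, is shown below.

```python
# Cumulative probability, truncated mean, and truncated variance for a singly
# (left-)truncated normal distribution. mu, sigma and the truncation point are
# illustrative values.
import numpy as np
from scipy.stats import truncnorm

mu, sigma = 10.0, 2.0
lower = 7.0                               # single (left) truncation point
a, b = (lower - mu) / sigma, np.inf       # truncnorm uses standardized bounds

dist = truncnorm(a, b, loc=mu, scale=sigma)

x = 11.0
print(f"P(X <= {x} | X > {lower}) = {dist.cdf(x):.4f}")
print(f"truncated mean     = {dist.mean():.4f}")
print(f"truncated variance = {dist.var():.4f}")
```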

9.
    
Two problems greatly affect the use of capability indices: the lack of affinity with the process fraction defective π and the difficulty of dealing with the sampling distributions of their natural estimators. In this paper, a capability index which is in one-to-one correspondence with π is introduced, and simple inferential procedures under a Bayesian perspective are developed to facilitate its use in industrial applications. Copyright © 2001 John Wiley & Sons, Ltd.

10.
Several Issues in Process Capability Analysis and Evaluation
In process quality analysis and control, computing and evaluating process capability indices is a very important task and an important module of computer-aided quality systems. Addressing problems that currently arise in process capability computation and analysis, this article discusses how to sample reasonably, how to test sample data for normality and handle non-normal data, and the confidence interval of Cp and its relationship to sample size, with the aim of providing guidance to quality engineers performing process capability analysis and evaluation in actual production.
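For reference, the normal-theory confidence interval for Cp that the discussion of sample size turns on is the standard chi-square interval below (a textbook result, not quoted from the article), where χ²₍q,n−1₎ denotes the q-quantile of the chi-square distribution with n − 1 degrees of freedom; the interval narrows as the sample size n grows.

```latex
\hat{C}_p\sqrt{\frac{\chi^{2}_{\alpha/2,\,n-1}}{n-1}}
\;\le\; C_p \;\le\;
\hat{C}_p\sqrt{\frac{\chi^{2}_{1-\alpha/2,\,n-1}}{n-1}},
\qquad
\hat{C}_p=\frac{USL-LSL}{6s}.
```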

11.
    
In the context of process capability analysis, the results of most processes are dominated by two or even more quality characteristics, so the assessment of process capability requires that all of them be considered simultaneously. In recent years, many researchers have developed different alternatives of multivariate capability indices using different approaches of construction. In this paper, four of them are compared through the study of their ability to correctly distinguish capable processes from incapable processes under a diversity of simulated scenarios, defining suitable minimum desirable values that allow one to decide whether the process meets or does not meet specifications. In this sense, the properties analyzed can be seen as sensitivity and specificity, assuming that a measure is sensitive if it can detect the lack of capability when it actually exists and specific if it correctly identifies capable processes. Two indices based on ratios of regions and two based on principal component analysis were selected for the study. The scenarios take into account several joint distributions for the quality variables, normal and non-normal, several numbers of variables, and different levels of correlation between them, covering a wide range of possible situations. The results showed that one of the indices has better properties across most scenarios, leading to right conclusions about the state of capability of processes and making it a recommendable option for use in real-world practice. Copyright © 2015 John Wiley & Sons, Ltd.
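A compact sketch of the simulation logic described above: generate samples from "capable" and "incapable" processes, compute a multivariate index for each (here a simple placeholder, not one of the four studied indices), and classify against a minimum desirable value to estimate sensitivity and specificity. All process parameters are assumed.

```python
# Estimate sensitivity and specificity of a capability index by simulation.
# `placeholder_index` is a deliberately simple stand-in for the studied indices.
import numpy as np

rng = np.random.default_rng(11)
LSL, USL = np.array([9.0, 4.0]), np.array([11.0, 6.0])
mu = (LSL + USL) / 2
threshold = 1.0                                   # minimum desirable index value

def placeholder_index(sample):
    """Illustrative: worst-case univariate Cp across characteristics."""
    s = sample.std(axis=0, ddof=1)
    return np.min((USL - LSL) / (6 * s))

def simulate(sigma, n_runs=500, n=50):
    cov = np.diag(sigma ** 2)
    return np.array([placeholder_index(rng.multivariate_normal(mu, cov, size=n))
                     for _ in range(n_runs)])

capable = simulate(sigma=np.array([0.20, 0.20]))     # true per-variable Cp ~ 1.67
incapable = simulate(sigma=np.array([0.45, 0.45]))   # true per-variable Cp ~ 0.74

specificity = np.mean(capable >= threshold)   # capable processes judged capable
sensitivity = np.mean(incapable < threshold)  # incapable processes detected
print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
```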

12.
    
Process capability indices (PCIs) have been widely used in the manufacturing industry, providing numerical measures of process precision, accuracy and performance. Capability measures for processes with a single characteristic have been investigated extensively. However, an industrial product may have more than one quality characteristic. In order to establish performance measures for evaluating the capability of a multivariate manufacturing process, multivariate PCIs should be introduced. In this paper, we analyze the relationship between PCIs and process yield. The PCI ECpk is proposed based on the idea of the six sigma strategy, and there is a one-to-one relationship between the ECpk index and process yield. Following the same idea, we propose a PCI MECpk to measure processes with multiple characteristics. The MECpk index can evaluate the overall process yield of both one-sided and two-sided processes. We also analyze the effect of the covariance matrix on overall process yield and suggest a solution for improving overall process yield. Copyright © 2013 John Wiley & Sons, Ltd.
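The one-to-one yield link mentioned above can be made concrete for a single normal characteristic using the standard yield-based index construction (shown for orientation; the paper's ECpk and MECpk definitions may differ).

```latex
\text{Yield}=\Phi\!\left(\frac{USL-\mu}{\sigma}\right)-\Phi\!\left(\frac{LSL-\mu}{\sigma}\right),
\qquad
S=\frac{1}{3}\,\Phi^{-1}\!\left(\frac{1+\text{Yield}}{2}\right)
\;\Longleftrightarrow\;
\text{Yield}=2\,\Phi(3S)-1 .
```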

13.
Process capability indices (PCIs) are used to describe a manufacturing process by expressing its ability to produce items within the specified limits. These indices are developed under the assumption that the underlying process distribution is normal. In industry, there are many manufacturing processes whose distribution cannot be described by a normal distribution. In such cases, those PCIs will give misleading results about the process. The most commonly used approach for analysing non-normal process data is to fit a standard non-normal distribution (e.g., Weibull, gamma) or a family of distribution curves (e.g., Pearson, Johnson) to the process data and then to estimate the percentile points from the fitted distribution, which can be used to compute generalized PCIs. In this article, we outline the procedure using the generalized lambda distribution (GLD) curve for modeling a set of process data and for estimating percentile points in order to compute generalized PCIs. The four-parameter GLD can assume a wide variety of curve shapes and hence is very useful for representing data when the underlying model is unknown. Compared to the Pearson and Johnson families of distributions, the GLD is computationally simpler and more flexible. The article provides all necessary formulas for fitting a GLD curve, estimating its parameters, performing goodness-of-fit tests, and computing generalized PCIs. An example is used to illustrate the calculations, which can be easily performed using spreadsheets.
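A short sketch of the final step of the GLD approach: once λ1–λ4 have been fitted, the Ramberg–Schmeiser quantile function supplies the percentile points used in generalized PCIs. The λ values and specification limits below are assumed rather than fitted to data.

```python
# Percentile points from a (pre-fitted) Ramberg-Schmeiser GLD and the resulting
# generalized Cp/Cpk. Lambda values and spec limits are illustrative.
import numpy as np

def gld_quantile(u, lam1, lam2, lam3, lam4):
    """Ramberg-Schmeiser GLD quantile function Q(u)."""
    u = np.asarray(u, dtype=float)
    return lam1 + (u ** lam3 - (1.0 - u) ** lam4) / lam2

lam = (10.0, 0.45, 0.12, 0.32)          # assumed fitted GLD parameters
LSL, USL = 8.0, 13.0

q_lo, q_med, q_hi = gld_quantile([0.00135, 0.5, 0.99865], *lam)

cp = (USL - LSL) / (q_hi - q_lo)
cpk = min((USL - q_med) / (q_hi - q_med), (q_med - LSL) / (q_med - q_lo))
print(f"generalized Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```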

14.
    
The recently proposed Weibull process capability indices (PCIs) measure how many times the standard deviation (σx) fits within the tolerance specifications, but because they accurately estimate neither the log-mean (μx) nor σx, the resulting PCIs are biased. This is because μx and σx are both estimated without considering the effect that the sample size (n) has on their values; hence, μx is underestimated and σx is overestimated. In response, in this paper μx and σx are estimated as functions of n. In particular, the PCIs' efficiency is based on the following facts: (1) the derived n value is unique and completely determines η; (2) the μx value completely determines the η value; and (3) the σx value completely determines the β value. Thus, since μx and σx are now functions of n and completely determine β and η, the proposed PCIs are unbiased and fully represent the analyzed process. Finally, a step-by-step numerical application is given.

15.
16.
    
Within an industrial manufacturing environment, Process Capability Indices (PCIs) enable engineers to assess the process performance and ultimately improve the product quality. Despite the fact that most industrial products manufactured today possess multiple quality characteristics, the vast majority of the literature within this area primarily focuses on univariate measures to assess process capability. One particular univariate index, Cpm, is widely used to account for deviations between the location of the process mean and the target value of a process. While some researchers have sought to develop multivariate analogues of Cpm, modeling the loss in quality associated with multiple quality characteristics continues to remain a challenge. This paper proposes a multivariate PCI that more appropriately estimates quality loss, while offering greater flexibility in conforming to various industrial applications, and maintaining a more realistic approach to assessing process capability. Copyright © 2010 John Wiley & Sons, Ltd.
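As background for loss-based constructions of this kind, the expected multivariate quadratic (Taguchi-type) loss admits the standard decomposition below, separating dispersion from off-target deviation; this is a textbook identity stated for context, not the paper's specific index, with A a positive-definite weight matrix.

```latex
E\!\left[(\mathbf{X}-\mathbf{T})^{\top}\mathbf{A}(\mathbf{X}-\mathbf{T})\right]
=\operatorname{tr}(\mathbf{A}\boldsymbol{\Sigma})
+(\boldsymbol{\mu}-\mathbf{T})^{\top}\mathbf{A}(\boldsymbol{\mu}-\mathbf{T}),
```

the direct multivariate analogue of the σ² + (μ − T)² term in the denominator of the univariate Cpm.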

17.
There have been many investigations of the capability indices Cp, Cpk, Cpm, and Cpmk for the common situation in which the target is the midpoint of the tolerance interval. However, only a few investigations deal with the specific case of asymmetrical tolerances. In that particular case, a number of symmetrical and asymmetrical indices have been put forward, but there is no full literature treatment or synthesis showing the similarity between those indices and the common ones. We intend here to demonstrate that the algebraic links between the indices Cp, Cpk, Cpm, and Cpmk are similar to the ones relating the symmetrical indices proposed in the case of asymmetrical tolerances. In that case, the algebraic structure allows us to propose families of asymmetrical indices. An example based on a pharmaceutical filling operation is used to illustrate the application.
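For reference, the symmetric-tolerance definitions and the algebraic link in question are the standard forms below, with T the target, m = (USL + LSL)/2 the midpoint and d = (USL − LSL)/2 the half-width.

```latex
C_p=\frac{USL-LSL}{6\sigma},\qquad
C_{pk}=\frac{d-|\mu-m|}{3\sigma}=C_p\!\left(1-\frac{|\mu-m|}{d}\right),
```
```latex
C_{pm}=\frac{USL-LSL}{6\sqrt{\sigma^{2}+(\mu-T)^{2}}},\qquad
C_{pmk}=\frac{d-|\mu-m|}{3\sqrt{\sigma^{2}+(\mu-T)^{2}}}
=C_{pm}\!\left(1-\frac{|\mu-m|}{d}\right).
```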

18.
    
《Quality Engineering》2012,24(1):24-32

19.
    
One of the main objectives of response surface methodology is to find the operating settings that optimize the mean function. When estimating the optimum settings, it is highly important to take the response variance into account. Data transformations are frequently used to eliminate variance heterogeneity. Important references in response surface methodology such as Box and Draper [4] and Myers et al. [1] recommend transforming the data prior to process optimization, if needed. Process optimization is initiated if the response transformation successfully stabilizes the variance. In this paper, I oppose using such a practice without a complete understanding of its implications. It basically implies that variation is a key characteristic of the process under study and postulates a relationship between the mean and the variance. When this relationship is ignored, the optimum settings found on the transformed scale may have very high variance. A solution based on ridge analysis is presented. Practitioners must proceed with caution when applying data transformations to their datasets.
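As an illustration of the mean–variance coupling the author warns about, consider the common log transformation: if log Y is modeled as normal with mean μ(x) and constant variance σ², then on the original scale (standard lognormal moments, given only as an illustration)

```latex
E[Y]=e^{\mu(\mathbf{x})+\sigma^{2}/2},
\qquad
\operatorname{Var}(Y)=\bigl(E[Y]\bigr)^{2}\bigl(e^{\sigma^{2}}-1\bigr),
```

so the standard deviation grows in proportion to the mean, and a setting chosen to maximize μ(x) is also a high-variance setting on the original scale.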

20.
    
In the statistical literature on studying the capability of processes through the use of indices, Cpm appears to be one of the most widely used capability indices, and its estimation has attracted much interest. In this article, a new method for constructing approximate confidence intervals or lower confidence limits for this index is suggested. The method is based on an approximation of the non-central chi-square distribution proposed by Pearson. Its coverage appears to be more satisfactory than that achieved by either of the two most widely used methods proposed by Boyles, in situations where one is interested in assessing a lower confidence limit for Cpm. This is supported by the results of an extensive simulation study. Copyright © 2004 John Wiley & Sons, Ltd.
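For orientation, the distributional result underlying such intervals (a standard result, with s_n² = n⁻¹Σᵢ(xᵢ − x̄)²) is that the denominator of the Cpm estimator follows a scaled non-central chi-square law; approximate intervals then follow from approximating that law, here by Pearson's approach.

```latex
\frac{n\bigl(s_n^{2}+(\bar{x}-T)^{2}\bigr)}{\sigma^{2}}
=\sum_{i=1}^{n}\frac{(x_i-T)^{2}}{\sigma^{2}}
\sim\chi'^{2}_{n}(\lambda),\qquad
\lambda=\frac{n(\mu-T)^{2}}{\sigma^{2}},
\qquad
\hat{C}_{pm}=C_{pm}\sqrt{\frac{n+\lambda}{\chi'^{2}_{n}(\lambda)}} .
```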
