Similar Documents
 A total of 20 similar documents were retrieved (search time: 31 ms).
1.
Monitoring a fraction arises in many manufacturing applications and also in service applications. The traditional p‐chart is easy to use and design, but it is difficult to achieve the desired false alarm rate with it. We propose a two‐sided CUSUM Arcsine method that achieves both large and small desired false alarm rates for an in‐control probability anywhere between 0 and 1. The parameters of the new method are calculated easily, without the tables, simulation, or Markov chain analysis used by many of the existing methods. The proposed method detects both increases and decreases and works for constant and Poisson‐distributed sample sizes. The CUSUM Arcsine method also has superior sensitivity compared with other easily designed existing methods for monitoring binomially distributed data. This paper includes an extensive literature review and a taxonomy of the existing monitoring methods for a fraction. Copyright © 2009 John Wiley & Sons, Ltd.
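A minimal sketch of the general idea (not the authors' exact design): the sample fraction is variance-stabilized with the arcsine transform, standardized against its in-control value, and fed into a two-sided tabular CUSUM. The function name, the reference value k, and the decision interval h below are illustrative placeholders.

    import math

    def arcsine_cusum(counts, sizes, p0, k=0.5, h=4.0):
        """Two-sided CUSUM on arcsine-transformed fractions (illustrative sketch).

        counts[i] defectives out of sizes[i] items; p0 is the in-control fraction.
        y = 2*arcsin(sqrt(x/n)) is roughly N(2*arcsin(sqrt(p0)), 1/n) in control,
        so z = (y - target)*sqrt(n) is approximately standard normal.
        """
        target = 2.0 * math.asin(math.sqrt(p0))
        c_plus, c_minus, alarms = 0.0, 0.0, []
        for i, (x, n) in enumerate(zip(counts, sizes)):
            y = 2.0 * math.asin(math.sqrt(x / n))
            z = (y - target) * math.sqrt(n)          # approximate standardization
            c_plus = max(0.0, c_plus + z - k)        # accumulates evidence of an increase
            c_minus = max(0.0, c_minus - z - k)      # accumulates evidence of a decrease
            if c_plus > h or c_minus > h:
                alarms.append(i)
                c_plus, c_minus = 0.0, 0.0           # restart after a signal
        return alarms

With placeholder constants such as these, the statistic behaves like a CUSUM of standard normal variables; the paper's contribution is a direct way of choosing the constants for a desired false alarm rate without tables, simulation, or Markov chain analysis.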

2.
In many applications, Poisson count data with varying sample sizes are monitored using statistical process control charts. Weighted CUSUM charts have been developed to deal with the effect of the varying sample sizes. However, some of them use only limited information from the sample size or the count data when assigning the weights. To extract more information about the process, self-information weight functions are developed based on both the sample size and the observed count data, and weighted CUSUM charts are then proposed with the self-information-based weights. Simulation studies show that the self-information-based weighted CUSUM charts perform better than the benchmark methods in detecting small shifts. Moreover, the performance of the proposed method with estimated parameters is investigated via simulation. Finally, an example is given to illustrate the application of the proposed weighted CUSUM charts.
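As an illustration only (the paper's actual weight functions may well differ), the sketch below forms a weighted upper Poisson CUSUM in which each increment is scaled by the self-information, -log P(x_t), of the observed count under the in-control model. The names lam0, lam1, and h are assumed inputs: the in-control and out-of-control rates per unit of sample size and the decision interval.

    from math import log, lgamma

    def pois_logpmf(x, mu):
        # log of the Poisson pmf, used to compute the self-information -log P(x)
        return x * log(mu) - mu - lgamma(x + 1)

    def self_info_wcusum(counts, sizes, lam0, lam1, h=5.0):
        """Upper CUSUM for Poisson counts with varying sample sizes (sketch).

        k_t is the usual Poisson-CUSUM reference value for detecting a shift from
        lam0 to lam1; each increment is additionally weighted by the (scaled)
        self-information of the observed count under the in-control model.
        This weight choice is illustrative, not the paper's exact construction.
        """
        c, alarms = 0.0, []
        for t, (x, n) in enumerate(zip(counts, sizes)):
            mu0, mu1 = n * lam0, n * lam1
            k_t = (mu1 - mu0) / log(mu1 / mu0)     # standard Poisson CUSUM reference
            w = -pois_logpmf(x, mu0) / n           # self-information per unit of sample size
            c = max(0.0, c + w * (x - k_t))
            if c > h:
                alarms.append(t)
                c = 0.0
        return alarms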

3.
Intrusion detection systems have a vital role in protecting computer networks and information systems. In this article, we apply a statistical process control (SPC) monitoring concept to a certain type of traffic data to detect network intrusions. We propose an SPC‐based intrusion detection process and describe the source and preparation of the data used in this article. We extracted sample data sets that represent various situations, calculated event intensities for each situation, and stored these sample data sets in a data repository for use in future research. This article applies SPC charting methods to intrusion detection. In particular, it uses the basic security module host audit data from the MIT Lincoln Laboratory and applies the Shewhart chart, the cumulative sum chart, and the exponentially weighted moving average chart to detect a denial‐of‐service intrusion attack. The case study shows that these SPC techniques are useful for detecting and monitoring intrusions. Copyright © 2013 John Wiley & Sons, Ltd.
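A hedged sketch of the last of the three charts mentioned: an EWMA applied to a stream of per-interval event intensities (counts per unit time) derived from audit data. The in-control mean and standard deviation, the smoothing constant, and the width multiplier are assumed inputs, estimated from attack-free data.

    def ewma_chart(intensities, mu0, sigma0, lam=0.2, L=3.0):
        """EWMA monitoring of event intensities (illustrative sketch).

        intensities: sequence of per-interval event rates.
        Returns the indices of intervals whose EWMA statistic exceeds its limits.
        """
        z, alarms = mu0, []
        for t, x in enumerate(intensities):
            z = lam * x + (1.0 - lam) * z
            # exact time-varying variance of the EWMA statistic under independence
            var_z = (lam / (2.0 - lam)) * (1.0 - (1.0 - lam) ** (2 * (t + 1))) * sigma0 ** 2
            ucl = mu0 + L * var_z ** 0.5
            lcl = mu0 - L * var_z ** 0.5
            if z > ucl or z < lcl:
                alarms.append(t)
        return alarms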

4.
The Poisson distribution plays a dominant role in determining the mean of the distribution of the number of defective units (e.g. tablets, capsules) per sample, based on several samples of the same size. If, however, the data come from samples with at least one defective unit in each sample, so that the zero-defective category is absent, then the usual formulas for the Poisson distribution and for the mean number of defective units are no longer tenable. In this presentation, appropriate formulas for the Poisson distribution, called the truncated Poisson distribution, and for its mean, θ, are developed. The maximum likelihood estimation of the parameter θ by numerical (iterative) methods is described in detail. The procedure for conducting the chi-square goodness-of-fit test of the experimental data against the truncated Poisson distribution is demonstrated. The results of analyses of two recent experiments based on the methods described above are presented and interpreted.
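For the zero-truncated Poisson distribution the maximum likelihood condition reduces to solving xbar = θ / (1 − exp(−θ)), where xbar is the mean count over the truncated samples. The sketch below uses one of several possible numerical schemes, a simple fixed-point iteration; the function name and tolerances are illustrative.

    import math

    def truncated_poisson_mle(counts, tol=1e-10, max_iter=1000):
        """MLE of theta for the zero-truncated Poisson distribution (sketch).

        Solves xbar = theta / (1 - exp(-theta)) by the fixed-point iteration
        theta <- xbar * (1 - exp(-theta)); counts must all be >= 1.
        """
        xbar = sum(counts) / len(counts)
        theta = xbar                      # starting value; xbar always overestimates theta
        for _ in range(max_iter):
            new = xbar * (1.0 - math.exp(-theta))
            if abs(new - theta) < tol:
                return new
            theta = new
        return theta

    # example: truncated_poisson_mle([1, 2, 1, 3, 1, 1, 2]) returns a value below xbar = 11/7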

5.
A multivariate dispersion control chart monitors changes in the process variability of multiple correlated quality characteristics. In this article, we investigate and compare the performance of charts designed to monitor variability on the basis of individual and grouped multivariate observations. We compare one of the most well-known methods for monitoring individual observations—a multivariate exponentially weighted mean squared deviation (MEWMS) chart—with various charts based on grouped observations. In addition, we compare charts based on monitoring with overlapping and nonoverlapping subgroups. We recommend using charts based on overlapping subgroups when monitoring with subgroup data. The effect of subgroup size is also investigated. The steady-state average time to signal is used as the performance measure. We show that monitoring methods based on individual observations are the quickest in detecting sustained shifts in the process variability. We use a simulation study to obtain our results and illustrate them with a case study.
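A rough sketch of an MEWMS-style statistic for individual multivariate observations: an exponentially weighted estimate of the covariance of deviations from the known in-control mean. The observations are assumed standardized so that the in-control covariance is the identity, and tracking the trace of the smoothed matrix is an illustrative summary choice; in practice the limit is calibrated by simulation to a target in-control run length.

    import numpy as np

    def mewms_trace(X, mu0, omega=0.1):
        """Exponentially weighted mean-squared-deviation statistic (sketch).

        X: (T, p) array of individual observations; mu0: in-control mean vector.
        Returns the trace of the smoothed outer-product matrix at each step,
        which grows when the process dispersion increases.
        """
        X = np.asarray(X, dtype=float)
        mu0 = np.asarray(mu0, dtype=float)
        p = X.shape[1]
        S = np.eye(p)                       # in-control covariance assumed to be the identity
        stats = []
        for x in X:
            d = x - mu0
            S = omega * np.outer(d, d) + (1.0 - omega) * S
            stats.append(np.trace(S))
        return np.array(stats)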

6.
The Poisson distribution plays a dominant role in determining the mean of the distribution of the number of defective units (e.g. tablets, capsules) per sample, based on several samples of the same size. If, however, the data come from samples with at least one defective unit in each sample, so that the zero-defective category is absent, then the usual formulas for the Poisson distribution and for the mean number of defective units are no longer tenable. In this presentation, appropriate formulas for the Poisson distribution, called the truncated Poisson distribution, and for its mean, θ, are developed. The maximum likelihood estimation of the parameter θ by numerical (iterative) methods is described in detail. The procedure for conducting the chi-square goodness-of-fit test of the experimental data against the truncated Poisson distribution is demonstrated. The results of analyses of two recent experiments based on the methods described above are presented and interpreted.

7.
This paper deals with the simultaneous statistical process control of several Poisson variables. The practitioner of this type of monitoring may employ a multiple scheme, i.e. one chart for controlling each variable, or a multivariate scheme, based on monitoring all the variables with a single control chart. A user of the multivariate schemes can choose among, for example, three options: (i) a control chart based on the sum of the different Poisson variables; (ii) a control chart on the maximum value of the different Poisson variables; and (iii) in the case of only two variables, a chart that monitors the difference between them. In this paper, these control charts are studied when applied to the control of p = 2, 3 and 4 variables. In addition, the optimization of a set of univariate Poisson control charts (the multiple scheme) is studied. The main purpose of this paper is to help practitioners select the most suitable scheme for their production process. Towards this goal, a user-friendly Windows© computer program has been developed. The program returns the best control limits for each control chart and makes a complete comparison of performance among all the preceding schemes. Copyright © 2013 John Wiley & Sons, Ltd.
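A sketch of upper limits for the first two multivariate options, assuming independent Poisson variables with known in-control means: one chart on the sum (again Poisson) and one on the maximum, the latter handled here with a Bonferroni-style allocation of the false alarm probability across variables (an assumption, not necessarily the paper's design). The per-chart false alarm probability alpha is an assumed input.

    from math import exp

    def poisson_ppf(q, mu):
        # smallest k with P(X <= k) >= q for X ~ Poisson(mu)
        k, pmf = 0, exp(-mu)
        cdf = pmf
        while cdf < q:
            k += 1
            pmf *= mu / k
            cdf += pmf
        return k

    def sum_and_max_limits(mus, alpha=0.0027):
        """Upper control limits for the 'sum' and 'max' Poisson schemes (sketch)."""
        ucl_sum = poisson_ppf(1.0 - alpha, sum(mus))
        # give each variable the same share of alpha so that, under independence,
        # the probability that any variable exceeds its own limit is roughly alpha
        per_var = 1.0 - alpha / len(mus)
        ucl_max = [poisson_ppf(per_var, mu) for mu in mus]
        return ucl_sum, ucl_max

    # usage: sum_and_max_limits([4.0, 6.0, 3.0]) -> (UCL for X1+X2+X3, per-variable UCLs)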

8.
This paper proposes a new space–time cumulative sum (CUSUM) approach for detecting changes in spatially distributed Poisson count data subject to linear drifts. We develop expressions for the likelihood ratio test monitoring statistics and the change point estimators. The effectiveness of the proposed monitoring approach in detecting and identifying trend-type shifts is studied by simulation under various shift scenarios in regional counts. It is shown that designing the space–time monitoring approach specifically for linear trends can enhance the change point estimation accuracy significantly. A case study for male thyroid cancer outbreak detection is presented to illustrate the application of the proposed methodology in public health surveillance.

9.
This paper compares the economic performance of CUSUM and Shewhart schemes for monitoring the process mean. We develop new simple models for the economic design of Shewhart schemes and more accurate ways to evaluate the economic performance of CUSUM schemes. The results of the comparative analysis show that the economic advantage of using a CUSUM scheme rather than the simpler Shewhart chart is substantial only when a single measurement is available at each sampling instance, i.e., only when the sample size is always n = 1, or when the sample size is constrained to low values.

10.
In real-life applications, many process‐monitoring problems in statistical process control are based on attribute data resulting from quality characteristics that cannot be measured on numerical or quantitative scales. For the monitoring of such data, a new attribute control chart is proposed in this study, namely the Poisson progressive mean (PPM) control chart. The performance of the PPM chart is compared with existing charts used for the monitoring of Poisson processes, such as the Shewhart c‐chart, the Poisson exponentially weighted moving average chart, the Poisson double exponentially weighted moving average chart, and the Poisson cumulative sum chart. The average run length comparison indicates the superior shift detection ability of the PPM chart. This study will help quality practitioners to choose an efficient attribute control chart.
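A hedged sketch of a progressive-mean style statistic for Poisson counts: the running average of all observations so far, with limits that tighten as data accumulate. The simple 3-sigma limit structure below follows from the exact variance lam0/t of the running mean and is only an illustrative simplification; the paper derives its own chart constants.

    def poisson_progressive_mean(counts, lam0, L=3.0):
        """Progressive mean chart for Poisson counts (illustrative sketch).

        The plotted statistic at time t is the mean of the first t counts; in
        control it has mean lam0 and variance lam0/t, so the limits narrow with t.
        """
        total, alarms = 0.0, []
        for t, x in enumerate(counts, start=1):
            total += x
            pm = total / t
            half_width = L * (lam0 / t) ** 0.5
            if abs(pm - lam0) > half_width:
                alarms.append(t - 1)
        return alarms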

11.
In many industrial scenarios, on‐line monitoring of quality characteristics computed as the ratio of two normal random variables can be required. Potential industrial applications include: monitoring of processes where the correct proportion of a property between two ingredients or elements within a product should be maintained under statistical control; implementation of quality control procedures where the performance of a product is measured as a ratio before and after some specific operation, for example a chemical reaction following the introduction of an additive into a product; and monitoring of a chemical or physical property of a product that is itself defined and computed as a ratio. This paper considers Phase II Shewhart control charts with each subgroup consisting of n > 1 sample units. From one subgroup to another, the size of each sample unit, upon which a single measurement is made, can be changed. An approximation based on the normal distribution is used to handle the ratio distribution efficiently. Several tables are generated and discussed to show the statistical performance of the investigated chart for known and random shift sizes affecting the in‐control ratio. An illustrative example from the food industry is given. Copyright © 2014 John Wiley & Sons, Ltd.
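A sketch of the kind of normal approximation the abstract refers to, based on the standard delta-method result for a ratio of correlated normals with small coefficients of variation. The in-control parameters, the three-sigma multiplier, and the assumption that the subgroup statistic is the average of n unit ratios are all placeholders, not the paper's exact chart design.

    def ratio_shewhart_limits(mu_x, mu_y, sd_x, sd_y, rho, n, L=3.0):
        """Approximate Shewhart limits for the mean ratio of a subgroup of size n (sketch).

        Delta-method approximation for Z = X/Y:
          E[Z]   ~ mu_x / mu_y
          Var[Z] ~ (mu_x/mu_y)^2 * (gx^2 + gy^2 - 2*rho*gx*gy),
        where gx, gy are the coefficients of variation of X and Y.
        """
        gx, gy = sd_x / mu_x, sd_y / mu_y
        center = mu_x / mu_y
        var_z = center ** 2 * (gx ** 2 + gy ** 2 - 2.0 * rho * gx * gy)
        half = L * (var_z / n) ** 0.5          # subgroup statistic assumed to average n ratios
        return center - half, center + half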

12.
We consider the problem of monitoring a proportion with time-varying sample sizes. Control charts are generally designed by assuming a fixed sample size or a priori knowledge of a sample size probability distribution. Sometimes it is not possible to know, or accurately estimate, a sample size distribution, or the distribution may change over time. An improper assumption for the sample size distribution could lead to undesirable performance of the control chart. To handle this problem, we propose the use of dynamic probability control limits (DPCLs), which are determined successively as the sample sizes become known. The method is based on keeping the conditional probability of a false alarm at a predetermined level, given that there has not been any earlier false alarm. The control limits change dynamically, and the in-control performance of the chart can be held at the desired level for any sequence of sample sizes. Simulation results support this, showing that no assumption about the sample size distribution is needed when the proposed approach is used.
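A sketch of how dynamic probability control limits can be computed by simulation for an upper CUSUM of a proportion: a population of simulated in-control chart paths is propagated with whatever sample size actually arrives, the limit for the period is set at the (1 − alpha) empirical quantile of those paths, and paths that would have signalled are replaced so that the false alarm probability stays conditional on no earlier alarm. The names, the CUSUM form, and the resampling details are illustrative, not the authors' exact algorithm.

    import random
    from math import log

    def dpcl_step(paths, n_t, p0, p1, alpha=0.005):
        """One monitoring period of a simulation-based DPCL scheme (sketch).

        paths: list of simulated in-control CUSUM values surviving so far.
        Returns (updated paths, control limit h_t for this period).
        """
        # reference value of a binomial CUSUM for detecting a shift from p0 to p1
        k = n_t * log((1 - p0) / (1 - p1)) / log((p1 * (1 - p0)) / (p0 * (1 - p1)))
        updated = []
        for c in paths:
            x = sum(random.random() < p0 for _ in range(n_t))   # simulated in-control count
            updated.append(max(0.0, c + x - k))
        updated.sort()
        h_t = updated[int((1.0 - alpha) * len(updated)) - 1]    # (1 - alpha) empirical quantile
        # keep only paths that did not signal, then resample back to the original size
        survivors = [c for c in updated if c <= h_t]
        paths = [random.choice(survivors) for _ in range(len(updated))]
        return paths, h_t

In use, paths is initialised as a list of zeros (the CUSUM's starting value) of size, say, 10,000, and dpcl_step is called each period once the realised sample size n_t becomes known; the observed CUSUM is compared against the returned h_t.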

13.
The traditional process-monitoring techniques used for high-quality processes have several drawbacks, such as a high false alarm rate and poor detection ability. A recent and promising idea for monitoring such processes is the use of time-between-events (TBE) control charts. However, the available TBE control charts have been developed in a nonadaptive fashion assuming a homogeneous Poisson process. There are many situations where adaptive monitoring is needed, for example in health, flood, food, system, or terrorist surveillance, and the existing control charts are then not useful, especially for sequential monitoring. This article introduces new adaptive TBE control charts for high-quality processes based on the nonhomogeneous Poisson process with a power law intensity. In particular, probability control limits are used to develop the control charts. The proposed methodology yields control limits that are dynamic and suitable for online process monitoring, with the additional advantage of monitoring a process whose underlying failure rate may be changing over time. The average run length and the coefficient of variation of the run length distribution are used to assess the performance of the proposed control charts. Besides simulation studies, we also discuss three examples to highlight the application of the proposed charts.
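A sketch of probability control limits for a time between events under a power-law (Weibull-intensity) nonhomogeneous Poisson process: given the time s of the last event, the waiting time W to the next event has survival function exp(−[Λ(s+w) − Λ(s)]) with Λ(t) = (t/θ)^β, so two-sided limits follow by inverting that expression at 1 − α/2 and α/2. The parameter names and the equal-tail split are illustrative assumptions.

    from math import log

    def power_law_tbe_limits(s, theta, beta, alpha=0.0027):
        """Two-sided probability limits for the next time between events (sketch).

        s: time of the most recent event; intensity lambda(t) = (beta/theta)*(t/theta)**(beta-1),
        cumulative intensity Lambda(t) = (t/theta)**beta.  Returns (LCL, UCL) for the
        waiting time W to the next event, with alpha/2 probability in each tail.
        """
        Lam_s = (s / theta) ** beta

        def invert(p):
            # solve P(W > w) = exp(-(Lambda(s + w) - Lambda(s))) = p for w
            target = Lam_s - log(p)                 # value Lambda(s + w) must reach
            return theta * target ** (1.0 / beta) - s

        lcl = invert(1.0 - alpha / 2.0)             # unusually short waits (rate increase)
        ucl = invert(alpha / 2.0)                   # unusually long waits (rate decrease)
        return lcl, ucl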

14.
Bernoulli processes have been monitored using a wide variety of techniques in statistical process control. The data consist of information on successive items classified as conforming (nondefective) or nonconforming (defective). In some cases, the probability of obtaining a nonconforming item is very small; this is known as a high-quality process. This area of statistical process control is also applied to health‐related monitoring, where the incidence rate of a medical problem such as a congenital malformation is of interest. In these applications, standard Shewhart control charts based on the binomial distribution are no longer useful. In our expository paper, we review the methods implemented for these scenarios and present ideas for future work in this area. We offer advice to practitioners and present a comprehensive literature review for researchers. Copyright © 2011 John Wiley & Sons, Ltd.

15.
The importance of statistical process control (SPC) techniques in quality improvement is well recognized in industry. However, most conventional SPC techniques have been developed under the assumption of independent, identically distributed, and normally distributed observations. With advances in sensing and data capturing technologies, large volumes of data are being routinely collected from individual units in manufacturing industries. These data are often autocorrelated and skewed. Conventional SPC techniques can lead to false alarms or other types of poor performance when monitoring such data. There is a great need for process control techniques for variation reduction in these environments. Much recent research has focused on the development of appropriate SPC techniques for autocorrelated data, but few studies have considered the impact of non‐normality on these techniques. This paper investigates the effect of skewness on conventional autocorrelated SPC techniques, and provides an effective approach based on a scaled weighted variance approach to improve SPC performance in such an environment. Copyright © 2005 John Wiley & Sons, Ltd.

16.
Recently there has been an increasing interest in techniques of process monitoring involving geometrically distributed quality characteristics, as many types of attribute data are neither binomial nor Poisson distributed. The geometric distribution is particularly useful for monitoring high‐quality processes based on cumulative counts of conforming items. However, a geometrically distributed quantity can never be adequately approximated by a normal distribution that is typically used for setting 3‐sigma control limits. In this paper, some transformation techniques that are appropriate for geometrically distributed quantities are studied. Since the normal distribution assumption is used in run rules and advanced process‐monitoring techniques such as the cumulative sum or exponentially weighted moving average chart, data transformation is needed. In particular, a double square root transformation, which can be performed using simple spreadsheet software, can be applied to transform geometrically distributed quantities with satisfactory results. Simulated and actual data are used to illustrate the advantages of this procedure. Copyright © 2000 John Wiley & Sons, Ltd.
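A minimal sketch of the transformation discussed: the fourth root (double square root) of each geometric count, followed by ordinary 3-sigma individuals-style limits computed from the transformed data. The function name and the use of sample moments for the limits are illustrative.

    def double_sqrt_limits(counts, L=3.0):
        """y = x**0.25 makes geometric counts roughly symmetric; 3-sigma limits follow (sketch).

        counts: cumulative counts of conforming items between nonconforming ones.
        Returns (LCL, centre line, UCL) on the transformed scale.
        """
        y = [x ** 0.25 for x in counts]
        m = sum(y) / len(y)
        s = (sum((v - m) ** 2 for v in y) / (len(y) - 1)) ** 0.5
        return max(0.0, m - L * s), m, m + L * s

    # a new count x signals if x**0.25 falls outside these limits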

17.
Baddeley D, Carl C, Cremer C. Applied Optics, 2006, 45(27): 7056–7064.
To remove the axial sidelobes from 4Pi images, deconvolution forms an integral part of 4Pi microscopy. As a result of its high axial resolution, the 4Pi point spread function (PSF) is particularly susceptible to imperfect optical conditions within the sample. This is typically observed as a shift in the position of the maxima under the PSF envelope. A significantly varying phase shift renders deconvolution procedures based on a spatially invariant PSF essentially useless. We present a technique for computing the forward transformation in the case of a varying phase at a computational expense of the same order of magnitude as that of the shift invariant case, a method for the estimation of PSF phase from an acquired image, and a deconvolution procedure built on these techniques.

18.
This article proposes two Shewhart charts, denoted npxy and npw charts, which use attribute inspection to control the mean vector (μx; μy)′ of bivariate processes. The units of the sample are classified as first‐class, second‐class, or third‐class units, according to discriminate limits and the values of their two quality characteristics, X and Y. When the npxy chart is in use, the monitoring statistic is M = N1 + N2, where N1 and N2 are the number of sample units with a second‐class and third‐class classification, respectively. When the npw chart is in use, the monitoring statistic is W = N1 + 2N2. We assume that the quality characteristics X and Y follow a bivariate normal distribution and that the assignable cause shifts the mean vector without changing the covariance matrix. In general, the synthetic npxy and npw charts require samples about twice as large to outperform the T2 chart. Copyright © 2014 John Wiley & Sons, Ltd.

19.
An efficient surveillance system is a crucial factor in identifying, monitoring and tackling outbreaks of infectious diseases. Scarcity of data and limited economic resources require a targeted effort from public health authorities. In this paper, we propose a mathematical method to identify areas where surveillance is critical and low reporting rates might leave epidemics undetected. Our approach combines the use of reference-based susceptible–exposed–infectious models and observed reporting data; we propose two specifications, for constant and for time-varying surveillance. Our case study is centred on the spread of the raccoon rabies epidemic in the state of New York, using data collected between 1990 and 2007. Both methods offer a feasible solution for analysing and identifying areas of intervention.

20.
An efficient alternative to the S control chart for detecting small shifts in process variability is proposed, based on a moving average of the sample standard deviation s. Control limit factors are derived for the chart for different values of the sample size and the span w. The performance of the moving average S chart is compared with that of the S chart in terms of average run length. The results show that, for the values of w considered, the moving average S chart outperforms the S chart for small and moderate shifts in process variability.
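A hedged sketch of the statistic and limits described: a span-w moving average of subgroup standard deviations, with limits built from the usual c4 constant so that the moving average of w independent s values has standard error sigma*sqrt(1 − c4^2)/sqrt(w). The control-limit factors tabulated in the paper may differ from this simple construction, and sigma0 is an assumed known in-control standard deviation.

    from math import gamma, sqrt

    def c4(n):
        # unbiasing constant: E[s] = c4(n) * sigma for a normal sample of size n
        return sqrt(2.0 / (n - 1)) * gamma(n / 2.0) / gamma((n - 1) / 2.0)

    def moving_average_s_chart(s_values, n, sigma0, w=3, L=3.0):
        """Moving average S chart (illustrative sketch).

        s_values: subgroup standard deviations from samples of size n.
        Signals when the span-w moving average of s leaves a band of +/- L
        standard errors around the centre line c4(n)*sigma0.
        """
        c = c4(n)
        se_s = sigma0 * sqrt(1.0 - c * c)          # standard deviation of a single s
        alarms = []
        for t in range(len(s_values)):
            span = s_values[max(0, t - w + 1): t + 1]
            ma = sum(span) / len(span)
            se_ma = se_s / sqrt(len(span))         # fewer than w values during start-up
            ucl = c * sigma0 + L * se_ma
            lcl = max(0.0, c * sigma0 - L * se_ma)
            if ma > ucl or ma < lcl:
                alarms.append(t)
        return alarms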
