Similar Literature
Found 20 similar results (search time: 31 ms)
1.
This article proposes a Cumulative Sum (CUSUM) scheme, called the TC‐CUSUM scheme, for monitoring a negative or hazardous event. This scheme is developed using a two‐dimensional Markov model. It is able to check both the time interval (T) between occurrences of the event and the size (C) of each occurrence. For example, a traffic accident may be defined as an event, and the number of injured victims in each case is the event size. Our studies show that the TC‐CUSUM scheme is several times more effective than many existing charts for event monitoring, so that cost or loss incurred by an event can be reduced by using this scheme. Moreover, the TC‐CUSUM scheme performs more uniformly than other charts for detecting both T shift and C shift, as well as the joint shift in T and C. The improvement in the performance is achieved because of the use of the CUSUM feature and the simultaneous monitoring of T and C. The TC‐CUSUM scheme can be applied in manufacturing systems, and especially in non‐manufacturing sectors (e.g. supply chain management, health‐care industry, disaster management, and security control). Copyright © 2009 John Wiley & Sons, Ltd.
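The abstract does not give the chart's exact two‐dimensional Markov design, but the core idea of jointly watching T and C can be sketched with two one‐sided CUSUMs: one accumulating evidence of too‐short intervals, one of too‐large event sizes. All reference values (k) and limits (h) below are illustrative assumptions, not the paper's optimized constants.

```python
def tc_cusum(times, sizes, k_t=0.5, k_c=0.5, h_t=4.0, h_c=4.0):
    """Rough sketch of a TC-type CUSUM: signal when event intervals T
    become too short or event sizes C become too large.
    k_t, k_c, h_t, h_c are illustrative, not optimized values."""
    s_t = s_c = 0.0
    signals = []
    for t, c in zip(times, sizes):
        s_t = max(0.0, s_t + (k_t - t))   # short intervals accumulate
        s_c = max(0.0, s_c + (c - k_c))   # large sizes accumulate
        signals.append(s_t > h_t or s_c > h_c)
    return signals

# in-control: long gaps, small sizes -> no signal
ok = tc_cusum([1.0] * 20, [0.1] * 20)
# out-of-control: persistently short gaps -> signal
bad = tc_cusum([0.1] * 20, [0.1] * 20)
```

A real TC‐CUSUM would calibrate k and h from the in‐control distributions of T and C; the sketch only shows why monitoring both dimensions catches shifts that either chart alone would miss.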

2.
Because of the characteristics of a system or process, several prespecified changes may happen in some statistical process control applications. Thus, one possible and challenging problem in profile monitoring is detecting changes away from the ‘normal’ profile toward one of several prespecified ‘bad’ profiles. In this article, to monitor the prespecified changes in linear profiles, two two‐sided cumulative sum (CUSUM) schemes are proposed based on Student's t‐statistic, which use two separate statistics and a single statistic, respectively. Simulation results show that the CUSUM scheme with a single statistic uniformly outperforms that with two separate statistics. Moreover, both CUSUM schemes perform better than alternative methods in detecting small shifts in prespecified changes, and become comparable in detecting moderate or large shifts when the number of observations in each profile is large. To overcome the weakness in the proposed CUSUM methods, two modified CUSUM schemes are developed using the z‐statistic and studied when the in‐control parameters are estimated. Simulation results indicate that the modified CUSUM chart with a single charting statistic slightly outperforms that with two separate statistics in terms of the average run length and its standard deviation. Finally, illustrative examples indicate that the CUSUM schemes are effective. Copyright © 2016 John Wiley & Sons, Ltd.
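The paper's exact charting statistics are not reproduced in the abstract; as a generic sketch of the two‐sided CUSUM machinery it builds on, the following runs upper and lower CUSUMs on a stream of standardized statistics (the reference value k and limit h are illustrative assumptions):

```python
def two_sided_cusum(z, k=0.5, h=4.0):
    """Generic two-sided CUSUM on a sequence of standardized
    statistics z. Returns the index of the first signal, or -1.
    k (reference value) and h (decision limit) are illustrative."""
    up = lo = 0.0
    for i, zi in enumerate(z):
        up = max(0.0, up + zi - k)   # accumulates upward shifts
        lo = max(0.0, lo - zi - k)   # accumulates downward shifts
        if up > h or lo > h:
            return i
    return -1

no_shift = two_sided_cusum([0.0] * 100)   # stays at zero, never signals
shift = two_sided_cusum([1.0] * 100)      # steady 1-sigma shift
```

In the profile setting, each zi would be a t‐ or z‐statistic summarizing one observed profile against the in‐control line, which is where the single‐statistic versus two‐separate‐statistics distinction of the paper enters.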

3.
Traditional Duncan‐type models for cost‐efficient process monitoring often inflate type I error probability. Nevertheless, controlling the probability of type I error or false alarms is one of the key issues in sequential monitoring of specific process characteristics. To this end, researchers often recommend economic‐statistical designs. Such designs assign an upper bound on type I error probability to avoid excessive false alarms while achieving cost optimality. In the context of process monitoring, there is a plethora of research on parametric approaches to controlling type I error probability along with cost optimization. In the nonparametric setup, most of the existing works on process monitoring address one of the two issues but not both simultaneously. In this article, we present two distribution‐free cost‐efficient Shewhart‐type schemes for sequentially monitoring process location with restricted false alarm probability, based, respectively, on the sign and Wilcoxon rank‐sum statistics. We consider a one‐sided shift in the location parameter of an unknown continuous univariate process. Nevertheless, one can easily extend our proposed schemes to monitor two‐sided process shifts. We evaluate and compare the actual performance of the two monitoring schemes through extensive Monte Carlo simulation. We investigate the effects of the size of the reference sample and the false alarm constraint. Finally, we provide two illustrative examples, each based on a realistic situation in industry.
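The sign statistic mentioned above is simple enough to sketch: count how many observations of a test sample exceed the reference‐sample median, and signal when that count is implausibly large under the in‐control binomial(n, 1/2) law. The control limit here is illustrative, not a limit derived from the paper's false‐alarm constraint.

```python
import numpy as np

def sign_chart_signal(reference, test, ucl):
    """Distribution-free sign statistic: number of test observations
    above the reference-sample median. Signals an upward location
    shift when the count exceeds ucl (ucl illustrative; in practice
    it is set to meet the false-alarm constraint)."""
    med = np.median(reference)
    stat = int(np.sum(test > med))
    return stat, stat > ucl

ref = np.arange(100.0)                       # reference sample, median 49.5
good = np.array([30.0, 40.0, 50.0, 60.0, 45.0])
bad = np.array([80.0, 90.0, 95.0, 85.0, 99.0])
s_good, a_good = sign_chart_signal(ref, good, ucl=4)
s_bad, a_bad = sign_chart_signal(ref, bad, ucl=4)
```

Because the statistic depends on the data only through signs relative to the median, its in‐control distribution is the same for every continuous process, which is exactly what makes the false‐alarm probability controllable without distributional assumptions.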

4.
Control chart techniques for high‐quality processes have attracted great attention in modern precision manufacturing. Traditional control charts are no longer applicable because of their high false alarm rates. To solve this problem, this article proposes a new statistical process monitoring method: the counted‐number‐between‐omega‐events statistical process control charts, abbreviated as CBΩ charts. The term omega event denotes that one observation falls into a certain interval, and the CBΩ chart monitors the number of consecutive parts between r successive omega events. On the basis of CBΩ charts, a dual‐CBΩ monitoring scheme is developed. This scheme sets up two CBΩ charts with symmetrical omega events, located in the upper and lower tails of the distribution, respectively. The performance of CBΩ charts and dual‐CBΩ monitoring is investigated. Dual‐CBΩ monitoring has shown its capability in detecting both mean and variance shifts and its convenience in implementation compared with other traditional charts. Dual‐CBΩ monitoring can greatly reduce the false alarm rate without introducing an unacceptable loss of sensitivity in detecting out‐of‐control signals in high‐quality process control. Copyright © 2011 John Wiley & Sons, Ltd.
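The counting mechanism behind a CBΩ‐type chart can be sketched directly (the interval bounds and the decision rule on the counts are illustrative assumptions; the paper's charts also involve the parameter r and formal limits):

```python
def cb_omega_counts(x, lower, upper):
    """Counts of consecutive observations between successive
    'omega events' (here taken as observations falling outside
    [lower, upper]). In a CB-type chart, unusually small counts
    indicate that extreme observations occur too often."""
    counts, run = [], 0
    for xi in x:
        if xi < lower or xi > upper:   # omega event observed
            counts.append(run)
            run = 0
        else:
            run += 1
    return counts

data = [0, 1, 0, 5, 0, 0, 0, 0, 6, 0]   # events: 5 and 6 (outside [-2, 2])
counts = cb_omega_counts(data, -2, 2)
```

Monitoring counts between rare events instead of the raw observations is what keeps the false alarm rate low for high‐quality processes: in‐control counts are large, so a short count stands out sharply.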

5.
Multivariate monitoring techniques for serially correlated observations have been widely used in various applications. This study examines several issues that have arisen in relation to statistical quality control for the vector autoregressive (VAR) model, using a Monte Carlo approach. Different versions of the Hotelling T2 statistic and control limits to monitor a VAR‐type process in both Phase I and Phase II schemes can be specified for different sample sizes and model configurations. Our simulation study suggests that the Hotelling T2 statistic can be tested against χ2 critical values during Phase I, but should be tested against scaled F critical values during Phase II. An unbiased covariance estimate of the residuals is also recommended during Phase II, when the sample size is typically small. By reanalyzing some real data examples, the authors offer new conclusions. Copyright © 2012 John Wiley & Sons, Ltd.
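The basic pipeline the abstract describes, fitting a VAR model and charting Hotelling T2 on its residuals, can be sketched as follows (the VAR(1) coefficients and the least‐squares fit are illustrative; the paper's point concerns which critical values, χ2 or scaled F, to compare t2 against in each phase):

```python
import numpy as np

rng = np.random.default_rng(0)
p, n = 2, 300
A = np.array([[0.5, 0.1],     # illustrative VAR(1) coefficient matrix
              [0.0, 0.4]])
x = np.zeros((n, p))
for t in range(1, n):
    x[t] = x[t - 1] @ A.T + rng.normal(size=p)

# least-squares fit of VAR(1): x_t ≈ x_{t-1} @ B.T
X, Y = x[:-1], x[1:]
B = np.linalg.lstsq(X, Y, rcond=None)[0].T
resid = Y - X @ B.T

# Hotelling T2 of each residual vector against the residual covariance
S_inv = np.linalg.inv(np.cov(resid.T))
t2 = np.einsum('ij,jk,ik->i', resid, S_inv, resid)
```

If the residuals behave like i.i.d. p‐variate noise, t2 averages about p; the abstract's recommendation is that the control limit for small Phase II samples should come from a scaled F distribution rather than the χ2 approximation.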

6.
Three‐dimensional elastic–plastic finite element analyses have been conducted for 21 experimental specimens with different in‐plane and out‐of‐plane constraints in the literature. The distributions of five constraint parameters (namely T‐stress, Q, h, Tz and Ap) along crack fronts (specimen thickness) for the specimens were calculated. The capability and applicability of the parameters for characterizing in‐plane and out‐of‐plane crack‐tip constraints and establishing a unified correlation with the fracture toughness of a steel were investigated. The results show that the four constraint parameters (T‐stress, Q, h and Tz) based on crack‐tip stress fields are only sensitive to in‐plane or out‐of‐plane constraints. Therefore, monotonic unified correlation curves with fracture toughness (toughness loci) cannot be obtained by using them. The parameter Ap, based on crack‐tip equivalent plastic strain, is sensitive to both in‐plane and out‐of‐plane constraints, and may effectively characterize both of them. Monotonic unified correlation curves with fracture toughness can be obtained by using Ap. In structural integrity assessments, the correlation curves may be used in the failure assessment diagram (FAD) methodology for incorporating both in‐plane and out‐of‐plane constraint effects in structures, improving accuracy.

7.
In recent years, several techniques based on control charts have been developed for the simultaneous monitoring of the time interval T and the amplitude X of events, known as time-between-events-and-amplitude (TBEA) charts. However, the vast majority of the existing works have some limitations. First, they usually focus on statistics based on the ratio X/T, and second, they only investigate a reduced number of potential distributions, that is, the exponential distribution for T and the normal distribution for X. Moreover, until now, very few research papers have considered the potential dependence between T and X. In this paper, we investigate three different statistics, denoted as Z1, Z2, and Z3, for monitoring TBEA data in the case of three potential distributions (gamma, normal, and Weibull), for both T and X, using copulas as a mechanism to model the dependence. An example considering times between machine breakdowns and the associated maintenance illustrates the use of TBEA control charts.
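The exact definitions of Z1, Z2, and Z3 are not given in the abstract. As an illustration only, the sketch below contrasts the classical ratio statistic X/T with one plausible alternative that standardizes T and X separately before combining them, so that short gaps and large amplitudes both push the statistic upward:

```python
import numpy as np

def tbea_stats(t, x):
    """Two illustrative TBEA-type statistics (NOT the paper's Z1-Z3):
    the classical ratio X/T, and a standardized-difference statistic
    that treats T and X on separate scales before combining."""
    z_ratio = x / t
    zt = (t - t.mean()) / t.std()
    zx = (x - x.mean()) / x.std()
    z_sum = zx - zt   # large X and small T both increase the statistic
    return z_ratio, z_sum

t = np.array([2.0, 1.5, 0.2, 2.2])   # event 3: short gap...
x = np.array([1.0, 1.2, 3.0, 0.9])   # ...and large amplitude
z_ratio, z_sum = tbea_stats(t, x)
```

Both statistics flag the third event here, but they scale differently, which is why the choice of statistic interacts with the assumed marginal distributions and with any T–X dependence modeled through a copula.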

8.
In the last 5 years, research on distribution‐free (nonparametric) process monitoring has registered phenomenal growth. A Google Scholar database search in early September 2015 reveals 246 articles on distribution‐free control charts during 2000–2009 and 466 articles in the years since. These figures are about 1400 and 2860, respectively, if the word ‘nonparametric’ is used in place of ‘distribution‐free’. Distribution‐free charts do not require any prior knowledge about the process parameters. Consequently, they are very effective in monitoring various non‐normal and complex processes. Traditional process monitoring schemes use two separate charts, one for monitoring the process location and the other for the process scale. Recently, various schemes have been introduced to monitor process location and scale simultaneously using a single chart. The performance advantages of such charts have been clearly established. In this paper, we introduce a new graphical device, namely circular‐grid charts, for simultaneous monitoring of process location and scale based on Lepage‐type statistics. We also discuss a general form of Lepage statistics and show that a new modified Lepage statistic is often better than the traditional Lepage statistic. We offer a new and attractive post‐signal follow‐up analysis. A detailed numerical study based on Monte Carlo simulations is performed, and some illustrations are provided. A clear guideline is offered to help practitioners select the best chart among the various alternatives for simultaneous location–scale monitoring. The practical application of the charts is illustrated. Copyright © 2016 John Wiley & Sons, Ltd.
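The traditional Lepage statistic combines a squared standardized Wilcoxon rank‐sum statistic (location) with a squared standardized Ansari–Bradley statistic (scale). A minimal no‐ties implementation, using the standard exact moments of both statistics, is sketched below; the paper's modified Lepage statistic and circular‐grid display are not reproduced here.

```python
import numpy as np

def lepage(x, y):
    """Traditional Lepage statistic for samples x, y (no ties):
    squared standardized Wilcoxon rank-sum (location component)
    plus squared standardized Ansari-Bradley (scale component)."""
    m, n = len(x), len(y)
    N = m + n
    combined = np.concatenate([x, y])
    order = combined.argsort()
    ranks = np.empty(N)
    ranks[order] = np.arange(1, N + 1)
    rx = ranks[:m]
    # Wilcoxon rank-sum component
    W = rx.sum()
    ew, vw = m * (N + 1) / 2, m * n * (N + 1) / 12
    # Ansari-Bradley component: scores min(r, N+1-r)
    A = np.minimum(rx, N + 1 - rx).sum()
    if N % 2 == 0:
        ea = m * (N + 2) / 4
        va = m * n * (N + 2) * (N - 2) / (48 * (N - 1))
    else:
        ea = m * (N + 1) ** 2 / (4 * N)
        va = m * n * (N + 1) * (3 + N ** 2) / (48 * N ** 2)
    return (W - ew) ** 2 / vw + (A - ea) ** 2 / va

# perfectly interleaved samples: no location or scale difference
alike = lepage(np.arange(0, 60, 2.0), np.arange(1, 60, 2.0))
# completely separated samples: strong location difference
apart = lepage(np.arange(30.0) + 100, np.arange(30.0))
```

In control, the statistic is approximately χ2 with 2 degrees of freedom, so a single chart of it reacts to either a location or a scale change, which is the appeal of simultaneous location–scale monitoring.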

9.
Standard multivariate statistical process control (SPC) techniques, such as Hotelling's T2, cannot easily handle large‐scale, complex process data and often fail to detect out‐of‐control anomalies in such data. We develop a computationally efficient and scalable Chi‐Square (χ2) Distance Monitoring (CSDM) procedure for monitoring large‐scale, complex process data to detect out‐of‐control anomalies, and test the performance of the CSDM procedure using various kinds of process data involving uncorrelated, correlated, auto‐correlated, normally distributed, and non‐normally distributed variables. Based on the advantages and disadvantages of the CSDM procedure in comparison with Hotelling's T2 for various kinds of process data, we design a hybrid SPC method with the CSDM procedure for monitoring large‐scale, complex process data. Copyright © 2005 John Wiley & Sons, Ltd.
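One simple form of a chi‐square distance, which may differ in detail from the paper's CSDM statistic, is the sum over variables of squared deviations from a baseline mean, scaled by that mean. Unlike Hotelling's T2 it needs no covariance matrix estimation or inversion, which is what makes this style of statistic scale to high‐dimensional data:

```python
import numpy as np

def chi_square_distance(x, baseline_mean):
    """Chi-square distance of observation vector x from a baseline
    mean vector m: sum_j (x_j - m_j)^2 / m_j. Illustrative form of a
    CSDM-style statistic; no covariance inversion is required."""
    return float(np.sum((x - baseline_mean) ** 2 / baseline_mean))

m = np.array([10.0, 20.0, 30.0])
d_near = chi_square_distance(np.array([10.0, 21.0, 29.0]), m)
d_far = chi_square_distance(np.array([15.0, 30.0, 45.0]), m)
```

A monitoring scheme would compare each incoming distance with a control limit estimated from in‐control data, falling back to T2 (the hybrid design mentioned above) where correlation structure matters.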

10.
This article proposes an integrated scheme (T&TCUSUM chart) which combines a Shewhart T chart and a TCUSUM chart (a CUSUM‐type T chart) to monitor the time interval T between the occurrences of an event or the time between events. The performance studies show that the T&TCUSUM chart can effectively improve the overall performance over the entire T shift range. On average, it is more effective than the T chart by 26.66% and the TCUSUM chart by 14.12%. Moreover, the T&TCUSUM chart performs more consistently than other charts for the detection of either small or large T shifts, because it has the strength of both the T chart (more sensitive to large shifts) and the TCUSUM chart (more sensitive to small shifts). The implementation of the new chart is almost as easy as the operation of a TCUSUM chart. Copyright © 2010 John Wiley & Sons, Ltd.
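The combination logic can be sketched in a few lines: a Shewhart‐type rule catches one extreme short interval immediately, while a CUSUM on T accumulates many moderately short intervals. The limits below (lcl, k, h) are illustrative placeholders, not the paper's optimized design:

```python
def t_and_tcusum(times, lcl=0.05, k=0.7, h=3.0):
    """Sketch of a combined T & TCUSUM scheme. Returns
    (signal index, which rule fired) or (-1, None).
    lcl, k, h are illustrative, not optimized values."""
    s = 0.0
    for i, t in enumerate(times):
        if t < lcl:                  # Shewhart part: one extreme short gap
            return i, 'shewhart'
        s = max(0.0, s + (k - t))    # CUSUM part: drift of short gaps
        if s > h:
            return i, 'cusum'
    return -1, None

sudden = t_and_tcusum([1.0, 1.2, 0.01])   # one extreme short interval
drift = t_and_tcusum([0.3] * 20)          # persistently short intervals
calm = t_and_tcusum([1.0] * 50)           # in control
```

The two rules are complementary exactly as the abstract states: the Shewhart part reacts to large T shifts, the CUSUM part to small ones.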

11.
A transition metal diphosphide, WP2, is a candidate for type‐II Weyl semimetals (WSMs) in which spatial inversion symmetry is broken and Lorentz invariance is violated. As one of the prerequisites for the presence of the WSM state in WP2, spatial inversion symmetry breaking in this compound has rarely been investigated. Furthermore, the anisotropy of the WP2 electrical properties, and whether this anisotropy can be tuned, remain elusive. Angle‐resolved polarized Raman spectroscopy, electrical transport, optical spectroscopy, and first‐principles studies of WP2 are reported. The energies of the observed Raman‐active phonons and the angle dependences of the detected phonon intensities are consistent with results obtained by first‐principles calculations and analysis of the proposed crystal symmetry without spatial inversion, showing that spatial inversion symmetry is broken in WP2. Moreover, the measured ratio (Rc /Ra ) between the crystalline c‐axis and a‐axis electrical resistivities exhibits a weak dependence on temperature (T) in the range from 100 to 250 K, but increases abruptly at T ≤ 100 K, reaching ≈8.0 at T = 10 K, which is by far the strongest in‐plane electrical resistivity anisotropy among the reported type‐II WSM candidates with comparable carrier concentrations. Optical spectroscopy, together with first‐principles calculations of the electronic band structure, reveals that the abrupt enhancement of the electrical resistivity anisotropy at T ≤ 100 K mainly arises from a sharp increase in the scattering rate anisotropy at low temperatures. More interestingly, the Rc /Ra of WP2 at T = 10 K can be tuned from 8.0 to 10.6 as the magnetic field increases from 0 to 9 T.
This strongest‐yet, magnetic‐field‐tunable electrical resistivity anisotropy of WP2 can serve as a degree of freedom for tuning the electrical properties of type‐II WSMs, paving the way for novel electronic applications based on type‐II WSMs.

12.
In this paper, the interest is focused on monitoring profiles with a Weibull‐distributed response and common shape parameter γ in Phase II processes. Monitoring such profiles becomes possible by taking the natural logarithm of the Weibull‐distributed response, which is equivalent to characterizing the corresponding process by an extreme‐value linear regression model with common scale parameter σ = 1/γ. It was found that monitoring the common log‐scale parameter of the extreme‐value linear regression model, with the help of a simple scheme, yields important information about the deterioration of the entire process, treating the β coefficients as nuisance parameters that need not be known, only stable. Control charts are based on the relative log‐likelihood ratio statistic defined for the log‐scale parameter of the log‐transformed Weibull response, and on its signed square root. It was also found that some adjustments are needed to improve the accuracy of the distributional approximations for the monitoring statistics at small and moderate sample sizes. Simulation studies suggest that the resulting charts have appealing properties and work acceptably well when only moderately sized samples are available at discrete sampling moments. The detection ability of the studied corrected control schemes improves as the sample size increases. Copyright © 2014 John Wiley & Sons, Ltd.
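The log‐transform relationship the paper relies on can be checked numerically: if W is Weibull with shape γ, then log W follows a (smallest) extreme‐value distribution with scale σ = 1/γ, whose standard deviation is σπ/√6. This sketch only verifies that identity by simulation; it is not the paper's likelihood‐ratio chart.

```python
import numpy as np

rng = np.random.default_rng(42)
gamma_shape, scale = 2.0, 3.0
w = scale * rng.weibull(gamma_shape, size=200_000)

# log W is extreme-value distributed with scale sigma = 1/gamma_shape;
# its standard deviation is sigma * pi / sqrt(6), so we can back out sigma
logw = np.log(w)
sigma_hat = logw.std() * np.sqrt(6) / np.pi
```

Because the β regression coefficients only shift the location of log W, they drop out of the scale monitoring, which is why the chart can treat them as unknown but stable nuisance parameters.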

13.
An R chart is often used to monitor shifts in process variability. However, the range statistics, Ri, from a sampling distribution are highly skewed. Hence, the classical R chart based on the usual 3σ control limits will not give an in‐control average run length of approximately 370, or equivalently a type I error of 0.0027. In this paper, an approach is shown to obtain the control limits of an improved R chart with a desired type I error from the density function of the Ri statistics. Copyright © 2004 John Wiley & Sons, Ltd.
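The idea of probability limits for a skewed statistic can be sketched by simulation: estimate the in‐control distribution of the range Ri and place the limits at its α/2 and 1−α/2 quantiles, rather than at symmetric 3σ distances. The paper works from the exact density of the range; Monte Carlo quantiles shown here are a stand‐in for that calculation.

```python
import numpy as np

rng = np.random.default_rng(0)
n, alpha, reps = 5, 0.0027, 200_000

# simulate the in-control distribution of the sample range R_i
samples = rng.normal(0.0, 1.0, size=(reps, n))
ranges = samples.max(axis=1) - samples.min(axis=1)

# probability-limit R chart: quantiles deliver the desired type I error
lcl, ucl = np.quantile(ranges, [alpha / 2, 1 - alpha / 2])
false_alarm = float(np.mean((ranges < lcl) | (ranges > ucl)))
```

Because the range distribution is right‐skewed, the resulting limits are asymmetric about the mean range, which is precisely what the symmetric 3σ limits of the classical R chart fail to capture.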

14.
Numerical schemes for the approximate solution of advection–diffusion–reaction equations are often flawed by spurious oscillations caused by steep gradients or by dominant advection or reaction. In addition, for strongly coupled nonlinear processes, which may be described by a set of hyperbolic PDEs, established time‐stepping schemes lack either the accuracy or the stability to provide a reliable solution. In this contribution, an advanced numerical scheme for this class of problems is suggested by combining sophisticated stabilization techniques, namely the finite calculus (FIC‐FEM) scheme introduced by Oñate et al., with time‐discontinuous Galerkin (TDG) methods. Whereas the former provides a stabilization technique for the numerical treatment of steep gradients in advection‐dominated problems, the latter ensures reliable solutions with regard to the temporal evolution. A brief theoretical outline of the superior behavior of both approaches is presented and underlined with related computational tests. The performance of the suggested FIC‐TDG finite element approach is discussed exemplarily on a bioregulatory model for bone fracture healing proposed by Geris et al., which consists of at least 12 coupled hyperbolic evolution equations. Copyright © 2012 John Wiley & Sons, Ltd.

15.
A cluster‐based method has been used successfully to analyze parametric profiles in Phase I of the profile monitoring process. Performance advantages have been demonstrated when using a cluster‐based method of analyzing parametric profiles over a non‐cluster‐based method, with respect to more accurate parameter estimates and improved classification performance. However, it is known that, in many cases, profiles can be better represented using a nonparametric method. In this study, we use the cluster‐based method to analyze profiles that cannot be easily represented by a parametric function. The similarity matrix used during the clustering phase is based on the fits of the individual profiles with p‐spline regression. The clustering phase determines an initial main cluster set that contains more than half of the total profiles in the historical data set. The profiles with in‐control T2 statistics are sequentially added to the initial main cluster set and, upon completion of the algorithm, the profiles in the main cluster set are classified as in‐control profiles while those not in the main cluster set are classified as out‐of‐control profiles. A Monte Carlo study demonstrates that the cluster‐based method results in superior performance over a non‐cluster‐based method with respect to better classification and higher power in detecting out‐of‐control profiles. Our Monte Carlo study also shows that the cluster‐based method outperforms a non‐cluster‐based method whether or not the model is correctly specified. We illustrate the use of our method with data from the automotive industry. Copyright © 2014 John Wiley & Sons, Ltd.

16.
A class of parallel multiple‐front solution algorithms is developed for solving linear systems arising from discretization of boundary value problems and evolution problems. The basic substructuring approach and frontal algorithm on each subdomain are first modified to ensure stable factorization in situations where ill‐conditioning may occur due to differing material properties or the use of high degree finite elements (p methods). Next, the method is implemented on distributed‐memory multiprocessor systems with the final reduced (small) Schur complement problem solved on a single processor. A novel algorithm that implements a recursive partitioning approach on the subdomain interfaces is then developed. Both algorithms are implemented and compared in a least‐squares finite‐element scheme for viscous incompressible flow computation using h‐ and p‐finite element schemes. Copyright © 2003 John Wiley & Sons, Ltd.

17.
Control charting methods for time between events (TBE) are important in both manufacturing and nonmanufacturing fields. To enhance the speed of detecting shifts in the mean TBE, this paper proposes a generalized group runs TBE chart to monitor the mean TBE of a homogeneous Poisson failure process. The proposed chart combines a TBE subchart and a generalized group conforming run length subchart. The zero‐state and steady‐state performances of the proposed chart were evaluated using a Markov chain method. Overall, the proposed chart is found to outperform existing TBE charts, such as the T, Tr, EWMA‐T, Synth‐Tr, and GR‐Tr charts. Copyright © 2015 John Wiley & Sons, Ltd.
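The Markov chain method mentioned above evaluates a chart's average run length (ARL) from the transition matrix Q among its transient (non‐signaling) states: the vector of expected first‐passage times to the absorbing signal state solves (I − Q)a = 1. The two‐state chain below is a toy example with made‐up probabilities, not the proposed chart's actual state space:

```python
import numpy as np

def arl_from_markov_chain(Q, start=0):
    """Zero-state ARL of a chart whose non-signaling states form a
    Markov chain with transient transition matrix Q (signal state
    absorbed/omitted): ARL = entry `start` of (I - Q)^-1 * 1."""
    n = Q.shape[0]
    return float(np.linalg.solve(np.eye(n) - Q, np.ones(n))[start])

# toy 2-state chain (illustrative probabilities only):
# state 0: stay w.p. 0.90, to state 1 w.p. 0.08, signal w.p. 0.02
# state 1: to state 0 w.p. 0.50, stay w.p. 0.30, signal w.p. 0.20
Q = np.array([[0.90, 0.08],
              [0.50, 0.30]])
arl = arl_from_markov_chain(Q)
```

For a real group‐runs chart, the states encode the recent pattern of conforming run lengths, and the same linear solve yields both zero‐state and steady‐state ARLs (the latter by averaging over a stationary start distribution).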

18.
19.
We discuss explicit coupling schemes for fluid‐structure interaction problems where the added‐mass effect is important. In this paper, we show the close relation between coupling schemes using Nitsche's method and a Robin–Robin type coupling. In the latter case, the method may be implemented either using boundary integrals of the stresses or the more conventional discrete lifting operators. Recalling the explicit method proposed in Comput. Methods Appl. Mech. Engrg. 198(5–8):766–784, 2009, we observe that this scheme is stable under a hyperbolic‐type CFL condition, but that optimal accuracy imposes a parabolic‐type CFL condition because of the splitting error. Two strategies to enhance the accuracy of the coupling scheme under the hyperbolic CFL condition are suggested, one using extrapolation and defect correction and one using a penalty‐free non‐symmetric Nitsche method. Finally, we illustrate the performance of the proposed schemes on some numerical examples in two and three space dimensions. Copyright © 2013 John Wiley & Sons, Ltd.
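The practical difference between the two CFL conditions is their scaling with the mesh size h: a hyperbolic restriction gives Δt ∝ h, while a parabolic one gives Δt ∝ h², which becomes far more restrictive under refinement. The helper below just makes that scaling concrete; the constants and names are illustrative, not taken from the paper.

```python
def stable_dt(h, wave_speed=1.0, cfl=0.5, parabolic=False, diffusivity=1.0):
    """Illustrative CFL-type step restrictions: hyperbolic scaling
    dt ~ h / a versus parabolic scaling dt ~ h^2 / nu."""
    if parabolic:
        return cfl * h ** 2 / diffusivity
    return cfl * h / wave_speed

h_coarse, h_fine = 0.1, 0.01
# refining h by 10x shrinks dt by 10x (hyperbolic) vs 100x (parabolic)
ratio_hyp = stable_dt(h_coarse) / stable_dt(h_fine)
ratio_par = stable_dt(h_coarse, parabolic=True) / stable_dt(h_fine, parabolic=True)
```

This gap is why the abstract's two accuracy‐enhancement strategies matter: they recover optimal accuracy while keeping the milder Δt ∝ h restriction.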

20.

The performance of high‐resolution total variation diminishing (TVD) schemes for simulating dam‐break problems is presented and evaluated. Three robust and reliable first‐order upwind schemes, namely the FVS, Roe, and HLLE schemes, are extended to six second‐order TVD schemes using two different approaches: the Sweby flux‐limiter approach and the direct MUSCL–Hancock slope‐limiter approach. For idealized dam‐break flows, comparisons of the simulated results with the exact solutions show that the flux vector splitting (FVS) scheme coupled with the direct MUSCL–Hancock (DMH) slope‐limiter approach has the best numerical performance among the presented schemes. Application of the FVS‐DMH scheme to a dam‐break experiment with a sloping dry bed shows that the simulated water depths agree well with the measured data.
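A standard ingredient of the slope‐limiter approach mentioned above is the minmod limiter, which picks the smaller‐magnitude of the two one‐sided slopes when they agree in sign and returns zero otherwise, suppressing spurious oscillations at steep gradients such as the dam‐break front. This is a generic sketch of that ingredient, not the full MUSCL–Hancock reconstruction:

```python
import numpy as np

def minmod(a, b):
    """Minmod slope limiter: the smaller-magnitude slope when a and b
    share a sign, zero when they disagree (local extremum -> flatten,
    which is what enforces the TVD property)."""
    return np.where(a * b <= 0.0, 0.0,
                    np.where(np.abs(a) < np.abs(b), a, b))

left = np.array([1.0, -2.0, 3.0, 0.0])    # backward-difference slopes
right = np.array([2.0, -1.0, -3.0, 5.0])  # forward-difference slopes
slopes = minmod(left, right)
```

In a MUSCL‐type scheme these limited slopes reconstruct cell‐interface values for the Riemann solver (FVS, Roe, or HLLE in the paper's comparison), giving second‐order accuracy in smooth regions without oscillations at discontinuities.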
