Similar Literature
20 similar articles found (search time: 46 ms)
1.
Process monitoring of the full mass-production phase of multistage manufacturing processes (MMPs) has been successfully implemented in many applications; however, monitoring the ramp-up phase of an MMP is often more difficult because there is limited information with which to establish valid process control parameters (such as the mean and variance). This paper focuses on estimating the process control parameters used in designing monitoring schemes for the ramp-up phase of MMPs. An engineering model of variation propagation in an MMP is developed and reconstructed as a linear model, establishing a relationship between the error sources and the variation of product characteristics. Based on this linear model, a two-step Bayesian method is proposed to estimate the process control parameters. The performance of the proposed Bayesian method is validated with simulation data and real-world data, and the results demonstrate that the proposed method can effectively estimate process parameters during the ramp-up phase of an MMP.
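As a concrete (and deliberately generic) illustration of Bayesian parameter estimation from scarce ramp-up data, the sketch below updates a conjugate Normal-Inverse-Gamma prior on a process mean and variance with a handful of measurements. It is not the paper's two-step method for MMPs; the priors and sample values are hypothetical.

```python
import numpy as np

def nig_update(y, mu0, kappa0, alpha0, beta0):
    """Conjugate Normal-Inverse-Gamma update for a process mean and variance.

    Prior: sigma^2 ~ Inv-Gamma(alpha0, beta0), mu | sigma^2 ~ N(mu0, sigma^2/kappa0).
    Returns posterior point estimates of the mean and the variance.
    """
    y = np.asarray(y, dtype=float)
    n, ybar = len(y), y.mean()
    kappa_n = kappa0 + n
    mu_n = (kappa0 * mu0 + n * ybar) / kappa_n
    alpha_n = alpha0 + n / 2.0
    beta_n = (beta0 + 0.5 * ((y - ybar) ** 2).sum()
              + kappa0 * n * (ybar - mu0) ** 2 / (2.0 * kappa_n))
    var_hat = beta_n / (alpha_n - 1.0)  # posterior mean of sigma^2
    return mu_n, var_hat

# Hypothetical ramp-up scenario: a weak prior from a similar process,
# updated with only 8 measurements of a key product characteristic.
rng = np.random.default_rng(1)
samples = rng.normal(10.05, 0.12, size=8)
mu_n, var_hat = nig_update(samples, mu0=10.0, kappa0=2.0, alpha0=3.0, beta0=0.05)
print(f"posterior mean estimate: {mu_n:.4f}, variance estimate: {var_hat:.5f}")
```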

2.
Process capability indices (PCIs) have been widely adopted for quality assurance activities. By analysing PCIs, a production department can trace and improve a poor process to enhance the product quality level and satisfy customer requirements. Among these indices, Cpk remains the most prevalent for facilitating managerial decisions because it provides bounds on the process yield for normally distributed processes. However, processes are often non-normal in practice, and Cpk may then seriously misrepresent the actual product quality. Hence, the flexible index Cjkp, which accounts for possible differences in the variability above and below the target value, has been developed for practical use. Cjkp nevertheless still suffers from serious bias in assessing actual capability, especially when the process distribution is highly skewed. In this paper, we modify Cjkp for assessing the actual process quality of a Gamma process. A correction factor is obtained by a curve-fitting method. The results show that the proposed method significantly reduces the bias in calculating actual nonconformities. Moreover, we introduce a sample estimator for the modified index; the ratio of this estimator's average value to the modified index is approximately 1, implying that the proposed estimator provides an appropriate estimate for assessing actual Gamma process quality.
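To see why Cpk misleads for skewed data, the sketch below contrasts the normal-theory Cpk with a Clements-style quantile index for a Gamma process; the specification limits and Gamma parameters are hypothetical, and the paper's corrected Cjkp (with its curve-fitted correction factor) is not reproduced.

```python
import numpy as np
from scipy import stats

# Hypothetical specs and a right-skewed Gamma(2, scale=0.5) process (mean 1.0).
LSL, USL = 0.0, 3.0
shape, scale = 2.0, 0.5
mu = shape * scale
sigma = np.sqrt(shape) * scale

# Classical Cpk assumes normality: min distance to a spec in 3-sigma units.
cpk = min(USL - mu, mu - LSL) / (3.0 * sigma)

# Clements-style quantile index: replace mu +/- 3*sigma with the 0.135% and
# 99.865% quantiles of the fitted Gamma, and the mean with the median.
q_lo, med, q_hi = stats.gamma.ppf([0.00135, 0.5, 0.99865], shape, scale=scale)
cnpk = min((USL - med) / (q_hi - med), (med - LSL) / (med - q_lo))

# Actual nonconformance of the Gamma process vs what Cpk would suggest.
p_nc = stats.gamma.sf(USL, shape, scale=scale) + stats.gamma.cdf(LSL, shape, scale=scale)
print(f"Cpk={cpk:.3f}, quantile index={cnpk:.3f}, actual NC rate={p_nc:.2e}")
```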

3.
The effects of raw materials on a product's final quality attributes (FQA) combine with the effects of the process operation and of any differences in scale. A new mathematical framework is proposed to help understand and decouple the effects of raw materials, scale, process conditions and the control system on the FQA of a given product. Ultimately it provides a quantitative way to establish a quality-driven multivariate specification for incoming materials, supported by data from multiple scales. The technique is based on multivariate latent variable methods (LVM) coupled with optimization techniques. The proposed method can be used as the basis to quantitatively support a design-space definition that accounts for the inherent variability in the FQA introduced by the process operation, the control system and the materials. The methodology is illustrated by building a multivariate specification for the purchase of a polymeric excipient used in the production of tablets.
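One common LVM choice for this kind of raw-material-to-FQA mapping is partial least squares; the sketch below fits a PLS model with scikit-learn on synthetic stand-in data. It illustrates only the regression step, not the authors' full optimization-based specification framework.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Synthetic stand-in data: 60 raw-material lots, 5 measured properties (X),
# 2 final quality attributes (Y) driven by two latent directions plus noise.
rng = np.random.default_rng(0)
T = rng.normal(size=(60, 2))                     # latent lot-to-lot variation
X = T @ rng.normal(size=(2, 5)) + 0.1 * rng.normal(size=(60, 5))
Y = T @ rng.normal(size=(2, 2)) + 0.1 * rng.normal(size=(60, 2))

# A two-component PLS model links incoming-material properties to the FQA.
pls = PLSRegression(n_components=2).fit(X, Y)
print("R^2 of FQA prediction:", round(pls.score(X, Y), 3))

# Screening a new lot: predict its FQA and compare against (hypothetical)
# multivariate specification limits before accepting the material.
x_new = X[:1] + 0.05
print("predicted FQA for new lot:", np.round(pls.predict(x_new), 2))
```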

4.
Controlling and reducing process variability is essential for maintaining product or service quality. Even though most practitioners believe that increasing process variability is often a more severe concern than a shift in location, only a few studies have paid attention to cost-efficient monitoring of process variability. Some of the existing studies addressed the dispersion aspect under the assumption that the quality characteristic is Gaussian, yet non-normal and complex distributions are not uncommon in modern production processes, time-to-event processes, or processes involving service quality. Unfortunately, there is no literature on economically designed nonparametric (distribution-free) schemes for monitoring process variability. This article introduces two Shewhart-type cost-optimized nonparametric schemes for monitoring the variability of any unknown but continuous process to fill this research gap. The proposed monitoring schemes are based on two popular two-sample rank statistics for differences in scale parameters: the Ansari–Bradley statistic and the Mood statistic. We assess their actual performance for a set of process scenarios and illustrate the design along with the implementation steps. We also discuss a practical problem related to product quality management. The proposed schemes are expected to be beneficial in various industrial operations.
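Both rank statistics are available in scipy (scipy.stats.ansari and scipy.stats.mood). The sketch below implements a plain Shewhart-type scale check of each incoming subgroup against an in-control reference sample; the economically optimized limits of the proposed schemes are not reproduced — a fixed false-alarm rate stands in for them.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
reference = rng.standard_normal(100)   # in-control Phase I reference sample

def scale_check(subgroup, reference, alpha=0.005):
    """Flag a scale change using the Ansari-Bradley and Mood rank tests."""
    _, ab_p = stats.ansari(subgroup, reference)
    _, md_p = stats.mood(subgroup, reference)
    return {"ansari_p": round(ab_p, 4), "mood_p": round(md_p, 4),
            "signal": ab_p < alpha or md_p < alpha}

# An in-control subgroup vs one with inflated dispersion (scale doubled).
print(scale_check(rng.standard_normal(15), reference))
print(scale_check(2.0 * rng.standard_normal(15), reference))
```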

5.
《Quality Engineering》1997,9(4):xiii-xviii
Statistical process control (SPC) has been used extensively in industry for monitoring product quality. The variation in measured product characteristics, however, comes not only from the manufacturing process but also from the measurement process. Large measurement variability may cause errors in SPC results and hence lead to wrong interpretations of product quality. The objective of this work was to conduct a robust design study of the effects of measurement factors in order to reduce measurement variability. Repeatability and reproducibility studies verified the success of the proposed strategy, and process capability analyses confirmed the net effect of the reduced measurement variability on the product variability.
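A minimal sketch of the classical ANOVA-based gauge repeatability-and-reproducibility calculation such a study relies on, for a crossed parts × operators design with replicates; the data-generating numbers are hypothetical.

```python
import numpy as np

def gauge_rr(y):
    """ANOVA gauge R&R for a crossed design: y[part, operator, replicate]."""
    p, o, r = y.shape
    grand = y.mean()
    part_m, oper_m, cell_m = y.mean(axis=(1, 2)), y.mean(axis=(0, 2)), y.mean(axis=2)
    ms_p = o * r * ((part_m - grand) ** 2).sum() / (p - 1)
    ms_o = p * r * ((oper_m - grand) ** 2).sum() / (o - 1)
    ms_po = (r * ((cell_m - part_m[:, None] - oper_m[None, :] + grand) ** 2).sum()
             / ((p - 1) * (o - 1)))
    ms_e = ((y - cell_m[..., None]) ** 2).sum() / (p * o * (r - 1))
    var_rep = ms_e                                  # repeatability
    var_po = max((ms_po - ms_e) / r, 0.0)           # part x operator interaction
    var_o = max((ms_o - ms_po) / (p * r), 0.0)      # reproducibility (operator)
    var_part = max((ms_p - ms_po) / (o * r), 0.0)   # true part-to-part variation
    return var_rep + var_o + var_po, var_part       # total gauge R&R, part var

# 10 parts x 3 operators x 2 replicates, hypothetical measurement study.
rng = np.random.default_rng(3)
y = (rng.normal(0, 1.0, (10, 1, 1)) + rng.normal(0, 0.2, (1, 3, 1))
     + rng.normal(0, 0.15, (10, 3, 2)))
var_rr, var_part = gauge_rr(y)
print(f"%R&R contribution: {100 * var_rr / (var_rr + var_part):.1f}%")
```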

6.
There is increased interest in industrial experiments aimed at improving quality by reducing the variability of a product characteristic while maintaining its desired mean level. Analyzing treatment effects on variability, however, is more difficult than analyzing effects on mean performance. In this article we extend the results of Zelen to test for treatment effects on variance when there are two sources of variability: that between production runs or setups, and that within runs. Bootstrap critical values are developed to handle possibly nonnormal errors at either level of variability.
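A sketch of the resampling idea under the stated two-level error structure: whole runs (not individual observations) are resampled, so the between-run component is respected while testing for a treatment effect on the within-run standard deviation. This illustrates the general approach, not Zelen's extension or the paper's exact bootstrap.

```python
import numpy as np

rng = np.random.default_rng(11)

def within_run_sd(runs):
    """Pooled within-run standard deviation for a list of runs."""
    return np.sqrt(np.mean([np.var(r, ddof=1) for r in runs]))

def boot_ratio_test(runs_a, runs_b, n_boot=2000):
    """Bootstrap p-value for H0: equal within-run SD, resampling whole runs."""
    obs = within_run_sd(runs_a) / within_run_sd(runs_b)
    pooled = runs_a + runs_b
    count = 0
    for _ in range(n_boot):
        a = [pooled[i] for i in rng.integers(0, len(pooled), len(runs_a))]
        b = [pooled[i] for i in rng.integers(0, len(pooled), len(runs_b))]
        if within_run_sd(a) / within_run_sd(b) >= obs:
            count += 1
    return obs, count / n_boot

# Hypothetical data: 6 runs per treatment, 5 parts per run; each run has its
# own setup level, and treatment B has a doubled within-run SD.
runs_a = [rng.normal(rng.normal(0, 0.5), 1.0, 5) for _ in range(6)]
runs_b = [rng.normal(rng.normal(0, 0.5), 2.0, 5) for _ in range(6)]
print("SD ratio and bootstrap p-value:", boot_ratio_test(runs_a, runs_b))
```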

7.
Buyers are faced with selecting the optimal supplier, while suppliers must consider production costs. In this study, we developed a two-phase selection framework that allows buyers to evaluate the performance of suppliers while taking production costs into account for value maximisation. This scheme is a win-win solution capable of promoting long-term relationships between buyers and suppliers. Under the assumption of normality, the first phase involves constructing a new Six Sigma quality capability analysis chart (SSQCAC) that takes production costs into account. The objective is to evaluate all potential suppliers using the 100 × (1 − α)% upper confidence limit (UCL) of an integrated Six Sigma quality index (SSQI), QPIh, for products with smaller-the-better (STB), larger-the-better (LTB), or nominal-the-best (NTB) quality characteristics. According to interval estimation theory, this method can significantly reduce the consumption of resources; i.e. the supplier's production costs can be decreased by avoiding quality levels in excess of those required by the buyer. The proposed method also filters out unsuitable suppliers in order to simplify the decision problem and reduce computational demands and operational risks/costs without compromising the quality of the final product. In the second phase, a detailed analysis using a Euclidean distance measure selects the optimal supplier from among the remaining candidates. We conducted a real-world case study to evaluate the efficacy of the proposed method, and comparisons with existing methods demonstrate its advantages and managerial implications. Suggestions for future study are also provided.
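A sketch of the second phase only: ranking the suppliers that survive the UCL filter by the Euclidean distance of their quality-index vectors from an ideal point. The index values and the ideal vector are hypothetical; the SSQI computation and UCL construction of the first phase are not reproduced.

```python
import numpy as np

# Hypothetical quality-index vectors (one entry per quality characteristic)
# for the suppliers that passed the first-phase UCL screening.
suppliers = {
    "S1": np.array([1.42, 1.55, 1.38]),
    "S2": np.array([1.50, 1.47, 1.52]),
    "S3": np.array([1.61, 1.33, 1.49]),
}
ideal = np.array([1.67, 1.67, 1.67])   # hypothetical ideal reference point

# Smallest Euclidean distance to the ideal point wins.
ranked = sorted(suppliers, key=lambda s: np.linalg.norm(suppliers[s] - ideal))
for s in ranked:
    print(s, round(float(np.linalg.norm(suppliers[s] - ideal)), 4))
```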

8.
The upper confidence bound for a product defect rate is a very important index for evaluating production processes in industry. In this paper, we provide a bootstrap methodology to construct a (1 − α)100% upper confidence bound for the overall defect rate of a product whose quality assessment involves multiple pass/fail binary data and multiple continuous data. When only the pass/fail data are included, we propose a bootstrap method that is consistent with the Clopper–Pearson one-sided confidence interval. When only the continuous data are included, the BCa bootstrap method is recommended. These two methods are combined to provide an upper confidence bound for the overall defect rate of the product when both multiple pass/fail binary data and multiple continuous data are present. All methods are stated in algorithmic form, investigated through simulation and demonstrated using example data sets. In the simulation studies and examples, the proposed algorithms show great advantages in both coverage probability and computational efficiency.
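The Clopper–Pearson one-sided bound that the binary part is consistent with has a closed form through Beta quantiles, and scipy's bootstrap routine offers BCa intervals for the continuous part; the sketch below shows both pieces on hypothetical data, without the paper's combination algorithm.

```python
import numpy as np
from scipy import stats

def cp_upper(x, n, alpha=0.05):
    """Clopper-Pearson one-sided (1 - alpha) upper bound for a defect rate."""
    return 1.0 if x >= n else stats.beta.ppf(1.0 - alpha, x + 1, n - x)

# Binary pass/fail data: 3 defects observed in 200 units.
print(f"95% upper bound (binary): {cp_upper(3, 200):.4f}")

# Continuous data: defect rate = P(X > USL); BCa bootstrap for the upper end.
rng = np.random.default_rng(5)
x = rng.normal(10.0, 0.4, size=150)
USL = 10.5
res = stats.bootstrap((x,), lambda s, axis: np.mean(s > USL, axis=axis),
                      confidence_level=0.90, method="BCa")
# The high end of a 90% two-sided interval is a 95% one-sided upper bound.
print(f"95% upper bound (continuous): {res.confidence_interval.high:.4f}")
```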

9.
The desirability function approach has been widely used in multi-response optimization because of its simplicity. Most existing desirability-function-based methods assume that the variability of the response variables is stable and thus focus mainly on optimizing the means of multiple responses. However, this stable-variability assumption often does not hold in practice, and product or process quality can be severely degraded by high variability across multiple responses. We therefore propose a new desirability function method that simultaneously optimizes both the mean and the variability of multiple responses. In particular, the proposed method uses a posterior preference articulation approach, which has an advantage in investigating tradeoffs between the mean and variability of multiple responses. Process engineers can use this method to better understand these tradeoffs and thereby obtain a satisfactory compromise solution.
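A sketch of the underlying idea: Derringer–Suich-type desirabilities assigned to both the mean and the standard deviation of each response, combined by a geometric mean. The targets, limits and data are hypothetical, and the posterior preference articulation step is not shown.

```python
import numpy as np

def d_target(y, lo, target, hi):
    """Target-is-best desirability: 1 at the target, 0 outside [lo, hi]."""
    if y <= lo or y >= hi:
        return 0.0
    return (y - lo) / (target - lo) if y < target else (hi - y) / (hi - target)

def d_smaller(y, target, hi):
    """Smaller-the-better desirability, used here for response variability."""
    if y <= target:
        return 1.0
    return max((hi - y) / (hi - target), 0.0)

def overall_desirability(means, sds):
    """Geometric mean over mean- and SD-desirabilities of two responses."""
    d = [d_target(means[0], 45, 50, 55), d_target(means[1], 8, 10, 12),
         d_smaller(sds[0], 0.5, 2.0), d_smaller(sds[1], 0.1, 0.5)]
    return float(np.prod(d) ** (1.0 / len(d)))

# Two candidate operating conditions: similar means, different variability.
print(overall_desirability(means=[49.0, 10.2], sds=[1.5, 0.30]))
print(overall_desirability(means=[49.5, 10.1], sds=[0.7, 0.15]))
```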

10.
《国际生产研究杂志》(International Journal of Production Research)2012,50(24):7552-7566
This paper considers the integrated control of production, maintenance and quality for a capacitated lot-sizing production system subject to deterioration. The effects of operational conditions that vary from batch to batch on system reliability and product quality are modelled by proportional hazards models, resulting in non-monotonic failure and defect rates. After each batch, imperfect preventive maintenance (PM) is applied to mitigate system deterioration, and inspection is performed to sort out nonconforming items in the finished goods. Once the cumulative number of nonconforming items exceeds a predetermined threshold, an overhaul is performed to renew the system. An integrated model for optimising the production plan, the PM plan and the overhaul strategy is developed to minimise the total cost while satisfying all product demands, and a genetic algorithm is proposed to solve it efficiently. Numerical results validate the rationality of the model under varying operational conditions and its applicability in terms of economic benefits.
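A sketch of the proportional-hazards ingredient only: a baseline Weibull failure rate modified by batch-level operational covariates, h(t | z) = h0(t) exp(γ·z). The covariates and coefficients are hypothetical; the integrated lot-sizing/PM/overhaul optimisation is not reproduced.

```python
import numpy as np

def ph_failure_rate(t, z, beta0=1.5, eta=100.0, gamma=np.array([0.4, 0.25])):
    """Proportional-hazards-modified Weibull failure rate:
    h(t | z) = h0(t) * exp(gamma . z), where z describes the operational
    conditions of the current batch (hypothetical covariates)."""
    h0 = (beta0 / eta) * (t / eta) ** (beta0 - 1.0)   # baseline Weibull hazard
    return h0 * np.exp(gamma @ z)

# Nominal batch vs a batch run under harsher conditions (e.g. load, speed).
print(f"nominal : {ph_failure_rate(50.0, np.array([0.0, 0.0])):.4e}")
print(f"harsher : {ph_failure_rate(50.0, np.array([1.0, 0.8])):.4e}")
```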

11.
许烽 《包装学报》(Packaging Journal)2013,5(2):15-19
Three preparation processes for oxidized starch — dry, semi-dry, and wet — are discussed, along with the advantages and disadvantages of each. The dry process has a short process flow, low energy consumption per unit of product, and simple production equipment, but its product quality is not very stable. The semi-dry process increases water consumption somewhat, but it does not add substantially to the process flow and its product quality is relatively high. The wet process reacts uniformly, gives relatively stable product quality, and is easy to control, but it suffers from difficult product recovery, high water consumption, and the generation of large volumes of wastewater during production, which places considerable pressure on the ecological environment. The oxidized starch production process should therefore be optimized according to the shortcomings of each route in order to obtain the best production conditions.

12.
Robust parameter design (RPD) has been the primary technique for reducing process and product variability: the offline choice of appropriate control factor settings ensures that noise factors have minimal influence on the responses. In this article, an alternative methodology of automatic process control is proposed, in which controllable factors are adjusted online based on in-process observations of noise factors. A cautious control strategy, which explicitly considers observation uncertainty when adjusting the settings of controllable factors, makes the system performance consistently more favourable than the certainty-equivalence control strategy and RPD. RPD can in fact be viewed as a special case of automatic control laws that use a constant control setting throughout production. A case study of a sheet-metal stamping process demonstrates that implementing the proposed method in an industrial facility can lead to significant quality improvements.
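A sketch of the contrast the abstract draws, under a simple linear model y = βu·u + βn·n + ε with the noise factor observed through measurement error: certainty equivalence reacts to the raw observation, cautious control shrinks the adjustment by the Kalman-style gain σn²/(σn² + σm²), and a fixed setting stands in for RPD. All parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)
beta_u, beta_n = 1.0, 0.8            # effects of control input and noise factor
sig_n, sig_m, sig_e = 1.0, 0.7, 0.1  # noise, measurement, residual SDs

n = rng.normal(0, sig_n, 10000)             # true noise factor, part by part
n_obs = n + rng.normal(0, sig_m, n.size)    # in-process observation of it
eps = rng.normal(0, sig_e, n.size)

def mse(u):
    """Mean squared deviation of y = beta_u*u + beta_n*n + eps from target 0."""
    return np.mean((beta_u * u + beta_n * n + eps) ** 2)

u_ce = -(beta_n / beta_u) * n_obs                 # certainty equivalence
gain = sig_n**2 / (sig_n**2 + sig_m**2)           # E[n | n_obs] shrinkage
u_cautious = -(beta_n / beta_u) * gain * n_obs    # cautious control
u_fixed = np.zeros_like(n_obs)                    # constant setting (RPD-like)

for name, u in [("CE", u_ce), ("cautious", u_cautious), ("fixed", u_fixed)]:
    print(f"{name:9s} MSE = {mse(u):.4f}")
```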

13.
Optimising the product infant failure rate is among the most important and difficult tasks for continuous improvement in manufacturing, and modelling the infant failure rate of complex electromechanical products promptly and accurately remains a dilemma for manufacturers. Traditional reliability analysis of manufactured products usually relies on limited test data or field failures, so the valuable information about quality variations in the manufacturing process has not been fully utilised. In this paper, a multilayered model structured by part level, component level and system level is presented to model reliability, in the form of an infant failure rate, by quantifying the holistic quality variations arising from the manufacturing process of electromechanical products. The mechanism through which the multilayered quality variations affect the infant failure rate is modelled analytically with a positive correlation structure. Furthermore, an integrated failure rate index is derived that models the reliability of an electromechanical product in manufacturing by synthetically incorporating the overall quality variations with a Weibull distribution. A case study on a control board suffering from infant failures in batch production shows that the proposed approach is effective in assessing the infant failure rate and in diagnosing the effectiveness of quality control in manufacturing.
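A loose sketch of the hazard-aggregation idea only: per-layer Weibull hazards with shape parameters below 1 (the infant-mortality region), each scaled by a quality-variation multiplier and summed as for a series system. The structure, multipliers and numbers are hypothetical and do not reproduce the paper's correlation model.

```python
import numpy as np

def weibull_hazard(t, beta, eta):
    """Weibull hazard rate h(t); beta < 1 gives a decreasing (infant) rate."""
    return (beta / eta) * (t / eta) ** (beta - 1.0)

# Hypothetical layers with quality-variation multipliers k >= 1 derived from
# manufacturing data (k = 1 would mean nominal manufacturing quality).
layers = [  # (beta, eta in hours, quality factor k)
    (0.7, 5000.0, 1.00),    # part level: well-controlled
    (0.8, 8000.0, 1.35),    # component level: excess variation observed
    (0.6, 20000.0, 1.10),   # system-level assembly operations
]

t = np.linspace(1.0, 500.0, 5)
# Series-system approximation: hazards add; quality variation scales each one.
system_rate = sum(k * weibull_hazard(t, b, e) for b, e, k in layers)
for ti, hi in zip(t, system_rate):
    print(f"t = {ti:6.1f} h   infant failure rate = {hi:.3e} /h")
```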

14.
In many industrial applications, the quality of a process or product can be characterized by a function or profile. Owing to spatial autocorrelation or time collapse, the assumption that observations within each profile are uncorrelated is violated. This paper evaluates the process yield under linear within-profile autocorrelation. We present an approximate lower confidence bound for SpkA when the observations within each profile follow a first-order autoregressive AR(1) model. A simulation study assesses the performance of the proposed method and confirms that it performs well in terms of bias, standard deviation, and coverage rate. A real example demonstrates the application of the proposed approach.
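The underlying yield index has a closed form, Spk = (1/3) Φ⁻¹{ ½Φ((USL−μ)/σ) + ½Φ((μ−LSL)/σ) }; the sketch below computes a plug-in estimate from one simulated AR(1) profile. The paper's approximate lower confidence bound, which corrects for the autocorrelation, is not reproduced.

```python
import numpy as np
from scipy import stats

def spk(mu, sigma, lsl, usl):
    """Yield index: Spk = (1/3)*Phi^-1(0.5*Phi((USL-mu)/sigma)
                                     + 0.5*Phi((mu-LSL)/sigma))."""
    p = (0.5 * stats.norm.cdf((usl - mu) / sigma)
         + 0.5 * stats.norm.cdf((mu - lsl) / sigma))
    return stats.norm.ppf(p) / 3.0

# Simulate one profile whose within-profile errors follow AR(1) with phi = 0.6.
rng = np.random.default_rng(9)
phi, m = 0.6, 200
e = np.empty(m)
e[0] = rng.normal()
for i in range(1, m):
    e[i] = phi * e[i - 1] + rng.normal(0, np.sqrt(1 - phi**2))  # unit marginal var
x = 10.0 + 0.3 * e                                              # observations

LSL, USL = 9.0, 11.0
print(f"plug-in Spk estimate: {spk(x.mean(), x.std(ddof=1), LSL, USL):.3f}")
```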

15.
Excessive variation in a manufacturing process is one of the major causes of a high defect rate and poor product quality, so quick detection of changes, especially increases in process variability, is essential for quality control. In modern manufacturing environments, most of the quality characteristics that must be closely monitored are multivariate by the nature of the applications. In these multivariate settings, monitoring process variability is considerably more difficult than monitoring a univariate variance, especially if the manufacturing environment only allows the collection of individual observations. Some recent charts, such as the MaxMEWMV chart, the MEWMS chart and the MEWMC chart, have been proposed to monitor process variability specifically when the subgroup size is equal to 1. However, these methods do not take into account the engineering and operational understanding of how the process works: when the process variability goes out of control, it is often the case that changes occur only in a small number of elements of the covariance matrix or the precision matrix. In this work, we propose a control charting mechanism that enhances the existing methods via penalised likelihood estimation of the precision matrix when only individual observations are available for monitoring process variability. The average run length of the proposed chart is compared with that of the MaxMEWMV, MEWMS and MEWMC charts, and a real example is presented in which the proposed chart and the existing charts are applied and compared.
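The penalised likelihood estimation of the precision matrix is available as the graphical lasso (e.g. sklearn.covariance.GraphicalLasso). The sketch below fits Phase I data and, for each Phase II window of individual observations, charts the largest absolute change in the estimated precision entries; this statistic and its use here are illustrative, not the authors' chart design or control limits.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(4)
p, n_phase1, w = 5, 300, 30

# Phase I: individual in-control observations; sparse precision via L1 penalty.
X1 = rng.multivariate_normal(np.zeros(p), np.eye(p), size=n_phase1)
theta0 = GraphicalLasso(alpha=0.05).fit(X1).precision_

def chart_stat(window):
    """Largest absolute deviation of penalised precision entries from Phase I."""
    theta = GraphicalLasso(alpha=0.05).fit(window).precision_
    return np.abs(theta - theta0).max()

# Phase II: an in-control window, then one where two variables become
# correlated (a sparse change in the covariance/precision matrix).
cov_oc = np.eye(p)
cov_oc[0, 1] = cov_oc[1, 0] = 0.7
ic = rng.multivariate_normal(np.zeros(p), np.eye(p), size=w)
oc = rng.multivariate_normal(np.zeros(p), cov_oc, size=w)
print("in-control window :", round(chart_stat(ic), 3))
print("shifted window    :", round(chart_stat(oc), 3))
```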

16.
Frequent early failures have been the key factor restricting reliability improvement of electromechanical products and reducing their market competitiveness. Most current early-failure elimination methods are passive technologies applied after a failure has occurred; their purpose is to quickly restore production capacity, so they mainly improve the reliability of products already in service and cannot fundamentally improve the quality of the manufactured product. To address these problems, this paper first analyzes the causes of early failures in the production stage of electromechanical products and studies the corresponding early-failure elimination mechanism. Second, based on FRACAS (Failure Reporting, Analysis and Corrective Action System), an early-failure elimination method for electromechanical products is proposed based on the meta-action unit. Finally, to ensure the smooth implementation of early-failure elimination measures, a management guarantee system for active early-failure elimination is established. The paper also details the implementation process of the proposed method, which has been applied in engineering practice; the results verify its correctness. The proposed method provides a theoretical basis and technical support for eliminating early failures of electromechanical products.

17.
The purpose of this article is to test the performance of a heuristic algorithm that computes a quality control plan. The tests reported in this paper have two objectives: (1) to compare the proposed heuristic algorithm (HA) to an optimal allocation (OA) method; and (2) to analyse the behaviour and limitations of the proposed HA on a scale-1 test with a before/after test. The evaluation is based on two comparisons:

1. The first test illustrates the method and its sensitivity to internal parameters. It is based on a simplified case study of a product from the semiconductor industry. The product is made up of 1000, 800 and 1200 wafers incorporating three different technologies. The production duration is one week, and three tools were involved in this test. The behaviour of the proposed algorithm is checked throughout the evolution of the model parameters: the risk exposure limit (RL) and the measurement capacity (P). The quality control plans for each tool and product are analysed and compared to those from a one-stage allocation process (named C0) that does not take risk exposure into account. A comparison is also performed with OA.

2. The second scale-1 test is based on three scenarios covering several months of regular semiconductor production. Data were obtained from 23 etching and 12 photolithography tools. The outputs provided by the HA are used in the sampling scheduler implemented at this plant, and the resulting samples are compared against three indicators.

The results of these comparisons show that, for small instances, OA is more relevant than the HA. The HA provides realistic limits that are suitable for daily operations. Even though the HA may produce far-from-optimal results, it demonstrates a major MAR improvement. In terms of the maximum inhibit limit, the HA achieves better performance than C0, strongly correlated with RL and with the control capacity. The article concludes that the proposed algorithm can be used to plan controls and to guide their scheduling, and that it can also improve the insurance design for several levels of risk acceptance.

18.
Robust design is an important method for improving product quality, manufacturability, and reliability at low cost. Taguchi's introduction of this method in 1980 to several major American industries resulted in significant quality improvement in product and manufacturing process design. While the robust design objective of making product performance insensitive to hard-to-control noise was recognized to be very important, many of the statistical methods proposed by Taguchi, such as the use of signal-to-noise ratios, orthogonal arrays, linear graphs, and accumulation analysis, have room for improvement. To popularize the use of robust design among engineers, it is essential to develop more effective, statistically efficient, and user-friendly techniques and tools. This paper first summarizes the statistical methods for planning and analyzing robust design experiments originally proposed by Taguchi, then reviews newly developed statistical methods and identifies areas and problems where more research is needed. For planning experiments, we review a new experiment format, the combined array format, which can reduce the experiment size and allow greater flexibility for estimating effects that may be more important for physical reasons. We also discuss design strategies, alternative graphical tools and tables, and computer algorithms to help engineers plan more efficient experiments. For analyzing experiments, we review a new modeling approach, the response model approach, which yields additional information about how control factor settings dampen the effects of individual noise factors; this helps engineers better understand the physical mechanism of the product or process. We also discuss alternative variability measures for Taguchi's signal-to-noise ratios and develop methods for empirically determining the appropriate measure to use.
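The three standard Taguchi signal-to-noise ratios the review refers to have simple closed forms; the sketch below computes them for hypothetical replicate data from one experimental run.

```python
import numpy as np

def sn_smaller_the_better(y):
    """SN = -10 log10( mean(y^2) ): larger SN means less deviation from zero."""
    y = np.asarray(y, float)
    return -10.0 * np.log10(np.mean(y**2))

def sn_larger_the_better(y):
    """SN = -10 log10( mean(1/y^2) )."""
    y = np.asarray(y, float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

def sn_nominal_the_best(y):
    """SN = 10 log10( ybar^2 / s^2 ): penalises variability around the mean."""
    y = np.asarray(y, float)
    return 10.0 * np.log10(y.mean() ** 2 / y.var(ddof=1))

# Hypothetical replicates from one run of a designed experiment.
y = np.array([50.1, 49.7, 50.4, 49.9, 50.2])
print(f"nominal-the-best SN: {sn_nominal_the_best(y):.2f} dB")
```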

19.
Operating a machine with a deteriorated cutting tool often leads to poor product quality and a high risk of tool failure. Replacing the degraded tool is an effective measure to reduce product quality loss and the chance of tool failure; excessive tool replacements, however, increase production capacity loss and tool replacement cost. Taking these factors into consideration, this paper presents an approach for determining the optimal tool replacement time for a cutting process. It assumes that product quality deteriorates as the cutting tool wears and that tool failures occur randomly during the cutting process. A product quality failure rate model is developed to characterise the deterioration of product quality during cutting, and the product quality loss is estimated from this model. A Weibull distribution is employed to describe the stochastic tool life. A tool replacement model is proposed that balances the product quality loss, the penalty cost for possible tool failure, the production capacity loss and the tool replacement cost. A sensitivity analysis of the optimal tool replacement decision is presented.
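A sketch of the classical age-replacement trade-off such a model builds on: Weibull tool life, with the cost rate (expected cost per cycle divided by expected cycle length) minimised over the replacement age. Here the quality loss and failure penalty are folded into a single failure cost, and all numbers are hypothetical.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize_scalar

beta, eta = 2.5, 40.0    # Weibull tool life (shape, scale), in hours
c_p, c_f = 100.0, 900.0  # planned replacement cost vs failure cost
                         # (c_f can absorb quality loss and penalty cost)

def survival(t):
    """Weibull survival function of the tool life."""
    return np.exp(-((t / eta) ** beta))

def cost_rate(T):
    """Expected cost per unit time when replacing at age T or at failure."""
    expected_cost = c_p * survival(T) + c_f * (1.0 - survival(T))
    expected_cycle, _ = quad(survival, 0.0, T)   # E[min(tool life, T)]
    return expected_cost / expected_cycle

res = minimize_scalar(cost_rate, bounds=(1.0, 200.0), method="bounded")
print(f"optimal replacement age: {res.x:.1f} h, cost rate: {res.fun:.2f}/h")
```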

20.
This work proposes a general approach to studying and improving the effectiveness of a system with respect to its expected life-cycle cost rate. The model considers a production system that is protected against demand fluctuations and failure occurrences by elements such as stockpiles, line and equipment redundancy, and the use of alternative production methods. These design policies allow the nominal throughput to be maintained, or its loss minimized, while corrective measures are taken. The system is also subject to an aging process that depends on the frequency and quality of preventive actions. Decision making is difficult because of discontinuities in intervention and downtime costs and because of the limited budget. We present a non-linear mixed-integer formulation that minimizes the expected overall cost rate with respect to repair, overhaul and replacement times and the overhaul improvement factor proposed in the literature. The model is deterministic and considers minimal repairs and imperfect overhauls. We illustrate its application with a case based on a known benchmark example.
