Similar Literature
20 similar records found (search time: 31 ms)
1.
Economic design of multivariate exponentially weighted moving average (MEWMA) control charts for monitoring the process mean vector involves economically determining the optimal values of three control parameters: the sample size, the sampling interval between successive samples, and the control limits (critical region) of the chart. In the economic-statistical design, constraints (including requirements on the type I error probability and the power) are added so that the statistical properties of the chart are satisfied. In this paper, using the multivariate Taguchi loss approach, the Lorenzen–Vance (Technometrics 28(1):3–10) cost function of implementing the control chart is extended to include intangible external costs, with the in-control average run length (ARL0) and out-of-control average run length (ARL1) as statistical constraints. A Markov chain model is then developed to estimate the ARLs, and a genetic algorithm, whose parameters are optimally tuned by design of experiments, is used to solve the model and estimate the optimal values of the control chart parameters. A numerical example and a sensitivity analysis are provided to illustrate the solution procedure and to investigate the effects of the cost parameters on the optimal designs. The results show that the proposed economic-statistical design of the chart has better statistical properties than the economic design, while the difference between the costs is negligible.
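The MEWMA recursion that abstract 1 builds on can be sketched briefly. This is a generic illustration, not the paper's implementation: the function name, the default smoothing constant lam=0.1, and the use of the standard asymptotic covariance (λ/(2−λ))Σ are my own choices.

```python
import numpy as np

def mewma_statistics(X, lam=0.1):
    """Compute MEWMA chart statistics T2_i for a sequence of p-variate observations.

    X   : (n, p) array of mean-centered observation vectors.
    lam : smoothing constant (0 < lam <= 1).
    The chart signals when T2_i exceeds a control limit h chosen to give
    the desired in-control ARL.
    """
    n, p = X.shape
    sigma = np.cov(X, rowvar=False)              # estimated process covariance
    sigma_z = (lam / (2 - lam)) * sigma          # asymptotic covariance of Z_i
    z = np.zeros(p)
    t2 = np.empty(n)
    for i in range(n):
        z = lam * X[i] + (1 - lam) * z           # Z_i = lam*X_i + (1-lam)*Z_{i-1}
        t2[i] = z @ np.linalg.solve(sigma_z, z)  # T2_i = Z_i' Sigma_z^{-1} Z_i
    return t2
```

In the economic-statistical design, the limit h and the sampling plan would then be chosen by optimizing the cost function subject to the ARL constraints.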

2.
In this study, an internal polishing system using magnetic force was developed for the production of ultra-clean tubes with an average surface roughness of 0.02 μm to 0.05 μm or less, and magnetic abrasives composed of WC/Co powder were developed for this application. After finding the optimal conditions, the machining characteristics of the newly developed abrasives were analysed. From the results obtained by the experimental design method, the optimal polishing conditions were determined.

3.
Traditional multivariate control charts such as Hotelling's χ² and T² control charts are designed to monitor vectors of variable quality characteristics. However, in certain situations data are expressed in linguistic terms, and under these circumstances variable or attribute multivariate control charts are not suitable for monitoring purposes. Fuzzy multivariate control charts such as the fuzzy Hotelling's T² chart can be considered efficient tools for overcoming the problems of linguistic observations. The purpose of this paper is to develop a fuzzy multivariate exponentially weighted moving average (F-MEWMA) control chart, combining multivariate statistical quality control with fuzzy set theory. Fuzzy sets and fuzzy logic are powerful mathematical tools for modeling uncertain systems in industry, nature, and humanity. Through a numerical example, the performance of the proposed control chart is compared to that of the fuzzy Hotelling's T² control chart. Results indicate uniformly superior performance of the F-MEWMA control chart over the Hotelling's T² control chart.

4.
When an x̄ control chart is used to monitor a process, three parameters should be determined: the sample size, the sampling interval between successive samples, and the control limits for the chart. Duncan presented a cost model to determine these three parameters for an x̄ chart, which is called the economic design of the x̄ chart. In this paper, the Burr distribution is employed to conduct the economic-statistical design of x̄ charts for non-normal data. Duncan's cost model is used as the objective function, and the cumulative distribution function of the Burr distribution is applied to derive the statistical constraints of the problem. An example is presented to illustrate the solution procedure. From the sensitivity analyses of this example, we find that the skewness coefficient of the underlying population has no effect on the optimal sample size; however, a larger skewness coefficient leads to a slightly smaller sampling interval and narrower control limits. Also, an increase in the kurtosis coefficient results in a larger sample size, a slightly longer sampling interval, and wider control limits.
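Duncan-style economic design, as in abstracts 4 and 7, amounts to minimizing an expected hourly cost over the triple (sample size n, interval h, limit width k). The sketch below is a deliberately simplified, hypothetical cost function with made-up parameter values and a normal approximation (the paper itself works with the Burr distribution); it is meant only to show the shape of the optimization, not the paper's model.

```python
import math
from statistics import NormalDist

_nd = NormalDist()

def shift_detect_power(n, k, delta):
    """P(sample mean falls outside +/- k*sigma/sqrt(n) limits after a mean
    shift of delta sigma), under a normal approximation."""
    root_n = math.sqrt(n)
    return (1 - _nd.cdf(k - delta * root_n)) + _nd.cdf(-k - delta * root_n)

def hourly_cost(n, h, k, delta=1.0, lam=0.02, a1=1.0, a2=0.1, a3=25.0, a4=100.0):
    """Simplified Duncan-style expected hourly cost (illustrative parameters).

    n: sample size; h: sampling interval in hours; k: limit width in sigmas.
    lam: assignable-cause rate; a1 + a2*n: cost per sample; a3: cost per
    false alarm; a4: hourly penalty while running out of control.
    """
    p = shift_detect_power(n, k, delta)
    alpha = 2 * _nd.cdf(-k)                    # false-alarm probability per sample
    expected_detect_time = h / p               # roughly ARL1 * h
    false_alarm_rate = alpha * math.exp(-lam * h) / h  # rough in-control rate
    return (a1 + a2 * n) / h + a3 * false_alarm_rate + a4 * lam * expected_detect_time

def economic_design():
    """Crude grid search over the three design parameters (n, h, k)."""
    best = None
    for n in range(2, 16):
        for h in (0.5, 1.0, 2.0, 4.0):
            for k in (2.0, 2.5, 3.0, 3.5):
                c = hourly_cost(n, h, k)
                if best is None or c < best[0]:
                    best = (c, n, h, k)
    return best
```

Real designs replace the grid search with a proper optimizer and add the statistical (ARL or power) constraints described in these abstracts.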

5.
Ultrasound waves are used to measure the concentration levels of mica and glass simultaneously in polypropylene polymer compounds. A test chamber was designed to accommodate a 1 MHz ultrasound transducer for static calibration of the system response to various levels of glass and mica concentration in the polymer blend. Temperature and pressure calibrations of the transducer response were also performed under controlled experiments. Attenuation and time-of-flight measurements of ultrasound waves propagating through the polymer blend were used to determine the relative concentrations of the fillers. The experimental results show that the system is capable of measuring the filler concentrations of mica and glass fibres within ±0.5% and ±1.0%, respectively.

6.
This paper presents results obtained from the grinding of aluminium-based metal matrix composites reinforced with either aluminium oxide (Al2O3) or silicon carbide (SiC) particles, using grinding wheels made of SiC in a vitrified matrix or diamond in a resin-bonded matrix. The study used grinding speeds of 1100–2200 m min⁻¹, a grinding depth of 15 μm for rough grinding and 0.1–1 μm for fine grinding, and a crossfeed of 3 mm and 1 mm for rough and fine grinding, respectively, while maintaining a constant table feedrate of 20.8 m min⁻¹. The surface integrity of the ground surfaces and subsurfaces was analysed using a scanning electron microscope and a profilometer. Grinding with a 3000-grit diamond wheel at depths of cut of 1 μm and 0.5 μm produced ductile streaks on the Al2O3 and the SiC particles, respectively. There was almost no subsurface damage except for rare cracked particles when fine grinding with the diamond wheel.

7.
When an x̄ control chart is used to monitor a manufacturing process, three parameters should be determined: the sample size, the sampling interval between successive samples, and the control limits for the chart. In 1956, Duncan presented the first cost model to determine these three parameters for x̄ charts, which is called the economic design of x̄ charts. Traditionally, when designing an x̄ chart, it is assumed that the measurements within a sample are independently distributed; however, this assumption may not be tenable. In this paper, we develop the economic design of x̄ charts for correlated measurements within a sample. An example is presented to illustrate the solution procedure. From the sensitivity analyses of this example, we find that if the measurements in the sample are positively correlated, highly correlated data result in a smaller sample size, a shorter sampling interval, and narrower control limits; if the measurements are negatively correlated, highly correlated data yield a smaller sample size and narrower control limits.

8.
The cumulative sum scheme (CUSUM) and the adaptive control chart are two approaches to improving chart performance in detecting process shifts. A weighted loss function CUSUM scheme (WLC) is able to monitor both mean shifts and increasing variance shifts with a single chart. This paper investigates the WLC scheme with a variable sample size (VSS) feature. A design procedure is first proposed for the VSS WLC scheme. Then, the performance of the chart is compared with that of four other competitive control charts. The results show that the VSS WLC scheme is more powerful than the other charts from an overall viewpoint. More importantly, the VSS WLC scheme is simpler to design and operate. A case study in the manufacturing industry is used to illustrate the chart application. The proposed VSS WLC scheme suits scenarios where varying the sample size is feasible and a high capability of detecting process variations is desired.

9.
Control charts are widely implemented in firms to establish and maintain statistical control of a process, which leads to improved quality and productivity. Designs of control charts have therefore gained particular attention from the outset. Designing a control chart requires that the engineer select a sample size, a sampling frequency, and the control limits for the chart. In this paper, a possible combination of design parameters is considered as a decision-making unit identified by three attributes: hourly expected cost, detection power of the chart, and in-control average run length. Subsequently, the optimal design of control charts is formulated as a multiple-objective decision-making problem. Moreover, the cost function is extended from a single to multiple assignable causes, because multiple assignable causes exist in real practice. An algorithm using data envelopment analysis is applied to solve the multiple-objective decision-making (MODM) model. Numerical and experimental analyses are provided to illustrate the algorithm procedure. Sensitivity analysis is carried out to investigate the robustness of the model, and comparisons with other related published papers are made. It is shown that the proposed MODM model can overcome some drawbacks of previous models and approaches.

10.
Since Reynolds et al. [1] proposed the variable sampling interval (VSI) control chart, the statistical properties of the chart have been discussed and shown to be better than those of the traditional control chart, in which the sampling interval length is fixed. However, VSI charts are, like traditional charts, still costly when used for the prevention of defective products; for this reason, an appropriate design is necessary before use. In this paper, a VSI control chart is used for monitoring non-normal process data, which are widely encountered in practice. A cost-quality model based on the Burr distribution is then proposed for constructing an economic-statistical design for the VSI chart. The design is developed by optimising the cost function in this model subject to statistical constraints. The design parameters can be derived through an evolutionary search method, and the overall finding is that the designed VSI chart always outperforms the traditional control chart with respect to the expected cost per unit time. The model also handles normally distributed data and is thus suitable for general application.
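Several of these designs (abstracts 4, 10, and 11) rely on the Burr distribution to model non-normal data. The Burr XII form has a closed-form CDF, which makes the tail probabilities behind type I error constraints easy to evaluate; a minimal sketch, with function names of my own choosing:

```python
def burr_cdf(x, c, k):
    """CDF of the Burr XII distribution: F(x) = 1 - (1 + x**c)**(-k) for x >= 0."""
    if x <= 0:
        return 0.0
    return 1.0 - (1.0 + x ** c) ** (-k)

def tail_probability(limit, c, k):
    """P(X > limit) for a Burr XII variate, the kind of quantity used to
    express false-alarm constraints for non-normal process data."""
    return 1.0 - burr_cdf(limit, c, k)
```

By tuning the shape parameters c and k, the Burr XII family can approximate a wide range of skewness and kurtosis values, which is why it appears repeatedly in these non-normal designs.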

11.
Variable sampling interval (VSI) control charts have been introduced with the aim of improving the performance of traditional control charts. Usually, in the economic-statistical design of VSI $ \overline{X} $ control charts, it is assumed that observations are normally distributed and that the process is subject to only one assignable cause. In practice, however, these assumptions can easily fail to hold, and the results may no longer be realistic. This paper considers non-normal observations in the case of multiple assignable causes to develop a cost model for the economic design of the VSI $ \overline{X} $ control chart. Because it can represent a wide range of distribution types, the Burr distribution is employed to model the non-normal process data. Since the proposed design involves a complex nonlinear cost function that cannot be solved with classical optimization methods, a genetic algorithm (GA), a well-known and efficient metaheuristic, is employed to find the optimal values of the design parameters. Moreover, to improve performance, response surface methodology is employed to calibrate the GA parameters. The effectiveness of the proposed scheme is evaluated through a numerical example. Sensitivity analysis is also carried out to show the effects of the cost and process parameters on the outputs of the model. Results show that in all cases the presented VSI model has better economic and statistical performance than the corresponding fixed-sampling-interval scheme.

12.
Economic control chart models usually assume that the time to occurrence of an assignable cause follows an exponential or Weibull distribution. This paper extends this to the Pareto distribution in order to investigate the effect of the distributional assumption on the economic control chart parameters, such as the sample size, the time between two successive samples, and the cost per unit time. The Pareto distribution arises as a limiting distribution of the waiting time for the number of new observations needed to obtain a value exceeding the greatest among n observations. It was found that the economic design of the $ \overline {\text{X}} $ chart is greatly influenced by the distributional assumption. Using the cost model, the sensitivity of the economic-statistical design of the $ \overline {\text{X}} $ chart with respect to the parameters and costs is studied.

13.
Bootstrap method approach in designing multi-attribute control charts (total citations: 1; self-citations: 0; citations by others: 1)
In a production process, when the quality of a product depends on more than one correlated characteristic, multivariate quality control techniques are used. Although multivariate statistical process control is receiving increased attention in the literature, little work has been done on multi-attribute processes. In monitoring the quality of a product or process in multi-attribute environments in which the attributes are correlated, several issues arise. For example, a high number of false alarms (type I error) occur, and the probability of not detecting defects (type II error) increases, when the process is monitored by a set of independent uni-attribute control charts. In this paper, to overcome these problems, we first develop a new methodology to derive control limits on the attributes based on the bootstrap method, in which we build simultaneous confidence intervals on the attributes. Then, based upon the in-control and out-of-control average run length criteria, we investigate the performance of the proposed method and compare it by simulation with the Bonferroni and Šidák procedures. The results of the simulation study show that the proposed method performs better than the other two. Finally, we compare the bootstrap method with the T² control chart for attributes.
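The bootstrap idea behind abstract 13, resampling in-control data to derive joint limits that respect the correlation among attributes, can be sketched roughly as follows; the function name, the default α, and the use of percentile limits on bootstrap means are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def bootstrap_limits(data, alpha=0.0027, n_boot=2000, seed=0):
    """Percentile-bootstrap control limits for each of p correlated attributes.

    data : (n, p) array of in-control attribute counts.
    Returns (lower, upper) arrays of per-attribute limits. Because whole rows
    are resampled together, the correlation among attributes is preserved in
    the bootstrap distribution.
    """
    rng = np.random.default_rng(seed)
    n, p = data.shape
    boot_means = np.empty((n_boot, p))
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)   # resample rows with replacement
        boot_means[b] = data[idx].mean(axis=0)
    lower = np.percentile(boot_means, 100 * alpha / 2, axis=0)
    upper = np.percentile(boot_means, 100 * (1 - alpha / 2), axis=0)
    return lower, upper
```

Resampling rows jointly is what distinguishes this from running p independent uni-attribute charts, which is the source of the inflated type I error the abstract describes.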

14.
A Strategic Decision Model for the Justification of Technology Selection (total citations: 1; self-citations: 1; citations by others: 0)
In this paper, a new approach to decision making on technology selection is proposed. In this approach, a strategic decision-making model is used in which the tangible benefits of a technology are evaluated by addressing both cost and time dimensions, and the intangible benefits are evaluated using the analytic hierarchy process (AHP). In AHP, experts in the functional area give the judgemental values required for the comparison matrices. However, the opinions of the experts may deviate, and a single estimate is not realistic; hence, in this approach to evaluating alternative technologies, a range of judgemental values is taken and three different levels are considered within the range. The change in the preference level of the alternatives with the change in the objective factor weightage, and the range of weightage values at which the transition in the choice of technology takes place, are analysed, which assists the decision maker. Correspondence and offprint requests to: Dr M. Punniyamoorthy, Department of Management Studies, Regional Engineering College, Tiruchirrapalli 620015, Tamil Nadu, India. E-mail: puniya@rect.ernet.in

15.
Wen and Mergen presented a method of determining the optimum process mean for a poor process, balancing the costs of out-of-specification products by setting the optimum process mean in the short term. However, they did not consider the quality loss for products within specification. In this paper, a modified version of Wen and Mergen's cost model is proposed, with linear and quadratic asymmetrical quality loss for products within specification, for determining the optimum process mean. Two specific conditions are considered: (1) the process standard deviation is proportional to the process mean (constant coefficient of variation); and (2) the autocorrelated process. Correspondence and offprint requests to: Dr C.-H. Chen, Department of Industrial Management, South Taiwan University of Technology, 1 Nan-Tai Street, Yung-Kang City, Tainan 710, Taiwan. E-mail: chench@mail.stut.edu.tw

16.
The evolutionary tolerance design strategy and its characteristics are studied on the basis of automation technology in product structure design. To guarantee a successful transformation from the functional requirement to geometry constraints between parts, and finally to dimension constraints, a functional tolerance design theory for the process of product growth design is put forward. A mathematical model with a correlated sensitivity function between cost and tolerance is created, in which the design cost, the manufacturing cost, the usage cost, and the depreciation cost of the product are regarded as control constraints of the tolerance allocation. Considering these costs, a multifactor cost function expressing the quality loss of the product is applied in the model. In the mathematical model, the minimum cost is used as the objective function; a reasonable process capability index, the assembly function, and assembly quality are taken as the constraints; and the depreciation cost in the objective function is expressed using the discount rate, a term from economics. Thus, allocation of the dimension tolerance according to function and cost over the whole lifetime of the product is realized. Finally, a design example demonstrates the successful application of the proposed functional tolerance theory in the incremental growth design of the product. Translated from Chinese Journal of Mechanical Engineering, 2006, 42(10): 73–79.

17.
An Intelligent Knowledge-Based System for Product Cost Modelling (total citations: 3; self-citations: 1; citations by others: 2)
An intelligent knowledge-based system for product cost modelling is presented in this paper. The developed system has the capability of selecting a material, as well as machining processes and parameters, based on a set of design and production parameters, and of estimating the product cost throughout the entire product development cycle, including assembly cost. The proposed system can be applied without detailed design information, so that it can be used at an early design stage; consequently, redesign cost and longer lead times can be avoided. Hybrid knowledge representation techniques, such as production rules, frames, and object orientation, are employed to represent manufacturing knowledge. Fuzzy logic-based knowledge representation is applied to deal with uncertainty in the cost-model knowledge and so generate reliable cost estimates. This paper deals with cost modelling of both a machining component and an injection moulding component, the latter being a process that gives high production rates, excellent quality and accuracy of products, and low manufacturing cost. Based on an analysis of the moulded product life cycle, a computer-based cost model was developed that integrates the relationships between cost factors, product development activities, and product geometry. The estimated cost includes the costs of material, mould, and processing. The system has been validated through a case study.

18.
The quality loss function proposed by Taguchi provides a quantitative measurement of product quality when a product's quality characteristic deviates from the ideal target at an arbitrary time. However, product use degrades the quality characteristic, and since the deviation changes over time, so does the quality loss. The quality loss caused by degradation of the quality characteristic has not been considered in most research. In this paper, the time value of money for quality loss and product degradation over time is integrated into the total cost model, and a new optimization model for the tolerance design of products with correlated characteristics is established. The discussion focuses on the multivariate quality loss function as an extension of the Taguchi loss function, which is used to model quality loss due to product degradation as a continuous cash flow under continuous compounding. The optimal tolerance design is achieved by minimizing the total cost, which is the sum of the manufacturing cost and the present worth of the expected quality loss. An illustrative example is presented to demonstrate the effectiveness of the proposed model.
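The present worth of a quality-loss cash flow under continuous compounding, as used in abstract 18, is the integral PW = ∫₀ᵀ L(t)·e^(−r·t) dt. A minimal numerical sketch follows (trapezoidal rule; the function and parameter names are illustrative, not the paper's):

```python
import math

def present_worth_loss(loss_rate, r, T, steps=10000):
    """Present worth of a time-varying quality-loss cash flow under
    continuous compounding: PW = integral of loss_rate(t)*exp(-r*t) over [0, T].

    loss_rate : function t -> loss per unit time (e.g. a degradation-driven
                Taguchi loss); r : interest rate; T : planning horizon.
    Uses the trapezoidal rule with the given number of steps.
    """
    dt = T / steps
    total = 0.0
    for i in range(steps + 1):
        t = i * dt
        w = 0.5 if i in (0, steps) else 1.0   # trapezoidal end-point weights
        total += w * loss_rate(t) * math.exp(-r * t)
    return total * dt
```

For a constant loss rate L the integral reduces to L·(1 − e^(−rT))/r, which makes a convenient sanity check for the numerical routine.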

19.
Economic design of a control chart involves determining its basic parameters such that a cost function is minimized. When statistical performance measures are also considered, this is referred to as the economic-statistical design. In this paper, a simplex-based Nelder–Mead algorithm is used in combination with a particle swarm metaheuristic to solve both the economic and economic-statistical designs of a MEWMA control chart. Results from extensive simulation experiments show that the particle swarm can lead the Nelder–Mead algorithm to better solutions. Furthermore, a comparative study is performed on the performance of three algorithms: Nelder–Mead, particle swarm optimization (PSO), and a hybrid of PSO and Nelder–Mead (PSO–NM). In this study, five different performance measures are taken into consideration, and results for both the economic and economic-statistical models are reported.

20.
CUSUM charts have been widely used in statistical process control (SPC) across industries for monitoring process shifts and supporting online measurement and distributed computing. This paper proposes an algorithm for the optimal design of a CUSUM control chart for detecting shifts in the process mean. The algorithm optimizes the sample size, sampling interval, control limit, and reference parameter of the CUSUM chart by minimizing the overall mean loss (ML) of a Taguchi loss function over the probability distribution of the random process mean shift. A new feature related to the exponential of the sample mean shift is elaborated. Comparative studies reveal that the proposed ML-CUSUM chart is considerably superior to the Shewhart ML- $\overline{X} $ chart and the conventional CUSUM chart in terms of the overall ML loss.
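The tabular CUSUM that abstracts 8 and 20 build on accumulates deviations from target against a reference value k and signals when either one-sided sum exceeds a decision interval h. A minimal textbook-style sketch, not the papers' loss-based variants:

```python
def cusum_signal(x, target, k, h):
    """Two-sided tabular CUSUM for mean monitoring.

    x      : sequence of observations (or sample means).
    k      : reference value, typically half the shift to be detected.
    h      : decision interval.
    Returns the index of the first out-of-control signal, or None.
    """
    c_plus = c_minus = 0.0
    for i, xi in enumerate(x):
        c_plus = max(0.0, c_plus + (xi - target) - k)    # upward drift
        c_minus = max(0.0, c_minus - (xi - target) - k)  # downward drift
        if c_plus > h or c_minus > h:
            return i
    return None
```

The loss-based designs in these abstracts keep this accumulation structure but choose (sample size, interval, h, k) to minimize an expected Taguchi-style loss rather than fixing them by ARL tables alone.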
