Similar Documents
20 similar documents found.
1.
This article considers the design of two‐stage reliability test plans. In the first stage, a bogey test is performed, which allows the user to demonstrate reliability at a high confidence level. If the lots pass the bogey test, a reliability sampling test is applied to the lots in the second stage. The purpose of the proposed sampling plan is to test the mean time to failure of the product as well as the minimum reliability at bogey. Under the assumption that the lifetime follows a Weibull distribution with known shape parameter, two‐stage reliability sampling plans with bogey tests are developed and tables for users are constructed. An illustrative example is given, and the effects of errors in estimates of the Weibull shape parameter are investigated. A comparison of the proposed two‐stage test with the corresponding bogey and one‐stage tests is also performed. Copyright © 2012 John Wiley & Sons, Ltd.
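The article's plan tables are not reproduced here, but the standard success-run relation behind a first-stage bogey test can be sketched as follows. This is a minimal illustration, not the authors' two-stage procedure, and the 90%/90% values are arbitrary example inputs:

```python
import math

def bogey_sample_size(reliability, confidence):
    """Number of units that must survive a zero-failure bogey test to
    demonstrate `reliability` at the bogey life with the given confidence:
    n = ln(1 - C) / ln(R), rounded up."""
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

# Classic "90/90" bogey test: demonstrate R = 0.90 with 90% confidence.
n = bogey_sample_size(0.90, 0.90)  # 22 units, all surviving the bogey life
```

With a known Weibull shape parameter, the same relation extends to test durations other than the bogey life, which is what makes the two-stage construction possible.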

2.
Degradation tests are an alternative to lifetime tests and accelerated lifetime tests in reliability studies. Based on the degradation process of a product quality characteristic over time, degradation tests provide enough information to estimate the time‐to‐failure distribution. Estimation methods such as the analytical, numerical, or approximation methods can be used to obtain the time‐to‐failure distribution; they are chosen according to the complexity of the degradation model used in the data analysis. An example of the application and analysis of degradation tests is presented in this paper to characterize the durability of a product and to compare the various estimation methods of the time‐to‐failure distribution. The example refers to a degradation process in an automobile tyre and was carried out to estimate its average distance covered and some percentiles of interest. Copyright © 2004 John Wiley & Sons, Ltd.

3.
Products with high reliability and long lifetimes undergo different types of stresses in use conditions, and often there are multiple performance indicators that gradually degrade over time. An accelerated degradation test (ADT) with multiple stresses and multiple degradation measures (MSMDM) may provide a more accurate prediction of the lifetime of these products. However, an ADT requires a moderate sample size, which is not practical for newly developed or costly products with only a few available test specimens on hand. Therefore, in this study, a step‐stress ADT (SSADT) with MSMDM is developed. Designing an SSADT with MSMDM that yields accurate reliability estimates under several constraints is a difficult endeavor: previous methods apply only to cases with a single stress or a single degradation measure and are not suitable for SSADT with MSMDM. In this paper, an optimal design approach is proposed for SSADT with MSMDM, and its steps are demonstrated for a rubber sealed O‐ring to illustrate its validity. Results of the sensitivity analysis for the optimal test plan indicate robustness when the deviation of the model parameters is within 10% of the estimated values. Copyright © 2017 John Wiley & Sons, Ltd.

4.
For a period of mission time, only zero‐failure data can be obtained for high‐quality, long‐life products. In reliability assessment with zero‐failure data, current models cannot obtain the point estimates and confidence interval estimates of the distribution parameters simultaneously, and the credibility of the assessment results may be reduced if they are obtained at the same time. A new model is proposed in this paper for this consistency problem. In the proposed model, point estimates of reliability are obtained from the lifetime probability distribution derived by the matching distribution curve method, while confidence interval estimates of reliability are obtained from new samples generated from that lifetime probability distribution via the parametric bootstrap method. Analysis of zero‐failure data from torque motors after real operation shows that the new model not only meets the requirements of reliability assessment but also improves the accuracy of the reliability interval estimates.
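The matching distribution curve method itself is not described in the abstract, but the parametric bootstrap step for interval estimation can be sketched. This is a toy example with an assumed Weibull lifetime model (shape 2.0, scale 500, 20 units per resample), not the authors' implementation:

```python
import math
import random

def weibull_reliability(t, shape, scale):
    """Weibull survival function R(t) = exp(-(t/scale)^shape)."""
    return math.exp(-((t / scale) ** shape))

def bootstrap_reliability_ci(t, shape, scale, n_units=20, n_boot=2000,
                             alpha=0.10, seed=1):
    """Parametric bootstrap: resample lifetimes from the fitted Weibull,
    re-estimate the scale (shape held known), and take percentiles of the
    resulting R(t) estimates as a (1 - alpha) confidence interval."""
    rng = random.Random(seed)
    stats = []
    for _ in range(n_boot):
        # Inverse-CDF sampling from Weibull(shape, scale).
        sample = [scale * (-math.log(rng.random())) ** (1.0 / shape)
                  for _ in range(n_units)]
        # MLE of scale for known shape: (mean of t_i^shape)^(1/shape).
        scale_hat = (sum(x ** shape for x in sample) / n_units) ** (1.0 / shape)
        stats.append(weibull_reliability(t, shape, scale_hat))
    stats.sort()
    lo = stats[int(alpha / 2 * n_boot)]
    hi = stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

In the paper the resampling distribution comes from the matching distribution curve fit rather than from assumed true parameters; the percentile step is the same.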

5.
For costly and dangerous experiments, growing attention has been paid to the reliability analysis of zero‐failure data, with many new findings worldwide, especially in China. Existing reliability theory relies on a known lifetime distribution, such as the Weibull or gamma distribution, and is therefore ineffective when the lifetime probability distribution is unknown. To this end, this article proposes the grey bootstrap method from information‐poor theory for the reliability analysis of zero‐failure data when the probability distribution of lifetime is either known or unknown. The grey bootstrap method generates many simulated zero‐failure data from a few observed zero‐failure data and estimates the lifetime probability distribution by means of an empirical failure probability function defined in this article. The experimental investigation shows that the grey bootstrap method is effective in reliability analysis with only a few zero‐failure data and without any prior information on the lifetime probability distribution. Copyright © 2011 John Wiley & Sons, Ltd.

6.
When lifetimes follow a Weibull distribution with known shape parameter, a simple power transformation can be used to transform the data to the exponential case, which is much easier to analyze. In practice, the shape parameter is never known exactly, so it is important to investigate the effect of mis‐specifying this parameter. A recent article suggested that the Weibull‐to‐exponential transformation approach should not be used because the resulting confidence interval for the scale parameter has very poor statistical properties. However, it is of interest to study the transformation when the mean time to failure or reliability is to be estimated, which is the more common question. In this paper, the effect of mis‐specification of the Weibull shape parameter on these quantities is investigated. For reliability‐related quantities such as mean time to failure, percentile lifetime, and mission reliability, the Weibull‐to‐exponential transformation approach is generally acceptable. For cases where the data are highly censored or a small tail probability is of concern, further studies are needed, but these are known to be difficult statistical problems with no standard solutions. Copyright © 2000 John Wiley & Sons, Ltd.
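The transformation in question is the standard one: if T is Weibull with shape β and scale η, then T^β is exponential with mean η^β. A minimal sketch of estimating the MTTF this way (simulated data; the shape value 1.5 is an arbitrary example, not from the paper):

```python
import math
import random

def weibull_mttf_via_exponential(lifetimes, shape):
    """Power-transform Weibull data to exponential (shape assumed known),
    estimate the exponential mean, then back-transform to the Weibull MTTF."""
    y = [t ** shape for t in lifetimes]       # Y = T^beta is exponential
    theta_hat = sum(y) / len(y)               # exponential mean = eta^beta
    eta_hat = theta_hat ** (1.0 / shape)      # implied Weibull scale
    return eta_hat * math.gamma(1.0 + 1.0 / shape)  # MTTF = eta*Gamma(1+1/beta)

# Simulate 5000 Weibull(shape=1.5, scale=2.0) lifetimes by inverse CDF.
rng = random.Random(7)
data = [2.0 * (-math.log(rng.random())) ** (1.0 / 1.5) for _ in range(5000)]
mttf = weibull_mttf_via_exponential(data, shape=1.5)
# true MTTF = 2 * Gamma(1 + 1/1.5), roughly 1.81
```

Mis-specification, as studied in the paper, amounts to passing a wrong `shape` to the estimator while the data were generated with the true one.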

7.
Degradation modeling can be an alternative to the conventional life test in the reliability assessment of high‐quality products. This paper develops a Bayesian approach to the step‐stress accelerated degradation test. Reliability inference for the population is made based on the posterior distribution of the underlying parameters with the aid of the Markov chain Monte Carlo method. A further sequential reliability inference for an individual product under normal conditions is also proposed. A simulation study and an illustrative example are presented to show the appropriateness of the proposed method. Copyright © 2017 John Wiley & Sons, Ltd.

8.
Numerous papers have reported results on the electrical and optical performance of GaAs‐based materials for optoelectronic applications, and others have proposed methodologies for classical reliability estimation of GaAs compounds using life‐testing methods on a few thousand samples over 10 000 hours of testing. In contrast, fewer papers have studied the complete relation between degradation laws, the underlying failure mechanisms, and the estimation of the lifetime distribution using accelerated ageing tests with a short test duration, a low acceleration factor, and analytical extrapolation. In this paper, we report results for commercial InGaAs/GaAs 935 nm packaged light‐emitting diodes (LEDs) using electrical and optical measurements versus ageing time. Cumulative failure distributions are calculated using degradation laws and process distribution data of optical power. A complete methodology is described, proposing an accurate reliability model from experimental determination of the failure mechanism (defect diffusion) for this technology. Electrical and optical characterizations are used with temperature dependence, short‐duration accelerated tests (less than 1500 h) with an increase in bias current (up to 50%), a small number of samples (less than 20), and weak acceleration factors (up to 240). Copyright © 2005 John Wiley & Sons, Ltd.

9.
In this paper, a coupled reliability method for structural fatigue evaluation considering load shedding is first proposed based on probabilistic fracture mechanics, in which the uncertainties of the structural parameters are taken into account. The method is then applied to predict the fatigue reliability of a T‐welded structure with and without load shedding. The comparison shows that accounting for load shedding can improve the predicted structural fatigue reliability by reducing conservativeness. The influence of the load‐shedding coefficient on the fatigue failure probability of the T‐welded component is investigated, and it is found that this influence can be divided into three regions: the high, medium, and low fatigue failure areas. The low‐fatigue‐failure area is the most relevant when designing a T‐welded structure. The thickness of the T‐welded structure along the crack propagation direction is found to be one of the important design variables for fatigue reliability design, with the low‐fatigue‐failure zone used as one of the reliability constraints. A basic design framework for T‐welded structures is established to constrain the fatigue failure probability within the low‐fatigue‐failure area.

10.
The first‐order reliability method (FORM) has been the method most widely utilized for solving reliability‐based design optimization (RBDO) problems efficiently. However, the second‐order reliability method (SORM) is required to estimate the probability of failure accurately for highly nonlinear performance functions. Despite its accuracy, applying SORM to RBDO is quite challenging because of the unaffordable numerical burden of the Hessian calculation. To reduce this effort, a quasi‐Newton approach that approximates the Hessian is introduced in this study instead of calculating the true Hessian. The proposed SORM with the approximated Hessian requires only the computations already used in FORM, leading to very efficient and accurate reliability analysis. The proposed SORM also utilizes a generalized chi‐squared distribution to achieve better accuracy. Furthermore, a SORM‐based inverse reliability method is proposed: an accurate reliability index corresponding to a target probability of failure is updated using the proposed SORM, and two approaches for finding an accurate most probable point from the updated reliability index are proposed. The SORM‐based inverse analysis is then extended to RBDO to obtain a reliability‐based optimum design that satisfies the probabilistic constraints more accurately, even for highly nonlinear systems. The numerical results show that the proposed reliability analysis and RBDO achieve the efficiency of FORM and the accuracy of SORM at the same time. Copyright © 2014 John Wiley & Sons, Ltd.
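For context on the FORM baseline the paper improves upon: for a linear limit state g = R − S with independent normal resistance R and load S, FORM is exact, with reliability index β = (μR − μS)/√(σR² + σS²) and failure probability Φ(−β). A minimal sketch (example values are arbitrary, and this linear case is exactly where SORM's curvature correction is *not* needed):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def form_linear(mu_r, sig_r, mu_s, sig_s):
    """FORM for the linear limit state g = R - S with independent normal
    R (resistance) and S (load): returns (beta, probability of failure)."""
    beta = (mu_r - mu_s) / math.hypot(sig_r, sig_s)
    return beta, norm_cdf(-beta)

beta, pf = form_linear(10.0, 1.0, 7.0, 1.0)
```

For nonlinear g the FORM result becomes an approximation, and SORM corrects it using the curvatures (Hessian) of g at the most probable point, which is the expensive step the paper's quasi-Newton approximation targets.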

11.
Many industrial products consist of multiple components that are all necessary for system operation, and there is an abundance of literature on modeling the lifetimes of such components through competing risks models. During a product's life cycle, incremental design changes are common, made to improve reliability, to reduce costs, or because the availability of certain part numbers changes. These changes can affect product reliability but are often ignored in system lifetime modeling. By incorporating information about changes in part numbers over time (information that is readily available in most production databases), better accuracy can be achieved in predicting time to failure, yielding more accurate field‐failure predictions. This paper presents methods for estimating parameters and making predictions under this generational model, together with a comparison with existing methods through simulation. Our results indicate that the generational model has important practical advantages and outperforms the existing methods in predicting field failures. Copyright © 2016 John Wiley & Sons, Ltd.

12.
When dealing with practical problems of stress–strength reliability, one can work with fatigue life data and make use of the well‐known relation between stress and cycles until failure. For some materials, this kind of data can involve extremely large values. In this context, this paper discusses the problem of estimating the reliability index R = P(Y < X) for stress–strength reliability, where stress Y and strength X are independent q‐exponential random variables; this choice is based on the q‐exponential distribution's capability to model data with extremely large values. We develop the maximum likelihood estimator for the index R and analyze its behavior by means of simulated experiments. Moreover, confidence intervals are developed based on the parametric and nonparametric bootstrap. The proposed approach is applied to two case studies involving experimental data: the first concerns high‐cycle fatigue of ductile cast iron, whereas the second evaluates specimen size effects on the gigacycle fatigue properties of high‐strength steel. The adequacy of the q‐exponential distribution is assessed for both case studies, and point and interval estimates of the index R based on the maximum likelihood estimator are provided. A comparison of the q‐exponential with the Weibull and exponential distributions shows that the q‐exponential gives better results both for fitting the stress and strength experimental data and for the estimated R index. Copyright © 2016 John Wiley & Sons, Ltd.
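The index R = P(Y < X) has a simple closed form in the ordinary exponential limit of the q-exponential (q → 1), which makes a useful sanity check for any estimator. A minimal sketch comparing the closed form with a Monte Carlo estimate (rates are arbitrary example values; this is not the paper's q-exponential MLE):

```python
import random

def stress_strength_R_exact(lam_stress, lam_strength):
    """For independent exponential stress Y ~ Exp(lam_stress) and strength
    X ~ Exp(lam_strength): R = P(Y < X) = lam_stress/(lam_stress+lam_strength)."""
    return lam_stress / (lam_stress + lam_strength)

def stress_strength_R_mc(lam_stress, lam_strength, n=100_000, seed=3):
    """Monte Carlo estimate of R = P(Y < X) for the same model."""
    rng = random.Random(seed)
    wins = sum(rng.expovariate(lam_stress) < rng.expovariate(lam_strength)
               for _ in range(n))
    return wins / n
```

With heavier-tailed q-exponential variables (q > 1) the closed form no longer applies, which is why the paper resorts to maximum likelihood plus bootstrap intervals.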

13.
Thanks to continuously advancing technology and manufacturing processes, products and devices are becoming highly reliable. However, performing life tests of these products at normal operating conditions is extremely difficult, if not impossible, because of their long life spans. This can mean missed opportunities to introduce products to the market in a timely manner and, eventually, loss of market share. The problem is solved by accelerated life tests, in which the test units are subjected to stress levels higher than the normal usage level so that information on the lifetime parameters can be obtained more quickly. The lifetime at the design condition is then estimated through extrapolation using a regression model. In this work, the design optimization of a simple step‐stress accelerated life test under progressive type I censoring is studied with nonuniform step durations for assessing the reliability characteristics of a solar lighting device. Allowing intermediate censoring at the stress change time point, the nature of the optimal stress duration is demonstrated under various design criteria, including D‐optimality, C‐optimality, A‐optimality, and E‐optimality. The existence of these optimal designs is investigated in detail for exponential lifetimes with a single stress variable, and the effect of the intermediate censoring proportion on the design efficiency is assessed.

14.
A new technique for reliability and quality optimization of electronic components and assemblies, the so‐called in situ accelerated ageing technique with electrical testing, is presented. The technique is extremely useful for the building‐in approach to quality and reliability. First, it can be used to optimize an electronic component or assembly with respect to its quality and reliability performance at a very early stage, i.e. at the design level, at the level of materials selection, and at the level of identifying production techniques and defining production parameters. The typical test time is of the order of 24 hours, which is short enough to allow a design‐of‐experiments approach to quality and reliability optimization. Furthermore, the technique is also very useful for gaining a deeper understanding of the physico‐chemical processes that lead to failure. A number of practical examples where the technique has been successfully applied are discussed.

15.
Reliability optimization is an important and challenging topic in both engineering and industrial settings, as its objective is to design a highly reliable system that operates safely and efficiently under constraints. The redundancy allocation problem (RAP), one of the best‐known problems in reliability optimization, has been the subject of many studies over the past few decades. RAP aims to find the best structure and the optimal redundancy level for each subsystem; the main goal is to maximize the overall system reliability subject to constraints. In all previous RAP studies, the reliability of the components is considered constant during the system's mission time. However, reliability is time‐dependent and needs to be considered and monitored during the system's lifetime. In this paper, the reliability of components is treated as a function of time, and the RAP is reformulated by introducing a new criterion called 'mission design life', defined as the integral of the system reliability function over the mission time. We propose an efficient algorithm for this problem and demonstrate its performance on two examples. Furthermore, we demonstrate the importance of the new approach using a benchmark RAP problem. Copyright © 2017 John Wiley & Sons, Ltd.
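The 'mission design life' criterion, the integral of R(t) over the mission time, can be evaluated numerically for any candidate configuration. A minimal sketch (exponential components with an assumed rate of 0.01 per hour and a 100-hour mission, both arbitrary; not the paper's algorithm):

```python
import math

def mission_design_life(R, t_mission, steps=10_000):
    """Trapezoid-rule integral of the system reliability function R(t)
    over [0, t_mission] -- the 'mission design life' criterion."""
    h = t_mission / steps
    total = 0.5 * (R(0.0) + R(t_mission))
    total += sum(R(i * h) for i in range(1, steps))
    return total * h

lam = 0.01  # assumed component failure rate

def single(t):
    return math.exp(-lam * t)

def parallel2(t):
    # 1-out-of-2 active redundancy of identical components.
    return 1.0 - (1.0 - math.exp(-lam * t)) ** 2
```

Ranking candidate redundancy allocations by this integral, rather than by reliability at a single time point, is exactly what distinguishes the reformulated RAP.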

16.
In reliability engineering, load sharing is typically associated with systems in parallel configuration; examples include bridge support structures, electric power supply systems, and multiprocessor computing systems. We consider a reliability maximization problem for a high‐voltage commutation device in which the total voltage across the device is shared by components in series configuration. Here, increasing the number of load‐sharing components increases component‐level reliability (as the voltage load per component is reduced) but may decrease system‐level reliability (because of the increased number of components in series). We provide the solution for two popular life‐load models, the proportional hazards and the accelerated failure time models, with underlying exponential and Weibull distributions, for both single and dual failure modes.
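The trade-off can be made concrete with a toy version of the exponential case: n series components each carry voltage V/n, and the per-component failure rate grows log-linearly with its voltage share (a PH-type link with assumed parameters `lam0` and `gamma`; this is an illustration, not the paper's solution):

```python
import math

def series_sharing_reliability(n, V, t, lam0=1e-4, gamma=0.5):
    """Series stack of n exponential components, each carrying V/n, with
    assumed log-linear rate lam0 * exp(gamma * V / n). The series system
    rate is the sum of the component rates."""
    rate = n * lam0 * math.exp(gamma * V / n)
    return math.exp(-rate * t)

def best_n(V, t, n_max=50, **kw):
    """Component count maximizing system reliability at mission time t."""
    return max(range(1, n_max + 1),
               key=lambda n: series_sharing_reliability(n, V, t, **kw))
```

In this toy model the optimum balances the two effects analytically at n ≈ γV: minimizing n·exp(γV/n) over n gives d/dn = 0 at n = γV, so with γ = 0.5 and V = 20 the best count is 10.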

17.
In this article, the authors present a general methodology for age‐dependent reliability analysis of degrading or ageing components, structures, and systems. The methodology is based on Bayesian inference, with its ability to incorporate prior information, and on the idea that ageing can be viewed as an age‐dependent change of beliefs about reliability parameters (mainly the failure rate): beliefs change not only because new failure data or other information become available over time but also continuously, with the flow of time itself. The main objective of this article is to show practitioners clearly how Bayesian methods can be applied to risk and reliability analysis in the presence of ageing phenomena. The methodology describes step by step the failure rate analysis of ageing components, from building the Bayesian model to its verification and generalization with Bayesian model averaging, which, as the authors suggest, could serve as an alternative to various goodness‐of‐fit assessment tools and as a universal tool for coping with various sources of uncertainty. The proposed methodology can deal with sparse and rare failure events, as is the case for electrical components, piping systems, and various other highly reliable systems. In a case study of electrical instrumentation and control components, the methodology was applied to analyse age‐dependent failure rates together with the treatment of uncertainty due to age‐dependent model selection. Copyright © 2013 John Wiley & Sons, Ltd.
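The building block of such a failure-rate analysis is the conjugate gamma–Poisson update: a Gamma(α, β) prior on a constant failure rate combined with an observed failure count over an exposure time yields a Gamma(α + failures, β + exposure) posterior. A minimal sketch with arbitrary example numbers (the paper's age-dependent models and model averaging go well beyond this):

```python
def gamma_poisson_update(alpha, beta, failures, exposure_hours):
    """Conjugate Bayesian update for a constant failure rate:
    Gamma(alpha, beta) prior + Poisson(rate * exposure) likelihood
    -> Gamma(alpha + failures, beta + exposure) posterior."""
    return alpha + failures, beta + exposure_hours

def posterior_mean(alpha, beta):
    """Posterior mean failure rate of a Gamma(alpha, beta) distribution."""
    return alpha / beta

# Example: Jeffreys-like prior Gamma(0.5, 1e5), then 2 failures in 4e5 hours.
a, b = gamma_poisson_update(0.5, 1e5, failures=2, exposure_hours=4e5)
rate = posterior_mean(a, b)  # posterior mean failure rate per hour
```

Ageing enters when the rate is allowed to depend on component age, so the update is applied per age bin or through a parametric trend model, with Bayesian model averaging across the candidate trends.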

18.
In this paper, we demonstrate how we set up and executed an integrated reliability engineering process with the engineering team in the Light Duty (LD) Dodge Ram (DR) Truck chassis program. Organizationally, the LD DR chassis team consists of core engineering groups and supporting representatives from other related disciplines; the reliability engineer is a member of the team and serves as its reliability advocate and leader. The integrated reliability engineering process was customized and implemented in the LD DR chassis program. Many of the tools developed by the company's corporate quality office, such as design failure modes and effects analysis (DFMEA), design verification plan and reporting (DVP&R), and finite element analysis (FEA), were used to support the reliability engineering process, and an array of technical enablers such as Test Matrix and ReliUp was also developed to support its implementation. In the execution, the reliability engineer led the engineering team to set reliability targets, develop reliability work plans, facilitate up‐front design analysis, and review and integrate reliability test planning. The reliability engineer also set up the failure reporting, analysis and corrective action system (FRACAS) and managed reliability growth with the team. From the implementation, we have learned several things: (1) an integrated engineering team is crucial to developing a product better, quicker, and cheaper; (2) a good team leader is the key to product reliability; (3) a capable reliability engineer is the catalyst of a reliability engineering process; (4) the best culture in which to achieve reliability is a delicate balance between 'inside‐out' and 'outside‐in'; and (5) achieving reliability is far more important than measuring reliability. Copyright © 2004 John Wiley & Sons, Ltd.

19.
A Bayes approach is proposed to improve product reliability prediction by integrating failure information from both field performance data and accelerated life testing data. A product's field failure characteristics may not be directly extrapolable from accelerated life testing results because of variation in the field use condition that cannot be replicated in the lab‐test environment. A calibration factor is introduced to model the effect of uncertainty in the field stress on product lifetime; it is useful when the field performance of a new product must be inferred from its accelerated life test results and the product will be used in the same environment for which field failure data of older products are available. The proposed Bayes approach provides a proper mechanism for fusing information from various sources. The statistical inference procedure is carried out through the Markov chain Monte Carlo method. An example of an electronic device illustrates the use of the proposed method. Copyright © 2008 John Wiley & Sons, Ltd.

20.

Although DC current gain has been widely studied, its relation to device reliability has not yet been established. We show here that the variation of the current gain of bipolar junction transistors depends on device operating time and may be used as a failure indicator when no failure occurs in a test. From accelerated test data with zero failures, a method for computing device age is illustrated and compared with traditional methods. Owing to the composite nature of the current gain, we obtain an empirical formula that provides a quantitative anticipation of device life. A current gain degradation mechanism is also discussed.
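Using a degradation indicator in place of failures typically means fitting the degradation path and extrapolating to a failure threshold (a 'pseudo-life'). A minimal sketch with a linear path and made-up gain readings (the paper's empirical formula for current gain is not reproduced here):

```python
def pseudo_life(times, gains, threshold):
    """Least-squares linear fit of the degradation path gain(t), then
    extrapolation to the time at which the fit crosses `threshold`."""
    n = len(times)
    tbar = sum(times) / n
    gbar = sum(gains) / n
    slope = (sum((t - tbar) * (g - gbar) for t, g in zip(times, gains))
             / sum((t - tbar) ** 2 for t in times))
    intercept = gbar - slope * tbar
    return (threshold - intercept) / slope

# Hypothetical readings: gain drops 1 unit per 100 h; threshold at 90.
life = pseudo_life([0, 100, 200, 300], [100.0, 99.0, 98.0, 97.0],
                   threshold=90.0)
```

A pseudo-life computed this way per unit is what lets zero-failure accelerated data yield a lifetime estimate at all.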
