Similar Articles
Found 20 similar articles (search time: 78 ms)
1.
Degradation experiments are usually used to assess the lifetime distribution of highly reliable products, which are not likely to fail under traditional life tests or accelerated life tests. In such cases, if there exist product characteristics whose degradation over time can be related to reliability, then collecting 'degradation data' can provide information about product reliability. In general, degradation data are modeled by a nonlinear regression model with random coefficients. If the parameters of this model can be estimated, then the failure-time distribution can be estimated as well. Three basic methods are available for estimating those parameters: analytical, numerical, and approximate; they are chosen according to the complexity of the degradation path model used in the analysis. In this paper, the numerical and approximate methods are compared in a simulation study, assuming a simple linear degradation path model. A comparison with traditional failure-time analysis is also performed. The mean-squared error of the estimated 100pth percentile of the lifetime distribution is evaluated for each approach. The approaches are then applied to a real degradation data set. Copyright © 2008 John Wiley & Sons, Ltd.
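As an illustration of the approximate method described above, the sketch below simulates linear degradation paths with unit-to-unit random slopes, fits each unit's path by least squares, and extrapolates to a failure threshold to obtain pseudo failure times. All numbers (threshold, slope distribution, measurement times) are hypothetical, not taken from the paper:

```python
import math
import random
import statistics

random.seed(1)
D_f = 10.0                   # hypothetical critical degradation level
t_obs = [50, 100, 150, 200]  # measurement times in hours

# Simulate 30 units whose linear degradation slope varies from unit to unit
# (random-coefficient model); observations carry small measurement noise.
true_slopes = [random.lognormvariate(math.log(0.03), 0.25) for _ in range(30)]
paths = [[b * t + random.gauss(0.0, 0.1) for t in t_obs] for b in true_slopes]

def fit_slope(ts, ys):
    """Least-squares slope of a line through the origin."""
    return sum(t * y for t, y in zip(ts, ys)) / sum(t * t for t in ts)

# "Approximate" method: extrapolate each fitted path to the threshold to get
# one pseudo failure time per unit, then summarize their distribution.
pseudo_failure_times = [D_f / fit_slope(t_obs, ys) for ys in paths]
median_life = statistics.median(pseudo_failure_times)
```

A lifetime distribution (e.g. Weibull or lognormal) would then be fitted to the pseudo failure times to estimate percentiles.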

2.
Numerous papers have already reported various results on the electrical and optical performance of GaAs-based materials for optoelectronic applications. Other papers have proposed methodologies for a classical estimation of the reliability of GaAs compounds using life-testing methods on a few thousand samples over 10 000 hours of testing. In contrast, fewer papers have studied the complete relation between degradation laws, failure mechanisms, and the estimation of the lifetime distribution using accelerated ageing tests with a short test duration, low acceleration factors and analytical extrapolation. In this paper, we report results for commercial InGaAs/GaAs 935 nm packaged light emitting diodes (LEDs) using electrical and optical measurements versus ageing time. Cumulative failure distributions are calculated using degradation laws and process distribution data of optical power. A complete methodology is described, proposing an accurate reliability model built from experimental determination of the failure mechanism (defect diffusion) for this technology. Electrical and optical characterizations are used with temperature dependence, short-duration accelerated tests (less than 1500 h) with an increase in bias current (up to 50%), a small number of samples (less than 20) and weak acceleration factors (up to 240). Copyright © 2005 John Wiley & Sons, Ltd.

3.
Some life tests are terminated with few or no failures. In such cases, a recent approach is to obtain degradation measurements of product performance that may contain useful information about product reliability. Generally, degradation paths of products are modeled by a nonlinear regression model with random coefficients. If the parameters of this model can be estimated, then the time-to-failure distribution can be estimated as well. In some cases, however, the patterns of a few degradation paths differ from those of the majority of paths in a test. This study therefore develops a weighted method based on a fuzzy clustering procedure for robust estimation of the underlying parameters and the time-to-failure distribution. The method is illustrated on a real data set. Copyright © 2000 John Wiley & Sons, Ltd.

4.
Based on failures of a parallel-series system, a new distribution called the geometric-Poisson-Rayleigh distribution is proposed, and some of its properties are discussed. A real data set is used to compare the new distribution with six other distributions. Progressive-stress accelerated life tests are considered in which the lifetime of an item under use conditions follows the geometric-Poisson-Rayleigh distribution. It is assumed that the scale parameter of the distribution satisfies the inverse power law, so that the stress is a nonlinear increasing function of time, and that the cumulative exposure model for the effect of changing stress holds. Based on type-I progressive hybrid censoring with binomial removals, maximum likelihood and Bayes (using linear-exponential and general entropy loss functions) estimation methods are used to estimate the involved parameters. Several point predictors, including the maximum likelihood, conditional median, best unbiased, and Bayes point predictors for future order statistics, are obtained. The Bayes estimates are computed using a Markov chain Monte Carlo algorithm. Finally, a simulation study is conducted, and numerical computations are performed to compare the implemented estimation and prediction methods.
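The inverse power law assumed for the scale parameter above can be sketched in a few lines; the stress values and exponent below are purely illustrative, not from the paper:

```python
def inverse_power_life(V, c, gamma):
    """Characteristic life under constant stress V per the inverse power law:
    life = c / V**gamma (c and gamma are model parameters)."""
    return c / V ** gamma

def acceleration_factor(V_use, V_acc, gamma):
    """Ratio of life at use stress to life at accelerated stress; the
    constant c cancels, leaving (V_acc / V_use)**gamma."""
    return (V_acc / V_use) ** gamma

# Hypothetical example: doubling the stress with gamma = 3 gives an 8x
# acceleration factor.
af = acceleration_factor(V_use=110.0, V_acc=220.0, gamma=3.0)  # -> 8.0
```

Under a progressive-stress test, V would itself be an increasing function of time and the cumulative exposure model would be applied on top of this relationship.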

5.
This paper presents a design-stage method for assessing the performance reliability of systems with multiple time-variant responses due to component degradation. The component degradation profiles over time are assumed known, and system degradation is related to component degradation using mechanistic models. Selected performance measures (e.g. responses) are related to their critical levels by time-dependent limit-state functions. System failure is defined as the non-conformance of any response, so unions of the multiple failure regions are required. For discrete time, set theory establishes the minimum union size needed to identify a true incremental failure region. A cumulative failure distribution function is built by summing incremental failure probabilities. A practical implementation of the theory approximates the probability of the unions by second-order bounds, and, for numerical efficiency, probabilities are evaluated by first-order reliability methods (FORM). The presented method is quite different from Monte Carlo sampling methods. The proposed method can be used to assess mean and tolerance design through simultaneous evaluation of quality and performance reliability. The work herein sets the foundation for an optimization method to control both quality and performance reliability and thus, for example, to estimate warranty costs and product recalls. An example from power engineering shows the details of the proposed method and the potential of the approach. Copyright © 2006 John Wiley & Sons, Ltd.

6.
Accelerated Degradation Tests: Modeling and Analysis (total citations: 4; self-citations: 0; citations by others: 4)
High-reliability systems generally require individual system components to have extremely high reliability over long periods of time. Short product development times require reliability tests to be conducted with severe time constraints, and frequently few or no failures occur during such tests, even with acceleration. Thus, it is difficult to assess reliability with traditional life tests that record only failure times. For some components, however, degradation measures can be taken over time. A relationship between component failure and amount of degradation makes it possible to use degradation models and data to make inferences and predictions about the failure-time distribution. This article describes degradation reliability models that correspond to physical failure mechanisms. We explain the connection between degradation reliability models and failure-time reliability models. Acceleration is modeled by an acceleration model that describes the effect that temperature (or another accelerating variable) has on the rate of the failure-causing chemical reaction. Approximate maximum likelihood estimation is used to estimate model parameters from the underlying mixed-effects nonlinear regression model. Simulation-based methods are used to compute confidence intervals for quantities of interest (e.g., failure probabilities). Finally, we use a numerical example to compare the results of accelerated degradation analysis and traditional accelerated life-test failure-time analysis.
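A minimal sketch of the temperature-acceleration idea described above, using the standard Arrhenius acceleration factor for a reaction-rate process; the activation energy and temperatures below are hypothetical:

```python
import math

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def arrhenius_af(Ea, T_use_C, T_acc_C):
    """Arrhenius acceleration factor: ratio of the failure-causing reaction
    rate at the accelerated temperature to the rate at the use temperature.
    Ea is the activation energy in eV; temperatures are in Celsius."""
    T_use = T_use_C + 273.15
    T_acc = T_acc_C + 273.15
    return math.exp(Ea / BOLTZMANN_EV * (1.0 / T_use - 1.0 / T_acc))

# Hypothetical activation energy of 0.7 eV; accelerating from 40 C to 85 C
# speeds the degradation reaction up by roughly a factor of 25.
af = arrhenius_af(0.7, 40.0, 85.0)
```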

7.
8.
This article considers the design of two-stage reliability test plans. In the first stage, a bogey test is performed, which allows the user to demonstrate reliability at a high confidence level. If the lots pass the bogey test, a reliability sampling test is applied to the lots in the second stage. The purpose of the proposed sampling plan is to test the mean time to failure of the product as well as the minimum reliability at bogey. Under the assumption that the lifetime follows a Weibull distribution with a known shape parameter, the two-stage reliability sampling plans with bogey tests are developed and tables for users are constructed. An illustrative example is given, and the effects of errors in the estimate of the Weibull shape parameter are investigated. A comparison of the proposed two-stage test with the corresponding bogey and one-stage tests is also performed. Copyright © 2012 John Wiley & Sons, Ltd.
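The bogey-test idea above can be illustrated with the classical zero-failure demonstration calculation; under a Weibull assumption with known shape, testing each unit longer than one bogey reduces the required sample size. This is a generic textbook sketch, not the paper's exact plan:

```python
import math

def zero_failure_sample_size(R_target, confidence):
    """Units needed in a zero-failure bogey test to demonstrate reliability
    R_target at the bogey time with the given confidence:
    confidence = 1 - R_target**n, so n = ln(1 - C) / ln(R)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(R_target))

def extended_test_sample_size(R_target, confidence, beta, test_ratio):
    """Weibull with known shape beta: testing each unit for test_ratio times
    the bogey duration scales the effective sample size by test_ratio**beta."""
    n0 = math.log(1.0 - confidence) / math.log(R_target)
    return math.ceil(n0 / test_ratio ** beta)

n = zero_failure_sample_size(0.90, 0.90)                 # 22 units at one bogey
n_ext = extended_test_sample_size(0.90, 0.90, 2.0, 1.5)  # fewer units, 1.5 bogeys
```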

9.
As a key aircraft component, hydraulic piston pumps must be developed with high reliability. However, collecting failure-time data for such pumps for reliability analysis is a major challenge. To save testing time, performance degradation data obtained from degradation tests can be used for quick reliability estimation of hydraulic piston pumps. This paper proposes an engineering-driven performance degradation analysis method that considers the nature of mechanical wear in hydraulic piston pumps. First, the failure mechanism of a type of hydraulic piston pump is investigated. Taking into account the close relationship between the degradation rate and the failure mechanism, an inverse Gaussian (IG) process model with a variable rate is developed to describe the degradation behavior of the pump. Under this model, a Bayesian statistical method is developed for degradation data analysis, and the corresponding procedure for model parameter estimation and reliability evaluation is presented. The proposed degradation analysis method is illustrated using real experimental data. The results show that the engineering-driven approach is quite effective in evaluating the lifetime of the hydraulic piston pump and will improve the overall reliability of aircraft operation in the field.

10.
Today, the distribution most widely used in reliability analysis to describe the behavior of electronic products under voltage profiles is the Weibull distribution. Nevertheless, the Weibull distribution does not provide a good fit to lifetime datasets that exhibit bathtub-shaped or upside-down bathtub-shaped (unimodal) failure rates, which are often encountered in the reliability analysis of electronic devices. In this paper, a reliability model based on the beta-Weibull distribution and the inverse power law is proposed. This new model provides a better approach to modeling the performance and fit of the lifetimes of electronic devices. To estimate the parameters of the proposed model, a Bayesian analysis is used. A case study based on the lifetime of a surface-mounted electrolytic capacitor is presented; the results show that the estimates from the proposed model differ from those of the inverse power law-Weibull model and that this directly affects the mean time to failure, the failure rate, and the behavior and performance of the capacitor under analysis.

11.
Lower percentiles of product lifetime are useful for engineers seeking to understand product failure and to avoid costly failure claims. This paper proposes a percentile re-parameterization model to help reliability engineers obtain better lower-percentile estimates in accelerated life tests under the Weibull distribution. A log transformation converts the Weibull distribution into a smallest extreme value distribution. The location parameter of the smallest extreme value distribution is re-parameterized by a particular 100pth percentile, and the scale parameter is allowed to be nonconstant. Maximum likelihood estimates of the model parameters are derived. Confidence intervals for the percentiles are constructed based on parametric and nonparametric bootstrap methods. An illustrative example and a simulation study are presented to show the appropriateness of the method. The simulation results show that the re-parameterization model performs better than the traditional model in the estimation of lower percentiles, in terms of relative bias and relative root mean squared error. Copyright © 2017 John Wiley & Sons, Ltd.
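The re-parameterization described above can be sketched numerically: on the log scale the Weibull becomes a smallest extreme value (SEV) distribution, and its location parameter can be written in terms of the 100pth percentile of interest. The parameter values below are illustrative only:

```python
import math

def sev_quantile_z(p):
    """Standard smallest-extreme-value quantile: z_p = log(-log(1 - p))."""
    return math.log(-math.log(1.0 - p))

def weibull_percentile(eta, beta, p):
    """100p-th percentile of a Weibull(shape beta, scale eta) distribution."""
    return eta * (-math.log(1.0 - p)) ** (1.0 / beta)

# On the log scale: mu = log(eta), sigma = 1/beta, and the SEV percentile is
# y_p = mu + sigma * z_p.  Re-parameterizing, mu = y_p - sigma * z_p, so the
# model can be fitted directly in terms of the percentile y_p of interest.
eta, beta, p = 1000.0, 2.0, 0.10   # hypothetical Weibull parameters
y_p = math.log(eta) + (1.0 / beta) * sev_quantile_z(p)
t_p = weibull_percentile(eta, beta, p)   # exp(y_p) equals t_p
```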

12.
Reliability modeling of fault-tolerant systems subject to shocks and natural degradation is important yet difficult for engineers, because the two external stressors are often positively correlated. Motivated by the fact that most radiation-induced failures arise from these two external stressors, a degradation-shock-based approach is proposed to model the failure process. The proposed model accommodates two failure modes: hard failure caused by shocks and soft failure caused by degradation. We consider a generalized m-δ shock model for systems with fault-tolerant design: failure occurs if the time lag between m sequential shocks is less than δ hours or if degradation crosses a critical threshold. An example concerning memory chips used in space is presented to demonstrate the applicability of the proposed model. Copyright © 2015 John Wiley & Sons, Ltd.
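A Monte Carlo sketch of the generalized m-δ shock model described above, assuming hypothetical Poisson shock arrivals and a normally distributed end-of-mission degradation level (neither taken from the paper):

```python
import random

random.seed(0)

def fails(shock_times, degradation, m=3, delta=1.0, threshold=100.0):
    """Generalized m-delta shock model: the system fails if any m sequential
    shocks arrive within delta hours, or degradation crosses the threshold."""
    if degradation >= threshold:
        return True
    for i in range(len(shock_times) - m + 1):
        if shock_times[i + m - 1] - shock_times[i] < delta:
            return True
    return False

def simulate_once(rate=0.1, horizon=1000.0):
    """One mission: Poisson shock arrivals plus a soft-failure channel."""
    t, shocks = 0.0, []
    while True:
        t += random.expovariate(rate)
        if t > horizon:
            break
        shocks.append(t)
    degradation = random.gauss(85.0, 8.0)  # hypothetical end-of-mission level
    return fails(shocks, degradation)

p_fail = sum(simulate_once() for _ in range(2000)) / 2000.0
```

A full treatment would also model the positive correlation between shocks and degradation, which this independence sketch ignores.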

13.
For a period of mission time, only zero-failure data can be obtained for high-quality, long-life products. In zero-failure-data reliability assessment, current models cannot obtain point estimates and confidence interval estimates of the distribution parameters simultaneously, and the credibility of the assessment results may be reduced if both are obtained at the same time. This paper proposes a new model for this consistency problem. In the proposed model, point estimates of reliability are obtained from the lifetime probability distribution derived by the matching distribution curve method, while confidence interval estimates of reliability are obtained by using new samples generated from the lifetime probability distribution according to the parametric bootstrap method. Analysis of zero-failure data from torque motors after real operation shows that the new model not only meets the requirements of reliability assessment but also improves the accuracy of reliability interval estimation.

14.
As product reliability increases, collecting time-to-failure data is becoming difficult, and degradation-based methods have gained popularity. In this paper, a novel multi-hidden semi-Markov model is proposed to identify degradation and estimate the remaining useful life of a system. Multiple fused features are used to describe the degradation process so as to improve effectiveness and accuracy. The similarities of the features are described by a new variable, combined with forward and backward variables, to reduce computational effort. The degradation state is identified using a modified Viterbi algorithm, in which a linear function describes the contribution of each feature to state recognition. Subsequently, the remaining useful life can be forecasted by backward recursive equations. A case study is presented, and the results demonstrate the validity and effectiveness of the proposed method. Copyright © 2012 John Wiley & Sons, Ltd.

15.
The aim of this paper is to investigate degradation modeling and reliability assessment for products under irregular time-varying stresses. Conventional degradation models have been used extensively in the literature to characterize degradation processes under deterministic stresses. However, time-varying stress, which may affect degradation processes, is widespread in field conditions. This paper extends the general degradation-path model by considering the effects of time-varying stresses; the new model captures the influence of varying stresses on performance characteristics. A nonlinear least squares method is used to estimate the unknown parameters of the proposed model, and a bootstrap algorithm is adopted to compute confidence intervals for the mean time to failure and percentiles of the failure-time distribution. Finally, a case study of lithium-ion cells is presented to validate the proposed method. Copyright © 2015 John Wiley & Sons, Ltd.
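The bootstrap step described above can be sketched as a nonparametric percentile interval for the mean time to failure; the pseudo failure times below are invented for illustration, standing in for times extrapolated from fitted degradation paths:

```python
import random
import statistics

random.seed(42)

# Hypothetical pseudo failure times (hours) from fitted degradation paths.
failure_times = [412.0, 388.0, 455.0, 430.0, 401.0, 395.0, 468.0, 420.0,
                 377.0, 441.0, 409.0, 433.0]

def bootstrap_ci_mean(data, n_boot=5000, alpha=0.10):
    """Nonparametric bootstrap percentile CI for the mean time to failure:
    resample with replacement, recompute the mean, take empirical quantiles."""
    means = sorted(
        statistics.fmean(random.choices(data, k=len(data)))
        for _ in range(n_boot)
    )
    lo = means[int(n_boot * alpha / 2)]
    hi = means[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi

mttf = statistics.fmean(failure_times)
lo, hi = bootstrap_ci_mean(failure_times)
```

The same resampling loop applies to any percentile of the failure-time distribution by replacing the mean with the desired quantile.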

16.
Degradation testing is an effective tool for evaluating the reliability of highly reliable products, and many data collection methods have been proposed in the literature. Some of these assume that only degradation values are recorded, while others assume failure times are available. However, most research has been devoted to proposing parameter estimates or to designing degradation tests for a specific sampling method; the differences between these commonly used methods have rarely been investigated. The lack of comparisons between sampling methods has made it difficult to select an appropriate means of collecting data, and it remains unclear whether obtaining extra information (e.g., exact failure times) is useful for making statistical inferences. In this paper, we assume that the degradation path of a product follows a Wiener degradation process, and we summarize several data collection methods. Maximum likelihood estimates of the parameters and their variance-covariance matrices are derived for each type of data. Several commonly used optimization criteria for designing a degradation test are used to compare estimation efficiency, and sufficient conditions under which one method is better than the others are proposed. Upper bounds on estimation efficiency are also investigated. Our results provide useful guidelines for choosing a sampling method, as well as its design variables, to obtain efficient estimation. A simulated example based on real light-emitting diode data is studied to verify the theoretical results under a moderate sample size.
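For the Wiener-process assumption above, the maximum likelihood estimates have a simple closed form when degradation is observed at equally spaced times, because the increments are independent normal variables with mean mu*dt and variance sigma^2*dt. A minimal sketch with simulated data (all parameter values hypothetical):

```python
import random

random.seed(7)
mu_true, sigma_true = 0.5, 0.2   # drift and diffusion of the Wiener process
dt, n_steps, n_units = 1.0, 50, 20

# Simulate degradation increments for each unit: dX ~ N(mu*dt, sigma^2*dt).
increments = [
    [random.gauss(mu_true * dt, sigma_true * dt ** 0.5) for _ in range(n_steps)]
    for _ in range(n_units)
]

# Closed-form MLEs when observation times are equally spaced: the drift is
# total degradation over total time, and the variance estimate pools the
# squared deviations of the increments.
all_dx = [dx for unit in increments for dx in unit]
total_time = len(all_dx) * dt
mu_hat = sum(all_dx) / total_time
sigma2_hat = sum((dx - mu_hat * dt) ** 2 for dx in all_dx) / (len(all_dx) * dt)
```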

17.
Burn-in is a quality control process used to minimize the warranty cost of a product by screening out defective units through prior operation for a period of time before sale. Two decision criteria used to determine the optimal burn-in time are the maximization of the reliability of the delivered product and the minimization of the total cost, which is composed of the cost of the burn-in process and the cost of warranty claims. Because of uncertainty regarding the underlying lifetime distribution of the product, both the product reliability and the total cost are random variables. In this paper, the uncertainty in reliability and cost is quantified by Bayesian analysis: the joint distribution of reliability and cost is inferred from the uncertainty distribution of the parameters of the product lifetime distribution. To incorporate the uncertainty in reliability and cost, as well as the tradeoff between them, into the selection of the optimal burn-in time, a joint utility function of reliability and cost is constructed using their joint distribution. The optimal burn-in time is selected as the time that maximizes this joint utility function. Copyright © 2010 John Wiley & Sons, Ltd.
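A simplified sketch of the burn-in tradeoff described above, using a hypothetical two-subpopulation exponential mixture and a plain expected-cost criterion rather than the paper's Bayesian joint utility; all rates and costs are invented:

```python
import math

# Hypothetical mixed population: a weak subpopulation (rate lam_w per hour)
# that burn-in is meant to screen out, and a strong one (rate lam_s).
p_weak, lam_w, lam_s = 0.05, 0.005, 1e-5
c_burn, c_claim, warranty = 0.002, 50.0, 8760.0  # $/unit-hour, $/claim, hours

def expected_cost(b):
    """Burn-in cost plus expected warranty cost for a unit that survives a
    burn-in of b hours (exponential lifetimes are memoryless)."""
    surv = p_weak * math.exp(-lam_w * b) + (1 - p_weak) * math.exp(-lam_s * b)
    p_fail = (p_weak * math.exp(-lam_w * b) * (1 - math.exp(-lam_w * warranty))
              + (1 - p_weak) * math.exp(-lam_s * b)
              * (1 - math.exp(-lam_s * warranty))) / surv
    return c_burn * b + c_claim * p_fail

# Grid search for the cost-minimizing burn-in time: longer burn-in removes
# more weak units but costs more operating time.
best_b = min(range(0, 1001, 10), key=expected_cost)
```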

18.
Owing to usage, environment and aging, the condition of a system deteriorates over time. Regular maintenance is often conducted to restore its condition and to prevent failures from occurring. In this situation, the process is considered stable, and thus statistical process control charts can be used to monitor it. The monitoring can help in deciding whether further maintenance is worthwhile or whether the system has deteriorated to a state where regular maintenance is no longer effective. When modeling a deteriorating system, lifetime distributions with increasing failure rate are more appropriate. However, for a regularly maintained system, the failure-time distribution can be approximated by an exponential distribution with an average failure rate that depends on the maintenance interval. In this paper, we adopt a modification of a time-between-events control chart, i.e. the exponential chart, for monitoring the failure process of a maintained Weibull-distributed system. We study how changes in the scale parameter of the Weibull distribution, with the shape parameter held fixed, affect the sensitivity of the exponential chart. This paper illustrates an approach to integrating maintenance decisions with statistical process monitoring methods. Copyright © 2008 John Wiley & Sons, Ltd.
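The exponential chart mentioned above uses probability limits derived from the exponential distribution of times between failures; the Weibull-to-average-rate step below is a sketch of the approximation the abstract describes (cumulative hazard over one maintenance interval divided by the interval length), with illustrative parameter values:

```python
import math

def exponential_chart_limits(mtbf, alpha=0.0027):
    """Probability limits for an exponential time-between-events chart:
    F(LCL) = alpha/2 and F(UCL) = 1 - alpha/2 for F(t) = 1 - exp(-t/mtbf)."""
    lcl = -mtbf * math.log(1.0 - alpha / 2.0)
    ucl = -mtbf * math.log(alpha / 2.0)
    return lcl, ucl

def weibull_average_rate(beta, eta, tau):
    """Assumed approximation: average failure rate of a Weibull(beta, eta)
    system restored every tau hours, i.e. H(tau)/tau with H(t) = (t/eta)**beta."""
    return (tau / eta) ** beta / tau

# Hypothetical maintained system: shape 2, scale 1000 h, maintained every 100 h.
rate = weibull_average_rate(beta=2.0, eta=1000.0, tau=100.0)
lcl, ucl = exponential_chart_limits(1.0 / rate)
```

A failure time below the LCL signals that failures are arriving too quickly, i.e. that maintenance may no longer be restoring the system effectively.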

19.
For reliability assessment based on accelerated degradation tests (ADTs), an appropriate parameter estimation method is very important because it affects extrapolation and prediction accuracy. The well-adopted maximum likelihood estimation (MLE) method focuses on interpolation fitting and obtains results by maximizing the likelihood of the observations. However, the best interpolation fit does not necessarily yield the best extrapolation. In this paper, therefore, a pseudo-MLE (P-MLE) method is proposed to improve the prediction accuracy of constant-stress ADTs by considering degradation mechanism equivalence under the Wiener process. In particular, the degradation mechanism equivalence is characterized by a mechanism equivalence factor, which represents the proportional relationship between degradation rate and variation. The mechanism equivalence factor is determined via a two-step method, and the other model parameters are estimated by the general MLE method. The asymptotic variances of the acceleration factors and of the p-quantile of product failure time under normal conditions are adopted to compare the statistical properties of the proposed method and the general MLE approach. Numerical examples show that the novel P-MLE method may not achieve maximum likelihood but can provide better prediction accuracy, especially when the sample size is limited.

20.
Degradation data analysis, which investigates the degradation processes of products to extrapolate lifetime properties, is an effective method for reliability analysis. However, degradation data that reflect a product's inherent randomness of degradation are often contaminated by measurement errors. To address this problem, this paper proposes a Wiener-based model with logistic-distributed measurement errors and adopts the Monte Carlo expectation-maximization method together with Gibbs sampling for parameter estimation. Based on the model and parameter estimates, an efficient algorithm is proposed for quick calculation of the maximum likelihood value, and the estimation of remaining useful lifetime is discussed. Simulation results show that the proposed model performs better and is more robust than the Wiener process with Gaussian noise. Finally, the application of the proposed model is illustrated by an example.
