Similar Literature
20 similar documents were found.
1.
In the analysis of accelerated life testing (ALT) data, a stress-life model is typically used to relate results obtained at stressed conditions to those at the use condition. For example, the Arrhenius model has been widely used for accelerated testing involving high temperature. Motivated by the fact that some prior knowledge of particular model parameters is usually available, this paper proposes a sequential constant-stress ALT scheme and its Bayesian inference. Under this scheme, the test at the highest stress is conducted first to quickly generate failures. Then, using the proposed Bayesian inference method, information obtained at the highest stress is used to construct prior distributions for data analysis at lower stress levels. In this paper, two frameworks of the Bayesian inference method are presented, namely, the all-at-one prior distribution construction and the full sequential prior distribution construction. Assuming Weibull failure times, we (1) derive the closed-form expression for estimating the smallest extreme value location parameter at each stress level, (2) compare the performance of the proposed Bayesian inference with that of MLE by simulations, and (3) assess the risk of incorporating empirical engineering knowledge into ALT data analysis under the proposed framework. Step-by-step illustrations of both frameworks are presented using a real-life ALT data set. Copyright © 2008 John Wiley & Sons, Ltd.
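For context, a common Arrhenius-Weibull formulation of the kind this abstract refers to (a generic sketch, not necessarily the paper's exact parameterization) treats the log lifetime as a smallest extreme value (SEV) variable whose location parameter is linear in inverse absolute temperature:

\[
Y = \ln T \sim \mathrm{SEV}(\mu_i, \sigma), \qquad
\mu_i = \gamma_0 + \gamma_1 \, \frac{11605}{\mathrm{temp}_i\ (\mathrm{K})},
\]

so posterior knowledge of \(\gamma_0\), \(\gamma_1\) and \(\sigma\) gained at the highest temperature can be carried forward as prior information for the analyses at lower temperatures.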

2.
Reliability growth tests are often used for achieving a target reliability for complex systems via multiple test-fix stages with limited testing resources. Such tests can be sped up via accelerated life testing (ALT) where test units are exposed to harsher-than-normal conditions. In this paper, a Bayesian framework is proposed to analyze ALT data in reliability growth. In particular, a complex system with components that have multiple competing failure modes is considered, and the time to failure of each failure mode is assumed to follow a Weibull distribution. We also assume that the accelerated condition has a fixed time scaling effect on each of the failure modes. In addition, a corrective action with fixed ineffectiveness can be performed at the end of each stage to reduce the occurrence of each failure mode. Under the Bayesian framework, a general model is developed to handle uncertainty on all model parameters, and several special cases with some parameters being known are also studied. A simulation study is conducted to assess the performance of the proposed models in estimating the final reliability of the system and to study the effects of unbiased and biased prior knowledge on the system-level reliability estimates.
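As a point of reference (not taken from the paper itself), a series system whose J failure modes have independent Weibull failure times, with a fixed time-scaling acceleration factor \(A_j \ge 1\) for mode \(j\) under the accelerated condition, has reliability

\[
R_{\mathrm{sys}}(t) = \prod_{j=1}^{J} \exp\!\left[-\left(\frac{A_j t}{\eta_j}\right)^{\beta_j}\right],
\]

which reduces to the use-condition reliability when every \(A_j = 1\); a corrective action at the end of a stage could then be modeled, for example, as reducing the occurrence of mode \(j\) by inflating \(\eta_j\).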

3.
This paper develops a methodology to integrate reliability testing and computational reliability analysis for product development. The presence of information uncertainty such as statistical uncertainty and modeling error is incorporated. The integration of testing and computation leads to a more cost-efficient estimation of failure probability and life distribution than the tests-only approach currently followed by the industry. A Bayesian procedure is proposed to quantify the modeling uncertainty using random parameters, including the uncertainty in mechanical and statistical model selection and the uncertainty in distribution parameters. An adaptive method is developed to determine the number of tests needed to achieve a desired confidence level in the reliability estimates, by combining prior computational prediction and test data. Two kinds of tests, failure probability estimation and life estimation, are considered. The prior distribution and confidence interval of failure probability in both cases are estimated using computational reliability methods, and are updated using the results of tests performed during the product development phase.
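As a loose illustration of the kind of prior-plus-test updating described above (a minimal beta-binomial sketch with hypothetical numbers, not the paper's actual procedure), a failure probability predicted by computational reliability analysis can be encoded as a beta prior and updated with pass/fail test outcomes until the credible interval is tight enough:

    # Minimal beta-binomial sketch; the prior (a, b), test outcomes, and target
    # interval width below are all hypothetical illustration values.
    from scipy.stats import beta

    a, b = 2.0, 48.0            # prior roughly centered on a predicted p_f of 0.04
    target_width = 0.05         # stop testing once the 95% credible interval is this narrow

    def update(a, b, failures, trials):
        """Conjugate update of Beta(a, b) with `failures` out of `trials` tests."""
        return a + failures, b + (trials - failures)

    a, b = update(a, b, failures=1, trials=10)      # hypothetical test campaign
    lo, hi = beta.ppf([0.025, 0.975], a, b)         # 95% credible bounds on p_f
    enough_tests = (hi - lo) <= target_width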

4.
In this paper we present an approach for the Bayesian estimation of piecewise constant failure rates under the constraint that the constant value of the failure rate in an interval of time is greater than a function of its values in the preceding intervals. We apply this approach to the estimation of piecewise constant failure rates for conditional IFR, IFRA and NBU distributions. The prior distribution for the failure rate in each interval is specified through gamma distributions with functions of the failure rate values corresponding to the rest of the intervals as location parameters. Using this approach, the prior distribution parameters have interpretations through prior means and variances of the values of the piecewise constant failure rate. The posterior distributions and expected values can be found in terms of gamma functions, without the need for numerical integration. We apply this approach to a model for reliability estimation when two operational modes exist and the number of failures in each operational mode is unknown. Finally, a numerical example is presented in which simulations of posterior densities are carried out.
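For orientation, the unconstrained conjugate building block behind such models (a generic sketch, before the ordering constraints the paper imposes) is the gamma-Poisson pair: if interval \(i\) accumulates \(n_i\) failures over a total exposure time \(E_i\), then a \(\mathrm{Gamma}(a_i, b_i)\) prior on the constant rate \(\lambda_i\) gives

\[
\lambda_i \mid \text{data} \sim \mathrm{Gamma}(a_i + n_i,\; b_i + E_i),
\qquad
\mathbb{E}[\lambda_i \mid \text{data}] = \frac{a_i + n_i}{b_i + E_i},
\]

and the constrained construction replaces the fixed hyperparameters with functions of the rates in the other intervals.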

5.
Degradation modeling can be an alternative to the conventional life test in reliability assessment for high-quality products. This paper develops a Bayesian approach to the step-stress accelerated degradation test. Reliability inference for the population is made based on the posterior distribution of the underlying parameters, obtained with the aid of the Markov chain Monte Carlo method. Sequential reliability inference for an individual product under the normal use condition is also proposed. A simulation study and an illustrative example are presented to show the appropriateness of the proposed method. Copyright © 2017 John Wiley & Sons, Ltd.
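In generic terms (a sketch of the usual MCMC-based computation, not the paper's specific degradation model), once posterior draws \(\theta^{(1)}, \dots, \theta^{(M)}\) of the degradation-model parameters are available, population reliability at time \(t\) is estimated by averaging the conditional probability that the degradation path \(D(t)\) has not yet reached the failure threshold \(D_f\):

\[
\widehat{R}(t) = \frac{1}{M} \sum_{m=1}^{M} \Pr\!\left\{ D(t) < D_f \mid \theta^{(m)} \right\},
\]

and the same averaging, applied to a posterior conditioned on an individual unit's observed degradation, yields the sequential unit-level inference mentioned above.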

6.
In many situations, we want to accept or reject a population of small or finite size. In this paper, we describe Bayesian and non-Bayesian approaches to the reliability demonstration test based on samples from a finite population. The Bayesian method is an approach that combines prior experience with newer test data in the application of statistical tools for reliability quantification. When test time and/or sample quantity is limited, the Bayesian approach should be considered. In this paper, a non-Bayesian reliability demonstration test is considered for both finite and large population cases. The Bayesian approach with 'uniform' prior distributions, Polya prior distributions, and sequential sampling is also presented. Copyright © 2001 John Wiley & Sons, Ltd.
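For the large-population (binomial) case, the familiar non-Bayesian zero-failure "success run" relationship is a useful reference point (stated here generically; the paper's finite-population treatment modifies it): demonstrating reliability \(R\) at confidence \(C\) with no allowed failures requires

\[
1 - R^{n} \ge C \quad\Longrightarrow\quad n \ge \frac{\ln(1 - C)}{\ln R},
\]

so, for example, demonstrating \(R = 0.9\) at \(C = 0.9\) requires \(n \ge 22\) units tested without failure.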

7.
This article presents the development of a general Bayes inference model for accelerated life testing. The failure times at a constant stress level are assumed to follow a Weibull distribution, but strict adherence to a parametric time-transformation function is not required. Rather, prior information is used to indirectly define a multivariate prior distribution for the scale parameters at the various stress levels and the common shape parameter. Using this approach, Bayes point estimates as well as probability statements for use-stress (and accelerated) life parameters may be inferred from a host of testing scenarios. The inference procedure accommodates both the interval-data sampling strategy and the Type I censored sampling strategy for the collection of ALT data. The inference procedure uses well-known Markov chain Monte Carlo (MCMC) methods to derive posterior approximations. The approach is illustrated with an example.
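To make the MCMC machinery concrete (a self-contained random-walk Metropolis sketch with made-up data and weakly informative priors, not the multivariate prior construction the article develops), the posterior of a single-stress Weibull model with Type I censoring can be sampled as follows:

    # Random-walk Metropolis for a Weibull(eta, beta) model with Type I censoring.
    # The failure times, censoring indicators, priors, and tuning constants are
    # all hypothetical illustration values.
    import numpy as np

    t = np.array([210.0, 350.0, 480.0, 520.0, 600.0, 600.0])   # hours (made up)
    cens = np.array([False, False, False, False, True, True])  # True = still running at 600 h

    def log_post(s):
        log_eta, log_beta = s
        eta, beta_ = np.exp(log_eta), np.exp(log_beta)
        z = (t / eta) ** beta_
        log_f = np.log(beta_ / eta) + (beta_ - 1.0) * np.log(t / eta) - z   # failure density
        log_S = -z                                                          # survival (censored)
        log_lik = np.sum(np.where(cens, log_S, log_f))
        # Weakly informative normal priors on the log-parameters (assumption).
        log_prior = -0.5 * ((log_eta - 6.0) / 2.0) ** 2 - 0.5 * (log_beta / 1.0) ** 2
        return log_lik + log_prior

    rng = np.random.default_rng(1)
    state = np.array([np.log(500.0), np.log(1.5)])
    cur = log_post(state)
    draws = []
    for _ in range(20000):
        prop = state + rng.normal(scale=0.1, size=2)
        cand = log_post(prop)
        if np.log(rng.uniform()) < cand - cur:      # Metropolis accept/reject step
            state, cur = prop, cand
        draws.append(state.copy())
    eta_draws, beta_draws = np.exp(np.array(draws)[5000:].T)    # discard burn-in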

8.
In this paper, a Cox proportional hazards model with an error effect, applied to the study of an accelerated life test, is investigated. Bayesian inference using Markov chain Monte Carlo techniques is performed to estimate the parameters involved in the model and to predict reliability in an accelerated life test. The proposed model is applied to the analysis of knock sensor failure time data in which some observations are censored. The failure times at a constant stress level are assumed to follow a Weibull distribution. The analysis of the failure time data from the accelerated life test is used for posterior estimation of the parameters and prediction of the reliability function, as well as for comparison with the classical results from maximum likelihood estimation. Copyright © 2017 John Wiley & Sons, Ltd.
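For reference, the proportional hazards structure underlying such a model (written here in its generic form; the article's "error effect" adds a random term on top of it) specifies the hazard of a unit with stress covariate vector \(x\) as

\[
h(t \mid x) = h_0(t)\, \exp(\beta^{\top} x),
\]

so that raising the stress scales the hazard multiplicatively through \(\exp(\beta^{\top} x)\), while the baseline hazard \(h_0(t)\), here tied to a Weibull form, carries the time dependence.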

9.
For high-reliability products, the production cost is usually high and the lifetime is long, so failures may not be observable within a limited test time. In this paper, an accelerated experiment is employed in which the lifetime follows an exponential distribution whose failure rate is related exponentially to the acceleration factor. The underlying parameters are also assumed to have exponential prior distributions. A Bayesian zero-failure reliability demonstration test is designed to determine beforehand the minimum sample size and testing length subject to a specified reliability criterion. The probability of passing the test design, as well as the predictive probability for additional experiments, is also derived. Sensitivity analysis of the design is investigated through a simulation study. Copyright © 2009 John Wiley & Sons, Ltd.
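The conjugate core of a zero-failure design of this type can be sketched generically (the paper's exact prior structure and acceptance criterion may differ): if \(n\) units run for a test length \(\tau\) with no failures, the exponential-lifetime likelihood is \(e^{-n\lambda\tau}\), and a \(\mathrm{Gamma}(a, b)\) prior on the failure rate \(\lambda\) (the exponential prior being the case \(a = 1\)) yields

\[
\lambda \mid \text{zero failures} \sim \mathrm{Gamma}(a,\; b + n\tau),
\]

so \(n\) and \(\tau\) are chosen as the smallest values for which \(\Pr\{\lambda \le \lambda_0 \mid \text{zero failures}\}\) meets the specified reliability criterion.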

10.
Some life tests are terminated with few or no failures. In such cases, a recent approach is to obtain degradation measurements of product performance that may contain useful information about product reliability. Generally, degradation paths of products are modeled by a nonlinear regression model with random coefficients. If the parameters of this model can be estimated, then the time-to-failure distribution can be estimated as well. In some cases, the patterns of a few degradation paths differ from those of the majority of paths in a test. Therefore, this study develops a weighted method based on a fuzzy clustering procedure for robust estimation of the underlying parameters and the time-to-failure distribution. The method is illustrated with a real data set. Copyright © 2000 John Wiley & Sons, Ltd.
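In the standard random-coefficients formulation such studies build on (a generic sketch, not necessarily this paper's exact notation), the measured degradation of unit \(i\) at time \(t_{ij}\) is

\[
y_{ij} = \eta(t_{ij};\, \Theta_i) + \varepsilon_{ij},
\]

with unit-specific random coefficients \(\Theta_i\) and measurement error \(\varepsilon_{ij}\), and the time to failure is the first time the true path crosses a critical threshold \(D_f\), i.e. \(T_i = \inf\{t : \eta(t; \Theta_i) \ge D_f\}\); down-weighting atypical paths before estimating the distribution of \(\Theta_i\) is what the fuzzy-clustering weights are used for here.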

11.
Precisely predicting the remaining life of an individual unit plays an important role in condition-based maintenance, so Bayesian inference, which can integrate useful data from several sources to improve prediction accuracy, has become a research focus. Because accelerated degradation tests are widely applied to assess product reliability, a remaining-life prediction method based on Bayesian inference that takes accelerated degradation data as prior information is proposed. A Wiener process with random drift and diffusion parameters is used to model degradation data, and conjugate prior distributions of the random parameters are adopted. Because it is hard to estimate the hyperparameters from accelerated degradation data using an expectation-maximization algorithm, a data extrapolation method is developed: with acceleration factors, degradation data are extrapolated from the accelerated stress levels to the normal use stress level. The constant acceleration factor hypothesis is used to deduce the expression of the acceleration factor for a Wiener degradation model. Simulation tests are designed to validate the proposed method, and a method for constructing confidence levels for the remaining life predictions is also provided. Finally, a case study is used to illustrate the application of the developed method. Copyright © 2015 John Wiley & Sons, Ltd.
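For reference, the Wiener model and the constant-acceleration-factor result the abstract alludes to are usually written as follows (a generic sketch): degradation at stress level \(k\) evolves as

\[
X_k(t) = \mu_k t + \sigma_k B(t),
\]

and requiring the failure time at stress \(k\) to be a pure time rescaling of the failure time at the use stress, \(T_0 = A_k T_k\), forces

\[
A_k = \frac{\mu_k}{\mu_0} = \frac{\sigma_k^2}{\sigma_0^2},
\]

which is the relationship used to extrapolate accelerated degradation increments back to the use stress level.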

12.
In a Bayesian framework, the Dirichlet distribution is the conjugate distribution to the multinomial likelihood function, and so the analyst is required to develop a Dirichlet prior that incorporates available information. However, as it is a multiparameter distribution, choosing the Dirichlet parameters is less straightforward than choosing a prior distribution for a single parameter, such as p in the binomial distribution. In particular, one may wish to incorporate limited information into the prior, resulting in a minimally informative prior distribution that is responsive to updates with sparse data. In the case of binomial p or Poisson λ, the principle of maximum entropy can be employed to obtain a so-called constrained noninformative prior. However, even in the case of p, such a distribution cannot be written down in the form of a standard distribution (e.g., beta, gamma), and so a beta distribution is used as an approximation in the case of p. In the case of the multinomial model with parametric constraints, the approach of maximum entropy does not appear tractable. This paper presents an alternative approach, based on constrained minimization of a least-squares objective function, which leads to a minimally informative Dirichlet prior distribution. The alpha-factor model for common-cause failure, which is widely used in the United States, is the motivation for this approach, and is used to illustrate the method. In this approach to modeling common-cause failure, the alpha-factors, which are the parameters in the underlying multinomial model for common-cause failure, must be estimated from data that are often quite sparse, because common-cause failures tend to be rare, especially failures of more than two or three components, and so a prior distribution that is responsive to updates with sparse data is needed.
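The conjugacy that drives this construction can be stated generically (this is the textbook update, not the paper's least-squares prior itself): with common-cause event counts \(n = (n_1, \dots, n_K)\) following a multinomial distribution with alpha-factor parameters \(\alpha = (\alpha_1, \dots, \alpha_K)\), a \(\mathrm{Dirichlet}(a_1, \dots, a_K)\) prior updates to

\[
\alpha \mid n \sim \mathrm{Dirichlet}(a_1 + n_1, \dots, a_K + n_K),
\qquad
\mathbb{E}[\alpha_k \mid n] = \frac{a_k + n_k}{\sum_j a_j + N},
\]

with \(N = \sum_k n_k\); the point of a minimally informative choice of the \(a_k\) is to keep this posterior responsive when the observed counts are sparse.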

13.
In this paper, we present the concept of a novel control chart, which uses economic considerations within a real-options-inspired framework, together with the principles of Bayesian statistics, to produce a continuously updated estimate of the parameters of the actual process and thus to decide whether to continue running the process or to recalibrate it. The Bayesian estimate allows the decision maker to combine prior information about the process with the continuously incoming data in a natural, flexible manner. In the real options framework, at any given moment, we compare the cost of recalibrating the process with the cost of postponing the (optimal) decision until later. The decision is thus based on cost-benefit analysis rather than on statistically significant deviations from the in-control process. To keep a clear focus on the conceptual representation of the methodology, we consider a continuously sampled binary process. We derive the algorithm for the control chart, which in this discrete setting can also be represented as a table, a matrix, or a tree. We also investigate the performance of the method in different settings, with particular attention paid to the role of the Bayesian prior. Being flexible in prior beliefs leads to better results anywhere outside of the in-control process. Together, the Bayesian paradigm and the dynamic decision-making approach create a realistic representation of a real-life decision-making process.

14.
A Bayesian life test sampling plan is considered for products with a Weibull lifetime distribution that are sold under a warranty policy. It is assumed that the shape parameter of the distribution is a known constant, but the scale parameter is a random variable varying from lot to lot according to a known prior distribution. A cost model is constructed that involves three cost components: test cost, accept cost, and reject cost. A method of finding optimal sampling plans that minimize the expected average cost per lot is presented, and sensitivity analyses for the parameters of the lifetime and prior distributions are performed.

15.
A control chart can effectively reflect whether a manufacturing process is currently in control or not. The calculation of control limits has traditionally relied on the frequentist approach, which requires a large sample size for accurate estimation. A conjugate Bayesian approach is introduced to correct the error in control limits calculated with the traditional frequentist approach in multi-batch, low-volume production. Bartlett's test, the analysis of variance test and a standardisation treatment are used to construct a proper prior distribution in order to calculate the Bayes estimators of the process distribution parameters for the control limits. The case study indicates that this conjugate Bayesian approach performs better than the traditional frequentist approach when the sample size is small.

16.
We formulate and evaluate a Bayesian approach to probabilistic input modeling for simulation experiments that accounts for the parameter and stochastic uncertainties inherent in most simulations and that yields valid predictive inferences about outputs of interest. We use prior information to construct prior distributions on the parameters of the input processes driving the simulation. Using Bayes' rule, we combine this prior information with the likelihood function of sample data observed on the input processes to compute the posterior parameter distributions. In our Bayesian simulation replication algorithm, we estimate parameter uncertainty by independently sampling new values of the input-model parameters from their posterior distributions on selected simulation runs, and we estimate stochastic uncertainty by performing multiple (conditionally) independent runs with each set of parameter values. We formulate performance measures relevant to both Bayesian and frequentist input-modeling techniques, and we summarize an experimental performance evaluation demonstrating the advantages of the Bayesian approach.

17.
Variable-stress accelerated life testing trials are experiments in which each of the units in a random sample of units of a product is run under increasingly severe conditions to get information quickly on its life distribution. We consider a fatigue failure model in which accumulated decay is governed by a continuous Gaussian process W(y) whose distribution changes at certain stress change points \(t_0 < t_1 < \cdots < t_k\). Continuously increasing stress is also considered. Failure occurs the first time W(y) crosses a critical boundary ω. The distribution of time to failure for the models can be represented in terms of time-transformed inverse Gaussian distribution functions, and the parameters in models for experiments with censored data can be estimated using maximum likelihood methods. A common approach to the modeling of failure times for experimental units subject to increased stress at certain stress change points is to assume that the failure times follow a distribution that consists of segments of Weibull distributions with the same shape parameter. Our Wiener-process approach gives an alternative flexible class of time-transformed inverse Gaussian models in which time to failure is modeled in terms of accumulated decay reaching a critical level and in which parametric functions are used to express how higher stresses accelerate the rate of decay and the time to failure. Key parameters such as mean life under normal stress, quantiles of the normal stress distribution, and decay rate under normal and accelerated stress appear naturally in the model. A variety of possible parameterizations of the decay rate leads to flexible modeling. Model fit can be checked by percentage-percentage plots.
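The inverse Gaussian connection invoked here is the standard first-passage result (stated generically): if accumulated decay follows a Wiener process with drift \(\nu > 0\) and diffusion coefficient \(\sigma^2\), the time at which it first crosses the critical boundary \(\omega\) is distributed as

\[
T \sim \mathrm{IG}\!\left(\frac{\omega}{\nu},\; \frac{\omega^{2}}{\sigma^{2}}\right)
\]

in the mean/shape parameterization; stress-dependent decay rates then enter through the drift, which is what makes the time-transformed inverse Gaussian family natural for variable-stress data.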

18.
In many industrial applications, it is not always feasible to continuously monitor life testing experiments to collect lifetime data. Moreover, intermediate removals of test units from the experiment are sometimes essential. Progressive Type-I interval censoring schemes are useful in these scenarios. Optimal planning of such schemes is an important issue for the experimenter, as optimal plans can achieve the desired objectives using far fewer resources. This article provides Bayesian D-optimal progressive Type-I interval censoring schemes, assuming that the lifetime follows a log-normal distribution. An algorithm is provided to find the optimal censoring schemes and the number of inspections. The algorithm is then used to obtain the optimal Bayesian progressive Type-I interval censoring schemes in two different contexts. The resulting optimal Bayesian censoring schemes are compared with the corresponding locally optimal censoring schemes. A detailed sensitivity analysis is performed to investigate the effect of prior information. The sampling variation associated with the optimal censoring schemes is visualized through a simulation study.

19.
Maximum likelihood estimation (MLE) is a frequently used method for estimating distribution parameters in constant-stress partially accelerated life tests (CS-PALTs). However, using the MLE to estimate the parameters of a Weibull distribution may be problematic in CS-PALTs. First, the equation for the shape parameter estimator derived from the log-likelihood function is nonlinear and therefore difficult to solve. Second, the sample size in life tests is typically not large, and the MLE, a typical large-sample inference method, may be unsuitable. In addition, test items unsuited to the stress conditions may become early failures with extremely short lifetimes, and these early failures can bias the parameter estimates. To address early failures for the Weibull distribution in CS-PALTs, we propose an M-estimation method based on a Weibull probability plot (WPP) framework, which leads to a closed-form expression for the shape parameter estimator. We conducted a simulation study to compare the M-estimation method with the MLE method. The results show that, with early-failure samples, the M-estimation method performs better than the MLE. Copyright © 2015 John Wiley & Sons, Ltd.
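The WPP linearization that makes such closed-form estimators possible can be stated generically (the paper's particular M-estimator is built on top of it): for a Weibull distribution with scale \(\eta\) and shape \(\beta\),

\[
\ln\!\left[-\ln\left(1 - F(t)\right)\right] = \beta \ln t - \beta \ln \eta,
\]

so plotting \(y = \ln[-\ln(1 - \hat{F}(t_{(i)}))]\) against \(x = \ln t_{(i)}\) gives a straight line with slope \(\beta\); replacing ordinary least squares on these coordinates with a robust M-type loss down-weights early failures while still yielding an explicit slope (shape) estimate.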

20.
Quantification of the enhancement in cleavage fracture toughness of ferritic steels following warm pre-stressing (WPS) has received great interest in light of its significance in the integrity assessment of structures such as pressure vessels. A Beremin-type probability distribution model, i.e., a local stress-based approach to cleavage fracture, has been developed and used for estimating cleavage fracture following prior loading (warm pre-stressing) in two ferritic steels with different geometry configurations. First, the Weibull parameters required to match the experimental scatter in lower-shelf toughness of the candidate steels are identified. These parameters are then used in two- and three-dimensional finite element simulations of prior loading on the upper shelf, followed by unloading and cooling to lower-shelf temperatures (WPS), to determine the probability of failure. Using both isotropic hardening and kinematic hardening material models, the effect of the hardening response on the predictions obtained from the suggested approach has been examined. The predictions are consistent with the experimental scatter in toughness following WPS and provide a means of determining the importance of crack tip residual stresses. We demonstrate that, for our steels, the crack tip residual stress is the pivotal feature in improving the fracture toughness following WPS. Predictions are compared with the available experimental data. The paper finally discusses the results in the context of the non-uniqueness of the Weibull parameters and investigates the sensitivity of the predictions to the Weibull exponent, m, and the relevance of m to the stress triaxiality factor as suggested in the literature.
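For reference, the Beremin local approach referred to above is conventionally written as (generic form; the paper's calibration details are specific to its steels)

\[
P_f = 1 - \exp\!\left[-\left(\frac{\sigma_w}{\sigma_u}\right)^{m}\right],
\qquad
\sigma_w = \left[\frac{1}{V_0}\int_{V_p} \sigma_1^{\,m}\, \mathrm{d}V\right]^{1/m},
\]

where the Weibull stress \(\sigma_w\) is computed from the maximum principal stress \(\sigma_1\) over the plastically deformed volume \(V_p\), \(V_0\) is a reference volume, and \(m\) and \(\sigma_u\) are the Weibull exponent and scale parameter calibrated against the lower-shelf toughness scatter; compressive crack tip residual stresses left by WPS reduce \(\sigma_1\) near the tip and hence \(\sigma_w\), which is how such a model captures the toughness enhancement.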
