Similar Documents
20 similar documents found.
1.
In the reliability-based design of engineering systems, it is often necessary to evaluate the failure probability for different values of the distribution parameters involved in the specification of the design configuration. The failure probability as a function of the distribution parameters is referred to as the 'failure probability function (FPF)' in this work. From first principles, this problem requires repeated reliability analyses to estimate the failure probability for different distribution parameter values, which is a computationally expensive task. A 'weighted approach' is proposed in this work to evaluate the FPF locally and efficiently by means of a single simulation. The basic idea is to rewrite the failure probability estimate for a given set of random samples as a function of the distribution parameters. It is shown that the FPF can be written as a weighted sum of sample values. The latter must be evaluated by system analysis (the most time-consuming task), but they do not depend on the distribution parameters. Direct Monte Carlo simulation, importance sampling and Subset Simulation are all accommodated under the proposed approach. Examples are given to illustrate their application.
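A minimal sketch of the weighted idea for the direct Monte Carlo case, under assumptions not taken from the paper: a scalar Gaussian variable whose mean is the distribution parameter of interest, and an invented linear limit state. One set of samples (and hence one batch of system analyses) is reweighted by a likelihood ratio to evaluate the FPF at any nearby parameter value.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Illustrative limit state: failure when g(x) <= 0, i.e. x >= 2.5.
g = lambda x: 2.5 - x

# One set of samples from the density at a reference parameter mu0;
# the expensive system analyses happen only once, here.
mu0, sigma = 0.0, 1.0
x = rng.normal(mu0, sigma, size=200_000)
fail = g(x) <= 0

# FPF: reweight the SAME samples for any nearby parameter value mu.
def fpf(mu):
    w = stats.norm.pdf(x, mu, sigma) / stats.norm.pdf(x, mu0, sigma)
    return np.mean(fail * w)

for mu in (0.0, 0.5, 1.0):
    print(f"mu={mu:.1f}  P_F ~ {fpf(mu):.2e}"
          f"  (exact {stats.norm.sf(2.5, mu, sigma):.2e})")
```

The weights involve only the two densities, so sweeping the parameter costs no further limit-state evaluations; accuracy degrades as mu moves far from mu0, which is why the paper describes the evaluation as local.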

2.
An efficient strategy to approximate the failure probability function in structural reliability problems is proposed. The failure probability function (FPF) is defined as the failure probability of the structure expressed as a function of the design parameters, which in this study are taken to be distribution parameters of random variables representing uncertain model quantities. Determining the FPF is usually numerically demanding, since repeated reliability analyses are required. The proposed strategy is based on the concept of augmented reliability analysis, which requires only a single run of a simulation-based reliability method. This paper introduces a new sample regeneration algorithm that generates the required failure samples of the design parameters without any additional evaluation of the structural response. In this way, efficiency is further improved while high accuracy in the estimation of the FPF is maintained. To illustrate the efficiency and effectiveness of the method, case studies involving a turbine disk and an aircraft inner flap are included.
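The augmented idea can be sketched as follows, with assumptions that are mine rather than the paper's: a uniform artificial prior over a single design parameter (the mean of a Gaussian variable), an invented threshold limit state, and a kernel density estimate standing in for the paper's regeneration algorithm. Bayes' rule then turns one simulation into the whole FPF, via P(F | theta) = P(F) p(theta | F) / p(theta).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Augmented space: the design parameter mu gets an artificial uniform
# prior on [0, 2]; conditional on mu, x ~ N(mu, 1).  Failure: x >= 2.5.
N = 200_000
mu = rng.uniform(0.0, 2.0, size=N)
x = rng.normal(mu, 1.0)
fail = x >= 2.5

# A single simulation yields the total P(F) and the failure samples of mu.
P_F = fail.mean()
p_mu_given_F = stats.gaussian_kde(mu[fail])   # stand-in for regeneration

# Bayes: P(F | mu) = P(F) * p(mu | F) / p(mu), with p(mu) = 1/2 on [0, 2].
def fpf(m):
    return P_F * p_mu_given_F(m)[0] / 0.5

for m in (0.5, 1.0, 1.5):   # interior points avoid KDE boundary bias
    print(f"mu={m:.1f}  P_F ~ {fpf(m):.3e}  (exact {stats.norm.sf(2.5 - m):.3e})")
```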

3.
This paper develops a methodology to integrate reliability testing and computational reliability analysis for product development. The presence of information uncertainty, such as statistical uncertainty and modeling error, is incorporated. The integration of testing and computation leads to a more cost-efficient estimation of failure probability and life distribution than the tests-only approach currently followed by industry. A Bayesian procedure is proposed to quantify the modeling uncertainty using random parameters, including the uncertainty in mechanical and statistical model selection and the uncertainty in distribution parameters. An adaptive method is developed to determine the number of tests needed to achieve a desired confidence level in the reliability estimates, by combining prior computational prediction and test data. Two kinds of tests, failure probability estimation and life estimation, are considered. The prior distribution and confidence interval of failure probability in both cases are estimated using computational reliability methods, and are updated using the results of tests performed during the product development phase.
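A toy version of the adaptive test-planning step, with every number invented for illustration: the computational prediction is encoded as a Beta prior on the failure probability, each pass/fail test updates it conjugately, and testing stops once the posterior meets a stated confidence requirement.

```python
from scipy import stats

# Prior on the failure probability from the computational prediction
# (assumed numbers): Beta(1, 19), prior mean 0.05.
a, b = 1.0, 19.0
p_max, conf = 0.10, 0.95   # requirement: P(p_f < 0.10) >= 0.95

# Hypothetical test outcomes (0 = success, 1 = failure).
for n, outcome in enumerate([0] * 12, start=1):
    a, b = a + outcome, b + 1 - outcome      # conjugate Beta update
    c = stats.beta.cdf(p_max, a, b)          # confidence in the requirement
    print(f"test {n:2d}: posterior Beta({a:.0f},{b:.0f}), "
          f"P(p_f < {p_max}) = {c:.3f}")
    if c >= conf:
        print(f"confidence target reached after {n} tests")
        break
```

With this prior and an all-success sequence, the requirement is met after the tenth test; a failure would lower the confidence and extend the campaign, which is exactly the adaptive behavior described.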

4.
Traditionally, reliability-based design optimization (RBDO) is formulated as a nested optimization problem, in which the objective is to minimize a cost function while satisfying the reliability constraints. The reliability constraints are usually formulated as constraints on the probability of failure corresponding to each of the failure modes, or as a single constraint on the system probability of failure. The probability of failure is usually estimated by performing a reliability analysis. The difficulty in evaluating reliability constraints comes from the fact that modern reliability analysis methods are themselves formulated as optimization problems. Solving such nested optimization problems is extremely expensive for large-scale multidisciplinary systems, which are themselves computationally intensive. In this research, a framework for performing reliability-based multidisciplinary design optimization using approximations is developed. Response surface approximations (RSAs) of the limit state functions are used to estimate the probability of failure, and an outer loop is incorporated to ensure that the approximate RBDO converges to the actual most probable point of failure. The framework is compared with the exact RBDO procedure. The RSAs significantly reduce the computational expense associated with traditional RBDO. The proposed approach is applied to multidisciplinary test problems, and the computational savings and benefits are discussed.
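The reliability-constraint evaluation can be sketched in isolation. Below, an invented quadratic limit state stands in for an expensive multidisciplinary analysis: it is sampled at a small design-of-experiments grid, a quadratic response surface is fitted (here the fit is exact, since the true function is itself quadratic), and the failure probability is estimated by Monte Carlo on the cheap surrogate. A real RBDO run would rebuild the surface around the current most probable point inside the optimization loop, as the outer loop in the paper does.

```python
import numpy as np

rng = np.random.default_rng(2)

# "Expensive" limit state (stands in for a multidisciplinary analysis);
# failure when g <= 0.
def g_true(x1, x2):
    return 3.0 - x1**2 / 4.0 - x2

# Fit a quadratic response surface from a small 3x3 experimental design.
pts = np.array([[a, b] for a in (-2, 0, 2) for b in (-2, 0, 2)], float)
A = np.column_stack([np.ones(len(pts)), pts[:, 0], pts[:, 1],
                     pts[:, 0]**2, pts[:, 1]**2, pts[:, 0] * pts[:, 1]])
coef, *_ = np.linalg.lstsq(A, g_true(pts[:, 0], pts[:, 1]), rcond=None)

def g_rsa(x1, x2):
    return (coef[0] + coef[1]*x1 + coef[2]*x2
            + coef[3]*x1**2 + coef[4]*x2**2 + coef[5]*x1*x2)

# Monte Carlo on the cheap surrogate (standard normal inputs assumed).
x = rng.standard_normal((1_000_000, 2))
print("P_f (RSA) :", np.mean(g_rsa(x[:, 0], x[:, 1]) <= 0))
print("P_f (true):", np.mean(g_true(x[:, 0], x[:, 1]) <= 0))
```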

5.
Reliability sensitivity, considered an essential component in engineering design under uncertainty, is often of critical importance for understanding the physical mechanisms underlying failure and for modifying the design to mitigate and manage risk. This paper presents a new computational tool for predicting the reliability (failure probability) and reliability sensitivity of mechanical or structural systems subject to random uncertainties in loads, material properties, and geometry. The dimension reduction method is applied to compute response moments and their sensitivities with respect to the distribution parameters (e.g., shape and scale parameters, mean, and standard deviation) of the basic random variables. Saddlepoint approximations with truncated cumulant generating functions are employed to estimate the failure probability, probability density functions, and cumulative distribution functions. A rigorous analytic derivation of the sensitivities of the failure probability with respect to the distribution parameters of the basic random variables is presented. Results of six numerical examples involving hypothetical mathematical functions and solid mechanics problems indicate that the proposed approach provides accurate, convergent, and computationally efficient estimates of the failure probability and reliability sensitivity.
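The saddlepoint step can be illustrated on its own. The sketch below assumes the first four cumulants of the response are available (here taken from a Gamma(4, 1) distribution so the answer can be checked exactly), truncates the cumulant generating function after the fourth term, and applies the Lugannani-Rice tail formula; none of the numbers come from the paper.

```python
import numpy as np
from scipy import stats, optimize

# First four cumulants of the response Y, here those of Gamma(4, 1),
# i.e. kappa_n = 4 * (n-1)!, so the result has an exact reference value.
k1, k2, k3, k4 = 4.0, 4.0, 8.0, 24.0

K   = lambda t: k1*t + k2*t**2/2 + k3*t**3/6 + k4*t**4/24  # truncated CGF
Kp  = lambda t: k1 + k2*t + k3*t**2/2 + k4*t**3/6          # K'(t)
Kpp = lambda t: k2 + k3*t + k4*t**2/2                      # K''(t)

def tail_prob(y):
    """Lugannani-Rice saddlepoint approximation of P(Y > y)."""
    t = optimize.brentq(lambda t: Kp(t) - y, -1.0, 2.0)    # saddlepoint
    w = np.sign(t) * np.sqrt(2.0 * (t * y - K(t)))
    r = t * np.sqrt(Kpp(t))
    return stats.norm.sf(w) + stats.norm.pdf(w) * (1.0/r - 1.0/w)

y = 10.0   # failure threshold on the response
print("saddlepoint:", tail_prob(y))           # ~1.00e-2
print("exact      :", stats.gamma.sf(y, 4))   # ~1.03e-2
```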

6.
A novel subset simulation algorithm, called parallel subset simulation, is proposed to estimate small failure probabilities of multiple limit states with only a single subset simulation analysis. As is well known, crude Monte Carlo simulation is inefficient at estimating small probabilities but is applicable to multiple limit states, while ordinary subset simulation is efficient at estimating small probabilities but can only handle a single limit state. The proposed stochastic simulation approach combines the advantages of the two methods: it is both efficient for small probabilities and applicable to multiple limit states. The key idea is to introduce a 'principal variable' that is correlated with all performance functions. The failure probabilities of all limit states can therefore be evaluated simultaneously while the subset simulation algorithm generates samples of the principal variable. The statistical properties of the failure probability estimators are also derived. Two examples are presented to demonstrate the effectiveness of the new approach and to compare it with crude Monte Carlo and ordinary subset simulation.
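For reference, here is a compact sketch of the ordinary subset simulation that the parallel variant builds on, for a single limit state with independent standard normal inputs. The limit state, proposal spread, and conditional probability p0 = 0.1 are illustrative choices; the paper's extension would additionally record, through the principal variable, where each sample falls relative to every other limit state.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative performance function: failure when g(x) <= 0; the exact
# answer is Phi(-3.5) ~ 2.33e-4 for i.i.d. standard normal inputs.
def g(x):
    return 3.5 - (x[..., 0] + x[..., 1]) / np.sqrt(2.0)

N, p0, dim = 2000, 0.1, 2            # samples per level, level probability
x = rng.standard_normal((N, dim))
y = g(x)
P = 1.0
for level in range(10):
    idx = np.argsort(y)              # smallest g = closest to failure
    n_seed = int(p0 * N)
    thresh = y[idx[n_seed - 1]]      # adaptive intermediate threshold
    if thresh <= 0:                  # final level reached
        P *= np.mean(y <= 0)
        break
    P *= p0
    seeds = x[idx[:n_seed]]
    # Metropolis in standard normal space: grow each seed into a chain
    # of length 1/p0, staying inside the current subset {g <= thresh}.
    chains, vals = [seeds], [y[idx[:n_seed]]]
    for _ in range(int(1 / p0) - 1):
        cand = chains[-1] + 0.8 * rng.standard_normal(seeds.shape)
        accept = rng.random((n_seed, 1)) < np.exp(
            0.5 * (np.sum(chains[-1]**2, 1, keepdims=True)
                   - np.sum(cand**2, 1, keepdims=True)))
        prop = np.where(accept, cand, chains[-1])
        gv = g(prop)
        ok = (gv <= thresh)[:, None]             # reject moves that leave F_j
        chains.append(np.where(ok, prop, chains[-1]))
        vals.append(np.where(ok[:, 0], gv, vals[-1]))
    x = np.vstack(chains)
    y = np.hstack(vals)
print("subset simulation P_f ~", P, " (exact ~ 2.33e-4)")
```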

7.
Reliability-based design of a system often requires minimizing the probability of system failure over the admissible space of the design variables. For complex systems this probability can rarely be evaluated analytically, so it is often calculated using stochastic simulation techniques, which involve an unavoidable estimation error and significant computational cost. These features make efficient reliability-based optimal design a challenging task. A new method called Stochastic Subset Optimization (SSO) is proposed here for iteratively identifying sub-regions of the original design space that contain the optimal design variables. An augmented reliability problem is formulated in which the design variables are artificially treated as uncertain, and Markov chain Monte Carlo techniques are implemented to simulate samples of them that lead to system failure. In each iteration, a set with high likelihood of containing the optimal design parameters is identified using a single reliability analysis. Statistical properties of the identification, and stopping criteria for the iterative approach, are discussed. For problems characterized by small sensitivity around the optimal design choice, a combination of SSO with other optimization algorithms is proposed for enhanced overall efficiency.

8.
In this paper, we model embedded system design and optimization, considering component redundancy and uncertainty in the component reliability estimates. The systems studied consist of software embedded in associated hardware components. Very often, component reliability values are not known exactly, so for reliability analysis and system optimization it is meaningful to treat component reliability estimates as random variables with associated estimation uncertainty. The system design process is formulated as a multiple-objective optimization problem: to maximize an estimate of system reliability and to minimize the variance of that estimate. The two objectives are combined by penalizing the variance of prospective solutions. The two most common fault-tolerant embedded system architectures, N-Version Programming and Recovery Block, are considered as strategies to improve system reliability by providing redundancy. Four distinct models are presented to demonstrate the proposed optimization techniques with and without redundancy. For many design problems, multiple functionally equivalent software versions exhibit failure correlation even when they have been independently developed; the correlation may result from faults in the software specification, faults in a voting algorithm, and/or related faults in any two software versions. Our approach accounts for this correlation in formulating practical optimization models. Genetic algorithms with a dynamic penalty function are applied to solve the optimization problem, and the results obtained are discussed.

9.
This paper focuses on reliability prediction of a composite structure under hygro-thermo-mechanical loading, with failure governed by the Tsai-Wu criterion, where the Monte Carlo method is used to estimate the failure probability (Pf). The model was developed in two steps: first, a deterministic model based on an analytical and numerical approach, and then a probabilistic computation. Using the hoop stress in each ply, a sensitivity analysis was performed for random design variables, such as material properties, geometry, manufacturing, and loading, with respect to the reliability of the composite cylindrical structure. The probabilistic results show a very high increase in failure probability when all parameters are considered.

10.
Probability of infancy problems for space launch vehicles
This paper addresses the treatment of 'infancy problems' in the reliability analysis of space launch systems. To that effect, we analyze the probability of failure of launch vehicles in their first five launches. We present methods and results based on a combination of Bayesian probability and frequentist statistics designed to estimate a system's reliability before a large number of launches has been realized. We show that while both approaches are useful, the Bayesian method is particularly valuable when the experience base is small (i.e. for a new rocket). We define reliability as the probability of success based on a binary failure/no-failure event. We conclude that the mean failure rates appear to be higher in the first and second flights (≈1/3 and 1/4, respectively) than in subsequent ones (third, fourth and fifth), and the Bayesian methods do suggest that there is some difference in launch risk over the first five launches. Yet, based on a classical frequentist analysis, we find that for these first few flights the differences in mean failure rates over successive launches, or over successive generations of vehicles, are not statistically significant (i.e. do not meet a 95% confidence level). This is because the frequentist analysis is based on a fixed confidence level (here 95%), whereas the Bayesian one allows more flexibility: it yields a full probability density for the failure rate and therefore permits better interpretation of the information contained in a small sample. The approach also gives more insight into the considerable uncertainty in failure rate estimates based on small sample sizes.
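The contrast between the two analyses can be reproduced on a toy data set. All counts below are invented for illustration, not the paper's launch record: a uniform Beta prior gives a full posterior for each flight's failure rate, while a Fisher exact test of the same counts finds no significant difference.

```python
from scipy import stats

# Hypothetical launch record (counts invented; the paper reports mean
# failure rates near 1/3 for first flights and lower afterwards).
data = {"first flights": (4, 12), "third flights": (1, 12)}  # (failures, n)

for label, (k, n) in data.items():
    # Bayesian: uniform Beta(1, 1) prior -> Beta(1 + k, 1 + n - k) posterior.
    post = stats.beta(1 + k, 1 + n - k)
    lo, hi = post.ppf([0.05, 0.95])
    print(f"{label}: posterior mean {post.mean():.3f}, "
          f"90% credible interval ({lo:.3f}, {hi:.3f})")

# Frequentist: Fisher exact test of equal failure rates (for these counts
# p ~ 0.32, i.e. not significant at the 95% confidence level).
(k1, n1), (k2, n2) = data.values()
_, p_value = stats.fisher_exact([[k1, n1 - k1], [k2, n2 - k2]])
print(f"Fisher exact p-value = {p_value:.3f}")
```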

11.
This article presents a new class of computational methods, known as dimensional decomposition methods, for calculating stochastic sensitivities of mechanical systems with respect to probability distribution parameters. These methods involve a hierarchical decomposition of a multivariate response function in terms of variables with increasing dimensions, together with score functions associated with the probability distribution of the random input. The proposed decomposition facilitates univariate and bivariate approximations of stochastic sensitivity measures, lower-dimensional numerical integration or Lagrange interpolation, and Monte Carlo simulation. Both the probabilistic response and its sensitivities can be estimated from a single stochastic analysis, without requiring gradients of the performance function. Numerical results indicate that the decomposition methods provide accurate and computationally efficient estimates of the sensitivities of statistical moments or reliability, supporting the stochastic design of mechanical systems. Future work includes extending these decomposition methods to account for performance function parameters in the sensitivity analysis.
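The score-function idea behind the "single stochastic analysis" claim is easy to demonstrate in one dimension. The sketch below assumes a Gaussian input and a simple threshold limit state (both invented): the same Monte Carlo samples that estimate the failure probability also estimate its derivative with respect to the mean, since dP_F/dmu = E[I_F(X) * d ln f(X; mu)/dmu] and no performance-function gradient appears.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# x ~ N(mu, sigma^2); failure when x >= y (illustrative limit state).
mu, sigma, y = 0.0, 1.0, 3.0
x = rng.normal(mu, sigma, size=2_000_000)
I = (x >= y).astype(float)           # failure indicator

# Score function of the normal density w.r.t. its mean: d ln f / d mu.
score = (x - mu) / sigma**2

P_F = I.mean()                       # probabilistic response
dP_dmu = np.mean(I * score)          # sensitivity from the SAME samples

print(f"P_F       ~ {P_F:.3e}   (exact {stats.norm.sf(y, mu, sigma):.3e})")
print(f"dP_F/dmu  ~ {dP_dmu:.3e}  (exact {stats.norm.pdf(y, mu, sigma):.3e})")
```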

12.
An iterative method for estimating the failure probability in certain time-variant reliability problems has been developed. The paper focuses on the displacement response of a linear oscillator driven by white noise, where failure is assumed to occur when the displacement response exceeds a critical threshold. The iteration procedure is a two-step method. In the first iteration, a simple control function promoting failure is constructed using the design point weighting principle. After time discretization, two points are chosen to construct a compound deterministic control function: the time point at which the first maximum of the homogeneous solution occurs, and the end point of the considered time interval. An importance sampling technique is used to estimate the failure probability functional on a set of initial values of the state space variables and time. In the second iteration, the concept of an optimal control function is implemented to construct a Markov control, which gives much better accuracy in the failure probability estimate than the simple control function. In both iterations, the change of probability measure by the Girsanov transformation is utilized. As a result, the CPU time is substantially reduced compared with the crude Monte Carlo procedure. This paper is dedicated to Prof R N Iyengar of the Indian Institute of Science on the occasion of his formal retirement.

13.
End-of-life (EoL) tests typically consume considerable resources, and accelerated EoL tests aim to minimize the required time and budget. Typically, a sample's failure behavior is described by lifetime models, such as the Arrhenius model, that assume constant stresses. Such models are fitted to the experimental data and then used to estimate reliability in the field. Real stress profiles, however, are time-dependent. Despite the effort spent parameterizing lifetime models, distribution functions are often assumed for the components that ignore the influence of the applied stress, so calculated statistical parameters such as the reliability are inevitably inaccurate. Furthermore, reliability is often determined from limited sample sizes; the prediction is therefore subject to uncertainty and should be specified together with a confidence interval.

This paper presents a new approach that enables the prediction of operative reliability under time-dependent stresses together with a confidence level. Based on the presented method, the operative reliability and its confidence interval can be estimated for transient stresses. This fundamental work uses simulated data to verify the methodology. Two methods for calculating the operative reliability function under time-dependent stresses are explained: the cumulative exposure model and the model of age. The most relevant methods for determining a confidence level are introduced briefly. Finally, it is shown how the new method for calculating the operative reliability and the associated confidence intervals for lifetime models is derived. The functionality of the new method, the Dubi Bootstrap Simulation (DUBS), is demonstrated by an example. The validation of the new approach is presented in a separate article using a practical example.
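The bootstrap half of the idea can be sketched independently of the stress modeling. Below, an invented EoL test of 15 failure times is fitted with a Weibull model, and a parametric bootstrap yields a confidence interval for the mission reliability; the paper's DUBS method would additionally propagate a time-dependent stress profile through the cumulative exposure model before this step.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Small simulated EoL test: n failure times from a Weibull (assumed
# true shape 2, scale 100; units arbitrary).
n, beta_true, eta_true = 15, 2.0, 100.0
t_fail = eta_true * rng.weibull(beta_true, size=n)

def fit_weibull(t):
    shape, _, scale = stats.weibull_min.fit(t, floc=0.0)
    return shape, scale

beta_hat, eta_hat = fit_weibull(t_fail)
t0 = 50.0                                        # mission time
R_hat = np.exp(-(t0 / eta_hat) ** beta_hat)      # Weibull reliability

# Parametric bootstrap: refit on synthetic samples from the fitted model.
R_boot = []
for _ in range(1000):
    t_syn = eta_hat * rng.weibull(beta_hat, size=n)
    b, e = fit_weibull(t_syn)
    R_boot.append(np.exp(-(t0 / e) ** b))
lo, hi = np.percentile(R_boot, [5, 95])
print(f"R({t0:g}) ~ {R_hat:.3f}, 90% bootstrap interval ({lo:.3f}, {hi:.3f})")
```

The wide interval reflects the small sample size, which is precisely why the paper insists that reliability predictions from limited samples be reported with a confidence interval.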

14.
Probabilistic sensitivities provide important insight in reliability analysis and are often crucial for understanding the physical behaviour underlying failure and for modifying the design to mitigate and manage risk. This article presents a new computational approach for calculating stochastic sensitivities of mechanical systems with respect to the distribution parameters of random variables. The method combines high-dimensional model representation with score functions associated with the probability distribution of the random input. The proposed approach facilitates first- and second-order approximation of stochastic sensitivity measures and statistical simulation. The formulation is general, in that any simulation method, such as Monte Carlo, importance sampling or Latin hypercube sampling, can be used for the computation. Both the probabilistic response and its sensitivities can be estimated from a single probabilistic analysis, without requiring gradients of the performance function. Numerical results indicate that the proposed method provides accurate and computationally efficient estimates of the sensitivities of statistical moments or the reliability of a structural system.

15.
This paper describes a method for estimating and forecasting reliability from attribute data, using the binomial model, when reliability requirements are very high and test data are limited. Integer data, specifically numbers of failures, are converted into non-integer data. The rationale is that when engineering corrective action for a failure is implemented, the probability of recurrence of that failure is reduced; such failures should therefore not be carried as full failures in subsequent reliability estimates. The reduced failure value for each failure mode is the upper limit on the probability of failure, based on the number of successes observed after the corrective action has been implemented. Each failure value is less than one and diminishes as test programme successes continue. These values replace the integer failure counts in the binomial estimate. The method was applied to attribute data from the life history of a previously tested system, and a reliability growth equation was fitted. It was then calibrated against a current similar system's ultimate reliability requirements to provide a model of reliability growth over its entire life-cycle. The forecast was then obtained by extrapolation, comparing current reliability estimates with the expected value computed from the model.
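A small numerical reading of the fractional-failure idea, with every quantity invented: three failure modes are each followed by a run of failure-free tests after their corrective actions, and each integer failure is replaced by an upper confidence bound on its recurrence probability. The 50% confidence level and the particular bound formula are assumptions for illustration, not values taken from the paper.

```python
import numpy as np

# Each failure mode: number of successful tests observed after its
# corrective action was implemented (hypothetical campaign data).
successes_after_fix = [3, 7, 12]
n_tests = 30                      # total binomial trials in the campaign

# Reduced failure value: upper confidence limit on the probability of
# recurrence given s subsequent failure-free trials, from
# (1 - p_u)**s = 1 - conf.  The 50% level is an assumption.
conf = 0.50
reduced = [1.0 - (1.0 - conf) ** (1.0 / s) for s in successes_after_fix]

eff_failures = sum(reduced)       # replaces the integer count of 3 failures
R_conventional = 1.0 - 3 / n_tests
R_adjusted = 1.0 - eff_failures / n_tests
print("reduced failure values:", np.round(reduced, 3))
print(f"conventional R = {R_conventional:.3f}, adjusted R = {R_adjusted:.3f}")
```

Each reduced value is below one and shrinks as the post-fix success run grows, matching the qualitative behavior the abstract describes.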

16.
In this research, a new method is proposed for updating real-time reliability based on data recorded by instruments and sensors installed on a system. The method is founded on Bayesian analysis and subset simulation, and is capable of estimating the functional relationship between the real-time failure probability and the monitored value. It is shown that as long as the monitoring data can be reasonably reduced to a single index, this relationship can be obtained; moreover, it can be obtained prior to the monitoring process. Three examples of civil engineering systems are used to demonstrate the new method, which may be applied to the safety monitoring of civil systems under construction as well as of existing civil systems.

17.
A new reliability measure is proposed, and equations are derived that determine the probability of existence of a specified set of minimum gaps between random variables following a homogeneous Poisson process in a finite interval. Using the derived equations, a method is proposed for specifying the upper bound of the random variables' number density which guarantees that the probability of clustering of two or more random variables in a finite interval remains below a maximum acceptable level. It is demonstrated that even for moderate number densities the probability of clustering is substantial and should not be neglected in reliability calculations.

In the important special case where the random variables are failure times, models are proposed for determining the upper bound of the hazard rate which guarantees a set of minimum failure-free operating intervals before the random failures, with a specified probability. A model is also proposed for determining the upper bound of the hazard rate which guarantees a minimum availability target. Using the proposed models, a new strategy, models and reliability tools have been developed for setting quantitative reliability requirements; these consist of determining the intersection of the hazard rate envelopes (hazard rate upper bounds) which deliver a minimum failure-free operating period before random failures, a risk of premature failure below a maximum acceptable level, and a minimum required availability. It is demonstrated that setting reliability requirements solely on the basis of an availability target does not necessarily imply a low risk of premature failure: even at a high availability level, the probability of premature failure can be substantial. For industries characterised by a high cost of failure, the reliability requirements should therefore include a hazard rate envelope limiting the risk of failure to below a maximum acceptable level.
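A quick Monte Carlo illustration of the clustering claim, with invented numbers: events arrive as a homogeneous Poisson process on a finite interval, and even at a modest mean count the probability that some pair falls closer than a required minimum gap is far from negligible. The conditional check uses the standard order-statistics result P(all gaps >= s | n events) = (1 - (n-1)s/L)^n for (n-1)s <= L, not the paper's own derivation.

```python
import numpy as np

rng = np.random.default_rng(6)

# Homogeneous Poisson process with rate lam on [0, L]; "clustering" means
# at least two events closer than the minimum required gap s.
lam, L, s, trials = 0.05, 100.0, 5.0, 100_000

clustered = 0
for _ in range(trials):
    n = rng.poisson(lam * L)
    if n >= 2:
        t = np.sort(rng.uniform(0.0, L, size=n))
        if np.min(np.diff(t)) < s:
            clustered += 1
print(f"P(clustering) ~ {clustered / trials:.4f}")   # substantial, ~0.6

# Conditional check against the closed form for a fixed number of events.
n = 5
print("closed form P(no clustering | n=5):", (1 - (n - 1) * s / L) ** n)
```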

18.
Reliability-based robust design optimization (RBRDO) is a crucial tool for life-cycle quality improvement, and the Gaussian process (GP) model is an effective modeling technique widely used in robust parameter design. However, few studies have addressed reliability-based design problems using GP models. This article proposes a novel life-cycle RBRDO approach that accounts for response uncertainty within the GP modeling framework. First, the hyperparameters of the GP model are estimated using a Gibbs sampling procedure. Second, the expected partial derivative expression is derived from the GP model. A novel failure risk cost function is then constructed to assess life-cycle reliability, and a quality loss function and a confidence interval are constructed from simulated outputs to evaluate the robustness of the optimal settings and the response uncertainty, respectively. Finally, an optimization model integrating the failure risk cost function, the quality loss function, and the confidence interval analysis is constructed to find reasonable optimal input settings. Two case studies illustrate the performance of the proposed approach. The results show that the approach achieves better trade-offs between quality characteristics and reliability requirements by considering response uncertainty.

19.
Despite many advances in the field of computational system reliability analysis, estimating the joint probability distribution of correlated non-normal state variables from incomplete statistical data poses great challenges for engineers. To avoid multidimensional integration, system reliability estimation usually requires the calculation of marginal failure probabilities and joint failure probabilities. This article proposes an integrated approach for estimating system reliability based on the high-order moment method, saddlepoint approximation, and copulas. First, statistical moment estimation based on stochastic perturbation theory is presented. Then, by constructing the cumulant generating function (CGF) of the state variable from its first four statistical moments, a fourth-moment saddlepoint approximation method is established for component reliability estimation. Next, copula theory is briefly introduced and widely used two-dimensional copulas are presented; the best-fitting copula for estimating the probability of system failure is selected according to the Akaike Information Criterion (AIC). Finally, the derived method is applied to three numerical examples for comprehensive validation.
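The copula step can be isolated in a two-component toy problem, with all numbers invented: given two marginal failure probabilities and a Gaussian copula linking the component state variables, the joint failure probability is the bivariate normal CDF evaluated at the transformed marginals. The paper would instead select the copula family from data by AIC; the Gaussian choice here is only for illustration.

```python
import numpy as np
from scipy import stats

# Marginal failure probabilities of two components and the dependence
# between their state variables, modeled with a Gaussian copula.
p1, p2, rho = 1e-3, 5e-4, 0.6

# Joint failure probability via the bivariate normal CDF at the
# transformed marginals: C(p1, p2) = Phi2(z1, z2; rho).
z = stats.norm.ppf([p1, p2])
p_joint = stats.multivariate_normal.cdf(z, mean=[0, 0],
                                        cov=[[1, rho], [rho, 1]])

p_series = p1 + p2 - p_joint    # series system: either component fails
p_parallel = p_joint            # parallel system: both must fail
print(f"P(joint)    = {p_joint:.3e}")
print(f"P(series)   = {p_series:.3e}")
print(f"P(parallel) = {p_parallel:.3e}")
print(f"independence would give P(joint) = {p1 * p2:.3e}")
```

The comparison with the independence product shows why the dependence model matters: positive correlation inflates the joint failure probability by orders of magnitude, which directly changes the system-level estimate.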

20.
In this paper, a new computational framework based on the topology derivative concept is presented for evaluating stochastic topological sensitivities of complex systems. The proposed framework, designed for high-dimensional random inputs, dovetails a polynomial dimensional decomposition (PDD) of multivariate stochastic response functions with deterministic topology derivatives. On the one hand, it provides analytical expressions for the topology sensitivities of the first three stochastic moments, which are often required in robust topology optimization (RTO); on the other hand, it offers embedded Monte Carlo simulation (MCS) and finite difference formulations to estimate the topology sensitivities of the failure probability for reliability-based topology optimization (RBTO). In both cases, the quantified uncertainties and their topology sensitivities are determined concurrently from a single stochastic analysis. Moreover, an original two-random-variable example is developed, for the first time, to obtain analytical solutions for the topology sensitivity of moments and failure probability. A further 53-dimensional example is constructed to provide analytical solutions for the topology sensitivity of moments and semi-analytical solutions for the topology sensitivity of failure probabilities, verifying the accuracy and efficiency of the proposed method in high-dimensional scenarios. These examples are new and allow researchers to benchmark stochastic topology sensitivities of existing or new algorithms. In addition, it is shown that under certain conditions the proposed method achieves better accuracy for stochastic topology sensitivities than for the stochastic quantities themselves.
