Similar Documents
20 similar documents found (search time: 0 ms)
1.
We present two methods for the estimation of main effects in global sensitivity analysis. The methods adopt Satterthwaite's application of random balance designs in regression problems and extend it to sensitivity analysis of model output for non-linear, non-additive models. Finite as well as infinite ranges for model input factors are allowed. The methods are easier to implement than any other method available for global sensitivity analysis, and they significantly reduce the computational cost of the analysis. We test their performance on different test cases, including an international benchmark on safety assessment for nuclear waste disposal originally carried out by the OECD/NEA.
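The estimator itself is not given in the abstract; purely as an illustration of what a first-order "main effect" measures, the sketch below estimates S1 = Var(E[Y|X1]) / Var(Y) by brute-force double-loop Monte Carlo for a made-up non-additive model (the model function, sample sizes, and seed are assumptions, not the paper's random-balance-design estimator):

```python
import random

random.seed(0)

def model(x1, x2):
    # Made-up non-additive test function, not the paper's benchmark.
    return x1 + x2 ** 2

def main_effect(n_outer=500, n_inner=500):
    """Brute-force estimate of S1 = Var(E[Y|X1]) / Var(Y) for X1, X2 ~ U(0, 1)."""
    cond_means, all_y = [], []
    for _ in range(n_outer):
        x1 = random.random()  # fix X1, average the model over X2
        ys = [model(x1, random.random()) for _ in range(n_inner)]
        cond_means.append(sum(ys) / n_inner)
        all_y.extend(ys)
    mu = sum(all_y) / len(all_y)
    var_y = sum((y - mu) ** 2 for y in all_y) / len(all_y)
    mc = sum(cond_means) / len(cond_means)
    var_cond = sum((m - mc) ** 2 for m in cond_means) / len(cond_means)
    return var_cond / var_y

s1 = main_effect()  # analytic value for this model is (1/12) / (1/12 + 4/45), about 0.48
```

The double loop costs n_outer × n_inner model runs per input factor, which is exactly the expense that random-balance-design estimators are meant to avoid.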

2.
Uncertainty is very important in risk analysis. A natural way to describe this uncertainty is to specify a set of possible values of each unknown quantity (this set is usually an interval), plus any additional information that we may have about the probability of different values within this set. Traditional statistical techniques deal with situations in which we have complete information about the probabilities; in real life, however, we often have only partial information about them. We therefore need methods for handling such partial information in risk analysis. Several such techniques have been proposed, often on a heuristic basis. The main goal of this paper is to provide a justification for a general formalism for handling different types of uncertainty, and to describe a new black-box technique for processing this type of uncertainty.
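As a minimal sketch of the interval part of this description (an assumption of plain interval arithmetic, not the paper's black-box technique), bounds on a derived quantity can be propagated from the sets of possible input values:

```python
def i_add(a, b):
    """Sum of two intervals (lo, hi)."""
    return (a[0] + b[0], a[1] + b[1])

def i_mul(a, b):
    """Product of two intervals: extremes occur at endpoint combinations."""
    products = [x * y for x in a for y in b]
    return (min(products), max(products))

# Example: cost = rate * time + overhead, each factor known only as an interval.
cost = i_add(i_mul((3.0, 4.0), (1.5, 2.0)), (10.0, 12.0))
```

Any partial probability information (e.g. bounds on the mean) then narrows which values inside the resulting interval are plausible.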

3.
For real engineering systems, it is sometimes difficult to obtain sufficient data to estimate the precise values of some parameters in reliability analysis. This kind of uncertainty is called epistemic uncertainty. Because of epistemic uncertainty, the traditional universal generating function (UGF) technique is not appropriate for analyzing the reliability of systems with a performance sharing mechanism. This paper proposes a belief UGF (BUGF)-based method to evaluate the reliability of multi-state series systems with a performance sharing mechanism under epistemic uncertainty. The proposed BUGF-based reliability analysis method is validated by an illustrative example and compared with the interval UGF (IUGF)-based methods using interval arithmetic or affine arithmetic. The illustrative example shows that the proposed BUGF-based method is more efficient than the IUGF-based methods in the reliability analysis of multi-state systems (MSSs) with a performance sharing mechanism under epistemic uncertainty.

4.
The current challenge of nuclear weapon stockpile certification is to assess the reliability of complex, high-consequence, and aging systems without the benefit of full-system test data. In the absence of full-system testing, disparate kinds of information are used to inform certification assessments, such as archival data, experimental data on partial systems, data on related or similar systems, computer models and simulations, and expert knowledge. In some instances, data can be scarce and information incomplete. The challenge of Quantification of Margins and Uncertainties (QMU) is to develop a methodology to support decision-making in this informational context. Given the difficulty presented by mixed and incomplete information, we contend that the uncertainty representation for the QMU methodology should be expanded to include more general characterizations that reflect imperfect information. One such generalized uncertainty representation, known as probability bounds analysis, constitutes the union of probability theory and interval analysis, where a class of distributions is defined by two bounding distributions. This has the advantage of rigorously bounding the uncertainty when inputs are imperfectly known. We argue for the inclusion of probability bounds analysis as one of many tools relevant for QMU and demonstrate its usefulness compared to other methods in a reliability example with imperfect input information.
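As a small hedged illustration of the probability-bounds idea (the distribution family and interval are invented for the example), a p-box can be built from the two CDFs that bound a class of distributions whose mean is known only to lie in an interval:

```python
import math

def norm_cdf(x, mu, sigma):
    """Normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Imprecisely known input: normal with sigma = 1 but mean known only to be in [9, 11].
MU_LO, MU_HI, SIGMA = 9.0, 11.0, 1.0

def pbox_cdf(x):
    """(lower, upper) bounds on P(X <= x) over all admissible means.
    The CDF decreases as the mean grows, so the extremes sit at the endpoints."""
    return norm_cdf(x, MU_HI, SIGMA), norm_cdf(x, MU_LO, SIGMA)

lo, hi = pbox_cdf(10.0)  # every distribution in the class gives P(X <= 10) in [lo, hi]
```

The width of [lo, hi] is exactly the epistemic part of the uncertainty: it shrinks to zero as the interval on the mean shrinks to a point.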

5.
This paper addresses the concept of model uncertainty within the context of risk analysis. Though model uncertainty is widely discussed in the risk analysis literature, no consensus seems to exist on its meaning, how it should be measured, or its impact on the application of analysis results in decision processes. The purpose of this paper is to contribute to clarification. The first parts of the paper examine the contents of the two terms 'model' and 'uncertainty'. On this basis, it is discussed how a focus on model uncertainty merely muddles the message of the analysis if risk is interpreted as a true, inherent property of the system to be estimated in the risk analysis. An alternative approach is to see the models as means for expressing uncertainty regarding system performance. In that case, it is argued, the term 'model uncertainty' loses its meaning.

6.
Three applications of sampling-based sensitivity analysis in conjunction with evidence theory representations for epistemic uncertainty in model inputs are described: (i) an initial exploratory analysis to assess model behavior and provide insights for additional analysis; (ii) a stepwise analysis showing the incremental effects of uncertain variables on complementary cumulative belief functions and complementary cumulative plausibility functions; and (iii) a summary analysis showing a spectrum of variance-based sensitivity analysis results that derive from probability spaces that are consistent with the evidence space under consideration.
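As a brief sketch of the evidence-theory quantities involved (the body of evidence below is invented for illustration), complementary cumulative belief and plausibility at a value v come directly from the focal elements and their basic probability assignments:

```python
# Invented body of evidence: focal elements (intervals) with probability masses.
FOCAL = [((0.0, 4.0), 0.3), ((2.0, 6.0), 0.5), ((5.0, 9.0), 0.2)]

def ccbf(v):
    """Complementary cumulative belief: mass of focal elements entirely above v."""
    return sum(m for (lo, hi), m in FOCAL if lo > v)

def ccpf(v):
    """Complementary cumulative plausibility: mass of focal elements reaching above v."""
    return sum(m for (lo, hi), m in FOCAL if hi > v)
```

For every v, ccbf(v) <= ccpf(v); the gap between the two curves is the signature of epistemic, as opposed to aleatory, uncertainty.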

7.
In the present work, an approximation method was used to determine both the crystallite size and the microstrain from the XRD profile of a TiSiN thin film deposited on high speed steel substrates. The crystallite size estimated via this approximation method was in good agreement with the microstructure observed using the scanning electron microscope (SEM). The approximation method was also used to determine the microstrain, and the corresponding compressive stress was related to the result of scratch adhesion measurements of the TiSiN thin film. Crystallite sizes and microstrains obtained using different definitions of the line broadening, β, were compared. The approximation method was found to be useful in cases where crystallite size and microstrain contribute to the line broadening simultaneously. This research demonstrates the reliability of the approximation method in determining crystallite size and microstrain from XRD line broadening analysis of TiSiN thin films.
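The paper's specific approximation method is not reproduced in the abstract; as a generic stand-in, the classical Williamson-Hall construction below separates size and strain broadening by fitting β cosθ = Kλ/D + 4ε sinθ (the wavelength, shape factor, and synthetic peak data are assumptions for the demonstration):

```python
import math

# Assumed constants: shape factor K and Cu K-alpha wavelength in nm.
K, LAM = 0.9, 0.15406

def williamson_hall(two_thetas_deg, betas_rad):
    """Least-squares line through (4 sin(theta), beta cos(theta));
    returns crystallite size D (nm) from the intercept and microstrain eps from the slope."""
    xs, ys = [], []
    for tt, b in zip(two_thetas_deg, betas_rad):
        th = math.radians(tt / 2.0)
        xs.append(4.0 * math.sin(th))
        ys.append(b * math.cos(th))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    return K * LAM / intercept, slope

# Synthetic peaks generated from D = 20 nm, eps = 0.002:
def beta_for(tt, D=20.0, eps=0.002):
    th = math.radians(tt / 2.0)
    return (K * LAM / D + 4.0 * eps * math.sin(th)) / math.cos(th)

peaks = [30.0, 40.0, 55.0, 70.0]
D_fit, eps_fit = williamson_hall(peaks, [beta_for(t) for t in peaks])
```

With real diffractograms, β must first be corrected for instrumental broadening; here the recovered D and ε simply reproduce the values used to generate the synthetic peaks.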

8.
A well-designed and operated industrial ecological system should be able to effectively utilize the waste generated by one member as feed for another. Nevertheless, because of heavy interactions among the member entities, particularly under various uncertainties, coordinated material and energy reuse is a very complex task. In this paper, the issues of optimal operation of an industrial ecosystem under uncertainty are addressed. A game-theory-based approach is introduced to derive an economically and environmentally optimal status of an industrial ecosystem. The effectiveness of the approach is demonstrated on a case study problem, where the Nash equilibrium for the profit payoff and sustainability payoff of the member entities is identified. The possible conflicts between the profit and sustainability objectives of the member entities in the ecosystem are resolved.
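The paper's game formulation is not given in the abstract; as a toy illustration of the equilibrium concept it relies on, the sketch below enumerates pure-strategy Nash equilibria of a two-player game with invented profit and sustainability payoffs:

```python
def pure_nash(pay_a, pay_b):
    """All pure-strategy Nash equilibria (i, j) of a bimatrix game:
    cells from which neither player can gain by deviating unilaterally."""
    rows, cols = len(pay_a), len(pay_a[0])
    eqs = []
    for i in range(rows):
        for j in range(cols):
            best_a = all(pay_a[i][j] >= pay_a[k][j] for k in range(rows))
            best_b = all(pay_b[i][j] >= pay_b[i][l] for l in range(cols))
            if best_a and best_b:
                eqs.append((i, j))
    return eqs

# Invented payoffs: player A maximizes profit, player B a sustainability score.
profit = [[3, 0], [5, 1]]
sustainability = [[3, 5], [0, 1]]
equilibria = pure_nash(profit, sustainability)
```

For continuous decision variables (e.g. material exchange rates), the same fixed-point idea applies, but the best responses come from each member's optimization problem rather than from enumeration.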

9.
Shaojun Xie & Xiaoping Du, Engineering Optimization, 2013, 45(8): 1125-1139
Reliability analysis may involve random variables and interval variables. In addition, some of the random variables may have interval distribution parameters owing to limited information. This kind of uncertainty is called second order uncertainty. This article develops an efficient reliability method for problems involving the three aforementioned types of uncertain input variables. The analysis produces the maximum and minimum reliability and is computationally demanding because two loops are needed: a reliability analysis loop with respect to random variables and an interval analysis loop for extreme responses with respect to interval variables. The first order reliability method and nonlinear optimization are used for the two loops, respectively. For computational efficiency, the two loops are combined into a single loop by treating the Karush–Kuhn–Tucker (KKT) optimal conditions of the interval analysis as constraints. Three examples are presented to demonstrate the proposed method.
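The article collapses the two loops into one via the KKT conditions; as a hedged sketch of the underlying double-loop problem (with an invented linear limit state, so the reliability step has a closed form), the interval loop can be shown as a simple search over the uncertain distribution parameter:

```python
import math

def norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Invented limit state g = capacity - load, load ~ N(mu, sigma), mean in an interval.
CAPACITY, SIGMA = 10.0, 1.0
MU_LO, MU_HI = 6.0, 8.0

def reliability(mu):
    """P(g > 0) for a normal load: Phi((capacity - mu) / sigma)."""
    return norm_cdf((CAPACITY - mu) / SIGMA)

def reliability_bounds(n_grid=201):
    """Inner interval loop by grid search over the uncertain mean."""
    vals = [reliability(MU_LO + (MU_HI - MU_LO) * i / (n_grid - 1))
            for i in range(n_grid)]
    return min(vals), max(vals)

r_min, r_max = reliability_bounds()
```

For this monotone case the extremes sit at the interval endpoints; the grid search stands in for the nonlinear optimization that a general response function would require.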

10.
Risk analysis is a tool for investigating and reducing uncertainty related to outcomes of future activities. Probabilities are key elements in risk analysis, but confusion about interpretation and use of probabilities often weakens the message from the analyses. Under the predictive, epistemic approach to risk analysis, probabilities are used to express uncertainty related to future values of observable quantities like the number of fatalities or monetary loss in a period of time. The procedure for quantifying this uncertainty in terms of probabilities is, however, not obvious. Examples of topics from the literature relevant in this discussion are use of expert judgement, the effect of so-called heuristics and biases, application of historical data, dependency and updating of probabilities. The purpose of this paper is to discuss and give guidelines on how to quantify uncertainty in the perspective of these topics. Emphasis is on the use of models and assessment of uncertainties of similar quantities.

11.
A number of investigators have pointed out that products and processes lack quality because of performance inconsistency, which is often due to uncontrollable parameters in the manufacturing process or product usage. Robust design methods are aimed at finding product/process designs that are less sensitive to parameter variation. Robust design of computer simulations requires a large number of runs, which are very time consuming. A novel methodology for robust design is presented in this article. It integrates an iterative heuristic optimization method with uncertainty analysis to achieve effective variability reductions, exploring a large parameter domain with an accessible number of simulations. To demonstrate the effectiveness of this methodology, the robust design of a 0.15 μm CMOS device is shown.

12.
The potential danger posed to human health by pesticides and herbicides has become a growing national concern due to the increased frequency of agrochemical residues found in food and water. It is becoming critical to determine the concentration in all environmental media for a complete picture of potential human exposure. A multimedia transport model is used to determine the concentration of atrazine in surface water, ground water, surface soil, root zone soil, plants, and air at a typical mid-western location. A range of values is used for each model input, resulting in a distribution of possible concentrations in each medium. A sensitivity analysis determines the influence each parameter has on the outcome variance for each environmental medium's concentration. The concentrations determined for ground and surface water are then compared to measured concentrations in the region to validate the model. A companion paper takes these concentrations and translates them into human exposure and risk.

13.
Uncertainty and sensitivity analysis results obtained with random and Latin hypercube sampling are compared. The comparison uses results from a model for two-phase fluid flow obtained with three independent random samples of size 100 each and three independent Latin hypercube samples (LHSs) of size 100 each. Uncertainty and sensitivity analysis results with the two sampling procedures are similar and stable across the three replicated samples. Poor performance of regression-based sensitivity analysis procedures for some analysis outcomes results more from the inappropriateness of the procedure for the nonlinear relationships between model input and model results than from an inadequate sample size. Kendall's coefficient of concordance (KCC) and the top down coefficient of concordance (TDCC) are used to assess the stability of sensitivity analysis results across replicated samples, with the TDCC providing a more informative measure of analysis stability than KCC. A new sensitivity analysis procedure based on replicated samples and the TDCC is introduced.
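As a brief sketch of the sampling scheme being compared (a generic textbook construction, not the specific generator used in the study), a Latin hypercube sample places exactly one point in each of n equal-probability strata of every input variable:

```python
import random

random.seed(1)

def lhs(n, d):
    """One Latin hypercube sample of size n in d dimensions on [0, 1)^d:
    each variable gets exactly one point in each of its n equal-probability strata."""
    cols = []
    for _ in range(d):
        strata = [(i + random.random()) / n for i in range(n)]  # one draw per stratum
        random.shuffle(strata)  # random pairing of strata across variables
        cols.append(strata)
    return list(zip(*cols))  # n points, each a d-tuple

sample = lhs(100, 3)
```

Unlike simple random sampling, this guarantees full marginal coverage even at size 100, which is one reason LHS results can be stable across replicates.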

14.
This paper discusses a type of redundancy that is typical in multi-state systems. It considers two interconnected multi-state systems where one multi-state system can satisfy its own stochastic demand and can also provide surplus resource (performance) to the other system in order to improve the assisted system's reliability. Traditional methods are usually not effective for reliability analysis of such multi-state systems because of the curse of dimensionality. This paper presents a new method for reliability evaluation of repairable multi-state systems with this kind of redundancy. The proposed method is based on a combination of the universal generating function technique and random process methods. A numerical example is presented to illustrate the proposed method.
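The paper's combined UGF/random-process method is not spelled out in the abstract; the core UGF operation it builds on can be sketched as composition of state/probability pairs, in the spirit of multiplying polynomials (the element states, probabilities, and demand level below are invented):

```python
from collections import defaultdict

def ugf_compose(u1, u2, op):
    """Combine two universal generating functions, each a dict
    {performance: probability}, under a structure function `op`."""
    out = defaultdict(float)
    for g1, p1 in u1.items():
        for g2, p2 in u2.items():
            out[op(g1, g2)] += p1 * p2
    return dict(out)

# Two multi-state elements in series (system performance = min of the two):
u_a = {0: 0.1, 5: 0.9}
u_b = {0: 0.05, 3: 0.25, 5: 0.70}
u_sys = ugf_compose(u_a, u_b, min)

availability = sum(p for g, p in u_sys.items() if g >= 3)  # P(performance >= demand 3)
```

Composition with `min` models series transmission; other structure functions (e.g. `sum` for parallel capacity) plug into the same routine, and collapsing equal performance levels is what keeps the state space manageable.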

15.
The Florida Commission on Hurricane Loss Projection Methodology (FCHLPM) performs an annual review of computer models that have been submitted by vendors for use in insurance rate filing in Florida. As part of the review process, and to comply with the Sunshine Law, the FCHLPM employs a Professional Team to perform onsite (confidential) audits of these models. Members of the Professional Team represent the fields of actuarial science, computer science, meteorology, statistics, and wind and structural engineering. The audit includes an assessment of each modeler's compliance with a set of standards and modules established by the FCHLPM. One part of these standards requires uncertainty and sensitivity analyses of the proprietary model. At the completion of the audit, the Professional Team provides a written report to the FCHLPM, which ultimately judges a vendor's compliance with the standards. To inform future analyses of this kind, the Professional Team conducted a demonstration uncertainty and sensitivity analysis for the FCHLPM using a Rankine-vortex hurricane wind field model and a surrogate damage function. This is the first of a two-part article presenting the results of those analyses. Part 1 presents sensitivity analysis results for wind speed and loss cost, while Part 2 presents the corresponding uncertainty analysis results.

16.
The least-squares analysis of data with error in x and y is generally thought to yield best results when the quantity minimized is the sum of the properly weighted squared residuals in x and in y. As an alternative to this "total variance" (TV) method, "effective variance" (EV) methods convert the uncertainty in x into an effective contribution to that in y, and though easier to use are considered to be less reliable. There are at least two EV methods, differing in how the weights are treated in the optimization. One of these is identical to the TV method for fits to a straight line. The formal differences among these methods are clarified, and Monte Carlo simulations are used to examine the statistical properties of each on the widely used straight-line model of York, a quadratic variation on this, Orear's hyperbolic model, a nonlinear binding (Langmuir) model, and Wentworth's kinetics model. The simulations confirm that the EV and TV methods are statistically equivalent in the limit of small data error, where they yield unbiased, normally distributed parameter estimates, with standard errors correctly predicted by the a priori covariance matrix. With increasing data error, these properties fail to hold; and the TV method is not always statistically best. Nonetheless, the method differences should seldom be of practical significance, since they are likely to be small compared with uncertainties from incomplete information about the data error in x and y.
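As a hedged sketch of the effective-variance idea (the data and per-point errors below are synthetic, and the exact weighting update differs among the EV variants the paper compares), the x-error is folded into an effective y-variance and the weighted straight-line fit is iterated until the slope settles:

```python
def ev_line_fit(xs, ys, sxs, sys_, n_iter=50):
    """Effective-variance fit of y = a + b*x with errors in both coordinates:
    weights w_i = 1 / (sy_i**2 + (b * sx_i)**2), re-evaluated as b converges."""
    a = b = 0.0
    for _ in range(n_iter):
        ws = [1.0 / (sy ** 2 + (b * sx) ** 2) for sx, sy in zip(sxs, sys_)]
        s = sum(ws)
        sx_ = sum(w * x for w, x in zip(ws, xs))
        sy_ = sum(w * y for w, y in zip(ws, ys))
        sxx = sum(w * x * x for w, x in zip(ws, xs))
        sxy = sum(w * x * y for w, x, y in zip(ws, xs, ys))
        b = (s * sxy - sx_ * sy_) / (s * sxx - sx_ ** 2)  # weighted LS slope
        a = (sy_ - b * sx_) / s                           # weighted LS intercept
    return a, b

# Synthetic data lying exactly on y = 2 + 3x, with nominal errors in x and y:
a_fit, b_fit = ev_line_fit([0, 1, 2, 3, 4], [2, 5, 8, 11, 14],
                           [0.1] * 5, [0.2] * 5)
```

The fixed-point iteration on b is what distinguishes the EV variants: whether the weights are held fixed or re-derived inside the optimization changes the estimator's behavior at large data error.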

17.
Failure mode and effects analysis (FMEA) is a prospective risk assessment tool used to identify, assess, and eliminate potential failure modes (FMs) in various industries to improve security and reliability. However, the traditional FMEA method has been criticized for several shortcomings, and even the improved FMEA methods based on predefined linguistic terms cannot meet the needs of FMEA team members' diversified opinion expressions. To solve these problems, a novel FMEA method is proposed by integrating the Bayesian fuzzy assessment number (BFAN) with an extended gray relational analysis-technique for order preference by similarity to ideal solution (GRA-TOPSIS) method. First, the BFANs are used to flexibly describe the risk evaluation results of the identified failure modes. Second, the Hausdorff distance between BFANs is calculated using the probability density function (PDF). Finally, on the basis of this distance, the extended GRA-TOPSIS method is applied to prioritize failure modes. A simulation study is presented to verify the effectiveness of the proposed approach in dealing with vague concepts and to show its advantages over existing FMEA methods. Furthermore, a real case concerning the risk evaluation of aero-engine turbine and compressor blades is provided to illustrate the practical application of the proposed method and, in particular, to show the potential of using BFANs to capture FMEA team members' diverse opinions.
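The BFAN and GRA extensions are the paper's contribution; the TOPSIS backbone they extend can be sketched in its classical crisp form (the failure-mode scores and weights below are invented):

```python
import math

def topsis(matrix, weights):
    """Classical TOPSIS on a crisp score matrix (rows = failure modes,
    columns = benefit-type risk factors, higher = riskier)."""
    n_crit = len(matrix[0])
    # Vector-normalize each column, then apply the criterion weights.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n_crit)]
    v = [[weights[j] * row[j] / norms[j] for j in range(n_crit)] for row in matrix]
    ideal = [max(col) for col in zip(*v)]  # riskiest profile
    anti = [min(col) for col in zip(*v)]   # least risky profile
    scores = []
    for row in v:
        d_pos = math.dist(row, ideal)
        d_neg = math.dist(row, anti)
        scores.append(d_neg / (d_pos + d_neg))  # closeness to the riskiest profile
    return scores

# Three failure modes scored on severity, occurrence, detection (1-10 scale):
scores = topsis([[8, 3, 4], [6, 7, 5], [2, 2, 9]], [0.4, 0.35, 0.25])
ranking = sorted(range(3), key=lambda i: -scores[i])  # highest risk first
```

A failure mode's closeness score is high when it sits near the riskiest profile, so `ranking` lists the failure modes in order of priority; the paper replaces the crisp scores with BFANs and the Euclidean distance with a Hausdorff distance between them.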

18.
A comprehensive analysis of size- and strain-broadened profile shapes in X-ray diffraction line broadening analysis is presented. Both the size- and strain-broadened profiles were assumed to be Voigtian, and the derived microstructural parameters (size and strain) were found to be in close agreement with those calculated from the model-independent Warren-Averbach method. The method is applied to three different alumina samples, viz. micron-size α-alumina (α-Al2O3) prepared by combustion of an aluminium nitrate and urea mixture, annealed samples, and a commercial α-Al2O3 sample. The present analysis suggests that the significant Gaussian size contribution is related to the narrow size distribution observed. It is concluded that the present Voigtian analysis is more reliable and may largely replace the earlier simplified integral-breadth methods often used in line broadening analysis.

19.
Following previous results showing that under static loads it is possible to detect the first plasticization of specimens at the end of the thermoelastic phase, the authors conducted experimental trials to verify that this effect can also be observed in notched and unnotched polyvinyl chloride (PVC) specimens. The goal is to define the true elastic phase for materials whose elastic limit and yield stress are not easily defined, unlike steel. The results show a variable thermal behaviour depending on the distance from the notch. The thermal behaviour, which is proportional to the stress in the fully elastic phase in accordance with the thermoelastic effect, marks the onset of local plasticization when it deviates from linearity. The thermoelastic limit, moving from the notch edge to the specimen boundary, makes it possible to follow the paths of plasticization. The results are also compared with those obtained under cyclic loading using the thermographic methodology already verified by the authors and other researchers.

20.
This paper presents a buckling analysis of a two-dimensional functionally graded cylindrical shell reinforced by axial stiffeners (stringers) under a combined compressive axial load and a transverse uniformly distributed load. The shell material properties are graded in the thickness and length directions according to a simple power law distribution in terms of the volume fractions of the constituents. First, the third-order shear deformation theory (TSDT) is used to derive the equilibrium and stability equations. Since there is no closed-form solution, the numerical differential quadrature method (DQM) is applied to solve the stability equations. The results obtained for an isotropic shell using DQM were first verified against those given in the literature for simply supported boundary conditions. The effects of load, geometrical, and stringer parameters, along with the FG power index, on the critical buckling load have been studied for various boundary conditions. The results confirm that stringers have significant effects on the critical buckling load.
