Similar Literature
20 similar records found.
1.
The rise of performance-based design methodologies for fire safety engineering has increased the interest of the fire safety community in the concepts of risk and reliability. Practical applications have, however, been severely hampered by the lack of an efficient unbiased calculation methodology. On the one hand, the distribution types of model output variables in fire safety engineering are not known, and traditional distribution types such as the normal and lognormal distributions may result in unsafe approximations; unbiased methods must therefore be applied which make no (implicit) assumptions about the PDF type. Traditionally, such unbiased methods are based on Monte Carlo simulations. On the other hand, Monte Carlo simulations require a large number of model evaluations and are therefore too computationally expensive when large, nonlinear calculation models are applied, as is common in fire safety engineering. The methodology presented in this paper avoids this deadlock by making an unbiased estimate of the PDF based on only a very limited number of model evaluations. The methodology, known as the Maximum Entropy Multiplicative Dimensional Reduction Method (ME-MDRM), results in a mathematical formula for the probability density function (PDF) describing the uncertain output variable. The method can be applied with existing models and calculation tools, and allows for a parallelization of model evaluations. The example applications given in the paper stem from the field of structural fire safety and illustrate the excellent performance of the method for probabilistic structural fire safety engineering. The ME-MDRM can, however, be considered applicable to other types of engineering models as well.

2.
Radiation heat transfer has been found to dominate fire spread in large-scale fires. The radiation heat loss of buoyant turbulent fires is coupled with fluid mechanics, combustion processes, and soot/gas concentrations. Deconvolution of these combined phenomena can facilitate the development of combustion and radiation models for use in predictive fire modeling. Therefore, in this work, line-of-sight spectral radiation intensities have been measured from a buoyant turbulent pool-fire-like ethylene diffusion flame. In an attempt to be representative of practical turbulent fires, a burner of 15.2 cm in diameter was used. Temporal measurements of radiation intensity were obtained with a fast (400 Hz) mid-infrared spectrometer at a horizontal plane located at half a burner diameter above the burner. Measurement statistics, including mean, root mean square (RMS), probability density function (PDF) of line of sight intensity, and intermittency are reported herein for wavelengths dominated by soot and CO2 radiation. The data show that radiation is affected by large-scale vortical motions, resulting in varying flame intermittency in the radial direction. Radial distributions of local scalar properties (temperature and soot volume fraction) were calculated through tomographic inversion, using measured data at multiple soot radiation wavelengths. The inversion technique was coupled with the results of a computational fluid dynamics (CFD) fire simulation code. CFD results were used to construct PDFs and spatial correlations for the scalars of interest. The estimated scalars are shown to be consistent with values from the literature, and mean and RMS radiation intensities computed from these scalars are in good agreement with measurements.

3.
4.
A new methodology, called moment matching, to efficiently estimate repair costs of a building due to future earthquake excitation is presented. As well as excitation uncertainties, other uncertainties considered include those in the structural model and those in the capacity to resist damage and the unit repair costs of structural and non-structural components. Given the first few moments of the basic uncertain variables, moment matching uses specially selected point estimates to propagate the uncertainties in order to more accurately estimate the first few moments of the repair costs. Two buildings are chosen as illustrative examples to demonstrate the use of moment matching: one hypothetical three-degree-of-freedom shear building, and a real seven-storey hotel building. It is shown that the moment matching technique is much more accurate than the First-Order Second-Moment approach when propagating the first two moments, whilst the computational cost is of the same order. The repair cost moments estimated by the moment matching technique are also compared to those obtained by the more computationally demanding Monte Carlo simulation, and it is concluded that as long as the order of the moment matching is sufficient, the comparison is satisfactory. Last, but not least, a procedure for sensitivity analysis is discussed and it is concluded that the most important uncertainties for the real building example are those that correspond to spectral acceleration, component capacity, ground motion details and unit repair costs.
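The point-estimate idea behind moment matching can be illustrated with Rosenblueth's classic 2^n sigma-point scheme, a simple relative of the technique described above: evaluate the model at every combination of mean plus or minus one standard deviation and recover the first two moments of the output. The toy cost model and its coefficients below are illustrative assumptions, not taken from the paper.

```python
from itertools import product

def two_point_estimate(g, means, stds):
    """Rosenblueth's 2^n point-estimate method: propagate the first two
    moments of independent, symmetric inputs through g by evaluating g
    at the 2^n sigma-point combinations mean +/- std."""
    vals = []
    for signs in product((-1.0, 1.0), repeat=len(means)):
        x = [mu + s * sd for mu, s, sd in zip(means, signs, stds)]
        vals.append(g(x))
    mean = sum(vals) / len(vals)
    var = sum((v - mean) ** 2 for v in vals) / len(vals)
    return mean, var

# Hypothetical repair-cost model: cost grows with drift demand x[0]
# and component damage measure x[1] (illustration only).
cost = lambda x: 100.0 * x[0] + 50.0 * x[1]
m, v = two_point_estimate(cost, means=[1.0, 2.0], stds=[0.2, 0.4])
print(m, v)  # for a linear model the estimates are exact: 200.0 800.0
```

For linear models the recovered mean and variance are exact; for mildly nonlinear models they remain far more accurate than a first-order Taylor expansion at comparable cost.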

5.
This paper describes the development and application of a method for estimating uncertainty in the prediction of sewer flow quantity and quality, and how this may impact the prediction of water quality failures in integrated catchment modelling (ICM) studies. The method is generic and readily adaptable for use with the different flow quality prediction models used in ICM studies. Use is made of the elicitation concept, whereby expert knowledge combined with a limited amount of data is translated into probability distributions describing the level of uncertainty of various input and model variables. This type of approach can be used even if little or no site-specific data are available. Integrated catchment modelling studies often use complex deterministic models; to apply the results of elicitation in a case study, a computational reduction method has been developed in order to determine levels of uncertainty in model outputs with a reasonably practical level of computational effort. This approach was applied to determine the level of uncertainty in the number of water quality failures predicted by an ICM study, due to uncertainty associated with input and model parameters of the urban drainage model component of the ICM. For a small case study catchment in the UK, it was shown that the predicted number of water quality failures in the receiving water could vary by around 45% of the number predicted without consideration of model uncertainty for dissolved oxygen, and around 32% for unionised ammonia. It was concluded that the potential overall levels of uncertainty in the ICM outputs could be significant. Any solutions designed using modelling approaches that do not consider uncertainty associated with model inputs and model parameters may be significantly over-dimensioned or under-dimensioned. With changing external inputs, such as rainfall and river flows due to climate change, better accounting for uncertainty is required.

6.
The work described herein seeks to investigate a probabilistic framework to evaluate the fire resistance of structures given uncertainty in the fire load and structural resistance parameters. The methodology involves (i) the identification and characterization of uncertain parameters in the system, (ii) a stochastic analysis of the thermo-mechanical response of the structure, and (iii) the evaluation of structural reliability based on a suitable limit state function. The methodology is demonstrated through the analysis of a protected steel beam using Monte Carlo simulation with embedded finite element simulations. Model dimensionality is reduced using a response sensitivity analysis, and limit state functions are defined based on limiting deflection criteria used in fire resistance tests. Results demonstrate that the 1-h rated beam resists a natural fire exposure with a failure probability of less than ten percent, although additional discussion is warranted regarding what might be considered an acceptable level of risk in structural fire design. The study also demonstrates that probabilistic analysis of structural fire resistance provides an enhanced understanding of the factors affecting the resistance of structures to fire and offers a means for rationally improving structural designs to meet target performance objectives.
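The limit-state evaluation by Monte Carlo described above can be sketched in a few lines. The capacity/demand model below is an assumed stand-in (normal distributions with invented parameters), not the paper's protected-steel-beam model with embedded finite element runs.

```python
import random

def mc_failure_probability(limit_state, sample_inputs, n=200_000, seed=1):
    """Crude Monte Carlo estimate of P[g(X) <= 0] for a limit state g."""
    rng = random.Random(seed)
    fails = sum(1 for _ in range(n) if limit_state(sample_inputs(rng)) <= 0.0)
    return fails / n

# Illustrative capacity R and fire-induced demand S (assumed normals);
# failure when the limiting-deflection capacity is exceeded, g = R - S <= 0.
def sample(rng):
    return rng.gauss(10.0, 1.0), rng.gauss(7.0, 1.0)

g = lambda x: x[0] - x[1]
pf = mc_failure_probability(g, sample)
print(f"{pf:.4f}")  # roughly Phi(-3/sqrt(2)), i.e. about 0.017
```

In the paper's setting each call to `limit_state` would wrap a thermo-mechanical finite element simulation, which is exactly why dimensionality reduction and sensitivity analysis matter before sampling.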

7.
In the safety analysis of structures, classical probabilistic analysis has been a popular engineering approach. However, it is not always possible to obtain sufficient information to model all uncertain parameters of a structural system with probability theory, especially at an early design stage. In such circumstances, probability theory (used to model random uncertainty) combined with evidence theory (used to model epistemic uncertainty) may be utilized in the safety analysis of structures. This paper proposes a novel method for the safety analysis of structures based on probability and evidence theory. First, a Bayesian conversion method is used to make the evidence bodies precise, and the mean and variance of the epistemic uncertain variables are defined. The epistemic uncertain variables are then transformed into normal random variables by a reflection transformation, and the checking point method (J-C method) is used to find the most probable point and the reliability. A numerical example and two engineering examples demonstrate the performance of the proposed method; the results show that both its precision and its computational efficiency are high. Moreover, the proposed method provides a basis for reliability-based optimization under hybrid uncertainties.
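The checking-point iteration mentioned above (widely known as the Hasofer-Lind / Rackwitz-Fiessler algorithm) can be sketched as follows. The limit state used here is an assumed linear function in standard normal space, chosen only so the answer is easy to verify; it is not one of the paper's examples.

```python
import math

def hlrf_beta(g, n, tol=1e-8, itmax=50):
    """Checking-point (HL-RF) iteration: locate the most probable failure
    point of g(u) = 0 in standard normal space and return the reliability
    index beta = |u*|. Gradients are taken by forward differences."""
    u = [0.0] * n
    h = 1e-6
    for _ in range(itmax):
        grad = []
        for i in range(n):
            up = u[:]
            up[i] += h
            grad.append((g(up) - g(u)) / h)
        norm2 = sum(gi * gi for gi in grad)
        scale = (sum(gi * ui for gi, ui in zip(grad, u)) - g(u)) / norm2
        u_new = [scale * gi for gi in grad]
        if max(abs(a - b) for a, b in zip(u_new, u)) < tol:
            u = u_new
            break
        u = u_new
    return math.sqrt(sum(ui * ui for ui in u))

# Assumed linear limit state g(u) = 3 - u1 - u2 (illustration only):
beta = hlrf_beta(lambda u: 3.0 - u[0] - u[1], 2)
print(round(beta, 4))  # ~ 3/sqrt(2) = 2.1213
```

The failure probability then follows as Phi(-beta); for non-normal or epistemic variables the paper's transformation steps would precede this iteration.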

8.
Information from full-scale fire tests is gathered and systematised. The knowledge from these tests is used as input to three different models, ranging from a simple spreadsheet model to advanced computational fluid dynamics (CFD) modelling, for calculating the temperature in the smoke layer. The deviation between the fire tests and the computed results is described, and how this may influence the use of the models is discussed from a risk analysis point of view.

9.
In this context, two approaches to soil liquefaction evaluation using a soft-computing technique based on worldwide standard penetration test (SPT) databases have been studied. Gene expression programming (GEP), a grey-box modelling approach, is used to develop deterministic models that evaluate the occurrence of soil liquefaction in terms of the liquefaction field performance indicator (LI) and the factor of safety (F_S), using both logistic regression and classification concepts. Comparative plots illustrate that the classification-based models perform better than those based on logistic regression. In the probabilistic approach, a calibrated mapping function is developed in the context of Bayes' theorem in order to capture failure probabilities (P_L) in the absence of knowledge of parameter uncertainty. The consistency of the results obtained from the proposed probabilistic models with those of the most well-known models indicates the robustness of the methodology used in this study. The probability models provide a simple but efficient decision-making tool for quantitatively assessing liquefaction triggering thresholds in engineering design.
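The abstract does not reproduce the calibrated mapping function itself; as a generic illustration of the idea, the following maps a deterministic factor of safety F_S to a liquefaction probability P_L through a logistic curve. The slope k is an assumed value, not the paper's calibrated coefficient.

```python
import math

def liquefaction_probability(fs, k=6.0):
    """Illustrative logistic mapping from a deterministic factor of
    safety F_S to a liquefaction probability P_L. The slope k is an
    assumption; F_S = 1 maps to P_L = 0.5 by construction."""
    return 1.0 / (1.0 + math.exp(k * (fs - 1.0)))

# P_L decreases monotonically as the factor of safety grows:
probs = {fs: liquefaction_probability(fs) for fs in (0.6, 1.0, 1.5)}
for fs, p in probs.items():
    print(fs, round(p, 3))
```

A Bayesian calibration, as in the paper, would fit such a curve so that the predicted P_L matches the observed frequency of liquefaction in the SPT case-history database.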

10.
Several formalisms for representing and reasoning with uncertain knowledge have been proposed in the artificial intelligence literature. Unfortunately, analyses of the adequacy of each formalism for different types of problems have seldom appeared, and designers are often forced to make arbitrary choices about how to model uncertainty in their domain. In this paper, we present an experimental approach to comparing uncertainty management techniques in the light of a specific problem to solve. We model a problem tailored on a real-world application using three major techniques, namely probability theory, Dempster-Shafer theory, and possibility theory, and discuss the results. We also propose a new qualitative way of analyzing the behavior of the three techniques that highlights some interesting assumptions. The experiment has been performed using PULCINELLA, a tool for propagating uncertainty based on the local computation technique of Shafer and Shenoy that can be specialized to each of our target uncertainty formalisms.
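Dempster's rule of combination, the core operation of the Dempster-Shafer theory compared above, can be sketched in a few lines. The two bodies of evidence below are invented for illustration only.

```python
def dempster_combine(m1, m2):
    """Dempster's rule: combine two basic probability assignments whose
    focal elements are frozensets over the same frame of discernment,
    renormalising away the mass assigned to conflicting intersections."""
    combined, conflict = {}, 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

A, B = frozenset("a"), frozenset("b")
theta = A | B                       # the whole frame (ignorance)
m1 = {A: 0.6, theta: 0.4}           # evidence favouring hypothesis 'a'
m2 = {A: 0.5, B: 0.3, theta: 0.2}   # a second, partly conflicting source
m = dempster_combine(m1, m2)
print(round(m[A], 4))  # 0.62 / 0.82 -> 0.7561
```

Unlike a Bayesian update, mass can remain on the full frame `theta`, which is how the formalism represents ignorance rather than forcing a prior.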

11.
Sources of, and current methods for analysing, uncertainty arising from randomness, fuzziness, and ignorance or incomplete knowledge in seismic hazard assessment are briefly discussed first; the authors' views are then presented as follows. All three types of uncertainty stem from incomplete knowledge. Probabilistic methods can be applied to all of them: objective probability for random factors and subjective probability for the other two types of uncertain factors. Discrete subjective probability mass functions for incomplete and fuzzy factors can be obtained from logic trees and membership functions, respectively. Fractile curves may be used to show the scatter of any uncertain factor, while a unified probabilistic treatment may be applied to any combination of the three types of uncertainty.

12.
Fire Safety Journal, 2004, 39(7): 557-579
A simulation approach for ranking the fire safety attributes of buildings has been developed on the basis of the Analytic Hierarchy Process (AHP). The proposed approach can help fire safety professionals evaluate the relative weighting of a building's fire safety attributes, organised as a hierarchy of preferences, through a series of pairwise comparisons. It is particularly suitable for handling uncertain evaluations, which depend on the “quantity” and “quality” of the available information. In this article, the priority ranking of each fire safety attribute given by each evaluator, together with his or her evaluation of each pairwise comparison, is combined to construct an approximate probability density distribution. From this distribution, evaluations for each pairwise comparison, and hence pairwise comparison matrices, can be generated. Moreover, these repeated measurements make possible statistical significance tests of the differences in the estimated priorities of the decision alternatives. The proposed method is practical in that it demands limited computation yet still provides useful information, such as the confidence of the ranking, for the evaluation process. The proposed fire safety ranking approach can be implemented for existing buildings using expert judgment or a computational approach that derives an objective evaluation to assist the ranking of each attribute.
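The priority-extraction step of AHP can be illustrated with the row geometric-mean approximation to the principal-eigenvector weights of a pairwise-comparison matrix. The 3x3 matrix and the attribute names in the comment are hypothetical, not taken from the article.

```python
import math

def ahp_priorities(M):
    """Approximate AHP priority weights from a reciprocal pairwise-
    comparison matrix using the row geometric-mean method, a common
    surrogate for the principal eigenvector."""
    n = len(M)
    gm = [math.prod(row) ** (1.0 / n) for row in M]
    s = sum(gm)
    return [g / s for g in gm]

# Hypothetical comparison of three fire safety attributes, e.g.
# means of egress vs. compartmentation vs. detection (illustrative):
M = [[1.0,     3.0,     5.0],
     [1.0/3.0, 1.0,     2.0],
     [1.0/5.0, 1.0/2.0, 1.0]]
w = ahp_priorities(M)
print([round(x, 3) for x in w])
```

In the article's simulation approach, each entry of `M` would itself be drawn from the constructed probability density, and the resulting distribution of `w` over repeated draws gives the confidence of the ranking.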

13.
Based on CPT liquefaction case histories from 19 moderate-to-strong earthquakes worldwide, a relatively objective liquefaction membership function is constructed to map the fuzziness of ground liquefaction. The cyclic resistance ratio (CRR) curve of the ground and its probability density function are derived from a logistic model, and the probability density function of the cyclic stress ratio (CSR) is constructed using the ground-motion acceleration attenuation relationship for the North China region, enabling an analysis of the fuzzy uncertainty in ground liquefaction evaluation. The results show that ignoring this fuzzy uncertainty may bias the evaluation toward the unsafe side; liquefaction evaluation that accounts for fuzzy uncertainty has clear engineering significance and can serve as a useful reference complementing code-based methods.

14.
Many computer fire models have been developed with the rapid advancement of information technology. With the possibility of implementing engineering performance-based fire codes, fire models are used frequently in hazard assessment. Among the different approaches, fire field models using the technique of computational fluid dynamics (CFD) are widely used. This approach has the advantage of predicting the fire environment in a ‘microscopic’ picture: air flow patterns and pressure and temperature contours can be predicted. However, it is not easy to validate the CFD predictions. Most field models have been validated only against experiments not specifically designed for that purpose, and there are very few comparisons with field measurements at actual sites; whether such models are suitable for use is therefore questioned, leading to challenges. In this paper, the CFD tool Fire Dynamics Simulator, developed at the National Institute of Standards and Technology in the USA, is applied to study atrium fires. The smoke layer interface height and air temperatures inside the atrium are simulated, and the CFD predictions are validated by comparison with experimental data from recently conducted atrium hot smoke tests.

15.
Reliability of uncertain dynamical systems with multiple design points
Asymptotic approximations and importance sampling methods are presented for evaluating a class of probability integrals with multiple design points that may arise in the calculation of the reliability of uncertain dynamical systems. An approximation based on asymptotics is used as a first step to provide a computationally efficient estimate of the probability integral. The importance sampling method utilizes information about the integrand at the design points to substantially accelerate the convergence of available importance sampling methods that use information from one design point only. Implementation issues related to the choice of importance sampling density and sample generation for reducing the variance of the estimate are addressed. The computational efficiency and improved accuracy of the proposed methods are demonstrated by investigating the reliability of structures equipped with a tuned mass damper, for which multiple design points are shown to contribute significantly to the value of the reliability integral.
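A minimal sketch of single-design-point importance sampling in standard normal space, the baseline the abstract improves upon: draw samples from a unit normal centred at the design point and reweight by the density ratio. The linear limit state and its design point are assumed for illustration, not taken from the paper.

```python
import math
import random

def importance_sampling_pf(g, u_star, n=50_000, seed=2):
    """Importance sampling estimate of P[g(U) <= 0] for U ~ N(0, I):
    sample from a unit normal centred at the design point u* and weight
    each failure sample by phi(u) / phi(u - u*)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        u = [us + rng.gauss(0.0, 1.0) for us in u_star]
        if g(u) <= 0.0:
            # log of the density ratio N(0, I) / N(u*, I) at u
            logw = sum(-0.5 * ui * ui + 0.5 * (ui - us) ** 2
                       for ui, us in zip(u, u_star))
            total += math.exp(logw)
    return total / n

# Assumed linear limit state g(u) = 3 - u1, design point u* = (3, 0):
beta = 3.0
pf = importance_sampling_pf(lambda u: beta - u[0], [beta, 0.0])
print(f"{pf:.2e}")  # close to Phi(-3), about 1.35e-3
```

The multi-design-point method of the paper would instead use a mixture of such densities, one centred at each design point, so that all significant failure regions are covered.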

16.
Towards the development of a more rigorous approach for coupling collected fire scene data to computational tools, a Bayesian computational strategy is presented in this work. The Bayesian inversion technique is exercised on synthetic, time-integrated data to invert for the location, size, and time-to-peak of an unknown fire using two well-known forward models: the Consolidated Model of Fire and Smoke Transport (CFAST) and the Fire Dynamics Simulator (FDS). A Gaussian process surrogate model was fit to coarse FDS simulations to facilitate Markov chain Monte Carlo sampling. The inversion framework was able to predict the total energy release for all fire cases except one CFAST forward model, a 1000 kW steady fire. It was found that the time-integrated data contained insufficient information to distinguish temporal variations in peak times. FDS performed better than CFAST in predicting the maximum energy release rate, with the posterior means of the best configurations lying within 0.05% and 2.77% of the true values, respectively. Both models performed equally well at locating the fire in a compartment.
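The Markov chain Monte Carlo step can be sketched with a random-walk Metropolis sampler. The toy forward model and "sensor" reading below are invented stand-ins for the CFAST/FDS surrogates, purely to show the mechanics of the sampler.

```python
import math
import random

def metropolis(logpost, x0, steps=20_000, prop=0.3, seed=3):
    """Random-walk Metropolis sampler. In the paper's setting, logpost
    would wrap a surrogate of the forward fire model plus a prior."""
    rng = random.Random(seed)
    x, lp = x0, logpost(x0)
    chain = []
    for _ in range(steps):
        xc = x + rng.gauss(0.0, prop)
        lpc = logpost(xc)
        if math.log(rng.random()) < lpc - lp:   # accept/reject step
            x, lp = xc, lpc
        chain.append(x)
    return chain

# Toy inversion (illustrative): infer a fire's peak heat release rate q
# from one noisy "sensor" reading y = 2*q + noise, with a flat prior.
y_obs, sigma = 10.0, 1.0
logpost = lambda q: -0.5 * ((y_obs - 2.0 * q) / sigma) ** 2
chain = metropolis(logpost, x0=0.0)
est = sum(chain[5000:]) / len(chain[5000:])     # discard burn-in
print(round(est, 2))  # posterior mean near 5.0
```

Replacing `logpost` with a Gaussian process surrogate, as the paper does, keeps each posterior evaluation cheap enough for the tens of thousands of steps the chain requires.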

17.
Urban drainage models are important tools used by both practitioners and scientists in the field of stormwater management. These models are often conceptual and usually require calibration using local datasets. Quantifying the uncertainty associated with these models is a must, although it is rarely practised. The International Working Group on Data and Models, which works under the IWA/IAHR Joint Committee on Urban Drainage, has been developing a framework for defining and assessing uncertainties in the field of urban drainage modelling. Part of that work is the assessment and comparison of techniques commonly used to quantify parameter uncertainty in water models. This paper compares a number of these techniques: Generalized Likelihood Uncertainty Estimation (GLUE), the Shuffled Complex Evolution Metropolis algorithm (SCEM-UA), an approach based on multi-objective auto-calibration (AMALGAM, a multialgorithm, genetically adaptive multi-objective method) and a Bayesian approach based on a simplified Markov chain Monte Carlo method (implemented in the software MICA). To allow a meaningful comparison among the different uncertainty techniques, common criteria were set for the likelihood formulation, the number of simulations, and the measure of uncertainty bounds. Moreover, all the techniques were applied to the same case study, using the same stormwater quantity and quality model and the same dataset. For a well-posed rainfall/runoff model, the four methods provided similar probability distributions of model parameters and similar model prediction intervals. For an ill-posed water quality model the differences between the results were much wider, and the paper sets out the specific advantages and disadvantages of each method. In terms of computational efficiency (i.e. the number of iterations required to generate the probability distribution of parameters), SCEM-UA and AMALGAM produce results more quickly than GLUE in terms of the required number of simulations. However, GLUE requires the least modelling skill and is easy to implement. All non-Bayesian methods have problems with the way they accept behavioural parameter sets: GLUE, SCEM-UA and AMALGAM rely on subjective acceptance thresholds, while MICA usually has problems with its assumption of normally distributed residuals. It is concluded that modellers should select the method most suitable for the system they are modelling (e.g. the complexity of the model structure, including the number of parameters), their skill and knowledge level, the available information, and the purpose of their study.
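The GLUE procedure compared above can be sketched as follows: sample parameters from the prior, keep the "behavioural" sets whose informal likelihood exceeds a subjective threshold, and weight predictions by that likelihood. The toy runoff model, informal likelihood, and threshold below are all illustrative assumptions.

```python
import random

def glue_estimate(model, prior_sample, likelihood, obs, n=5000,
                  threshold=0.5, seed=5):
    """GLUE sketch: Monte Carlo sample the prior, retain behavioural
    parameter sets (informal likelihood >= threshold), and return the
    likelihood-weighted parameter mean and the behavioural count."""
    rng = random.Random(seed)
    kept = []
    for _ in range(n):
        theta = prior_sample(rng)
        like = likelihood(model(theta), obs)
        if like >= threshold:
            kept.append((theta, like))
    wsum = sum(like for _, like in kept)
    mean = sum(t * like for t, like in kept) / wsum
    return mean, len(kept)

# Toy runoff model y = theta * rain, one observation (illustrative):
rain, obs = 2.0, 6.0
model = lambda theta: theta * rain
lik = lambda sim, o: max(0.0, 1.0 - abs(sim - o) / o)  # informal likelihood
prior = lambda rng: rng.uniform(0.0, 6.0)
mean, n_beh = glue_estimate(model, prior, lik, obs)
print(round(mean, 2), n_beh)
```

The subjective choices visible here, the informal likelihood and the acceptance threshold, are exactly the features the paper identifies as GLUE's main weakness relative to the formal Bayesian methods.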

18.
The optimum design of a tuned liquid column damper (TLCD) under system parameter uncertainty is usually performed by minimising a performance measure obtained via the total probability theorem, without any consideration of the variation of performance due to parameter uncertainty. However, such a design does not necessarily achieve maximum response reduction together with minimum dispersion of that reduction. Furthermore, such an approach cannot be applied in many real situations where the required detailed information about the uncertain parameters is limited. This study attempts a robust design optimisation (RDO) of a TLCD system for mitigating seismic vibration effects, in which only bounds on the magnitudes of the uncertain structural and ground motion model parameters are required. The RDO is formulated as a two-criterion optimisation problem in which the weighted sum of the maximum root mean square displacement of the structure and its dispersion is minimised. The conventional interval-analysis-based bounded optimum solution is also obtained to demonstrate the effectiveness of the proposed RDO approach. A numerical study elucidates the effect of parameter uncertainty on the RDO of TLCD parameters by comparing the RDO results with the bounded optimum results.

19.
The increased susceptibility of lifeline systems to failure due to aging and external hazards requires efficient methods to quantify their reliability and the related uncertainty. Monte Carlo simulation techniques for network-level reliability and uncertainty assessment usually require large computational experiments, while available analytical approaches apply mainly to simple network topologies and are limited to providing average values, low-order moments, or confidence bounds of reliability metrics. This study introduces a closed-form technique to obtain the entire probability distribution of a reliability metric, customer service availability (CSA), for generic radial lifeline systems. A special case of this general formulation reduces to a simple sum-of-products equation, for which a recursive algorithm that exploits its structure is presented. This special-case algorithm computes the probability mass function (PMF) of CSA for systems with M elements in operation, relative to conventional operations, and opens the possibility of finding recursive algorithms for the general radial case. Parametric models that approximate the CSA metric are also explored and their errors quantified. The proposed radial-topology reliability assessment tools and the resulting probability distributions provide infrastructure owners with critical insights for informed operation and maintenance decision making under uncertainty.
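A recursion of the sum-of-products kind described above can be illustrated with the PMF of the number of operating elements among independent components, built up one component at a time (a Poisson-binomial convolution). The availabilities are invented, and this is a simplification for illustration rather than the paper's CSA algorithm.

```python
def operating_elements_pmf(ps):
    """Recursive computation of the PMF of the number of operating
    elements among independent components with availabilities ps.
    Adding one component at a time keeps the cost at O(M^2) products
    instead of enumerating all 2^M system states."""
    pmf = [1.0]                          # zero components: P[0 up] = 1
    for p in ps:
        new = [0.0] * (len(pmf) + 1)
        for k, w in enumerate(pmf):
            new[k] += w * (1.0 - p)      # this component is down
            new[k + 1] += w * p          # this component is up
        pmf = new
    return pmf

# Three components with assumed availabilities (illustrative):
pmf = operating_elements_pmf([0.9, 0.8, 0.95])
print([round(x, 4) for x in pmf])  # P[0 up], ..., P[3 up]
```

The closed-form technique in the study plays an analogous role for CSA: it replaces brute-force state enumeration with a structured recursion over the radial topology.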

20.
Life cycle costing (LCC) is commonly undertaken deterministically, even though it is acknowledged that uncertainty is present and should be incorporated. However, no systematic review has been undertaken to explore how uncertainty might be dealt with in LCC. This paper reviews different approaches that acknowledge uncertainty within LCC, and in particular how such approaches treat uncertainty in the underlying financial variables: cash flows, interest rates, the timing of cash flows, and the LCC analysis duration. The approaches are categorised according to their use of, for example, frequency and probability distributions, fuzzy sets, moments, Markov chains, Bayesian thinking, artificial neural networks, pedigree matrices, and composite and specialist approaches to characterise the variables. It is seen that the most commonly used approaches for dealing with uncertainty in all LCC financial variables are probability distributions and fuzzy sets; that some approaches are specific to one financial variable (the mechanistic-empirical approach for the timing of cash flows, and the gamma process for the duration); and that two financial variables, namely cash flows and interest rates, attract the most attention from researchers. With no existing reviews solely exploring uncertainty within LCC or dealing with uncertainty in a systematic way, the paper fills a knowledge gap. The paper will be of interest to those involved with infrastructure LCC.
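One of the reviewed approaches, characterising cash flows and interest rates by probability distributions, can be sketched as a Monte Carlo life cycle cost. Every number below (initial cost, annual cost, rate range, analysis period) is an illustrative assumption.

```python
import random

def lcc_median(n=20_000, seed=4):
    """Probabilistic LCC sketch: Monte Carlo over an uncertain discount
    rate and uncertain annual cash flows, returning the median net
    present cost. All figures are illustrative assumptions."""
    rng = random.Random(seed)
    costs = []
    for _ in range(n):
        r = rng.uniform(0.02, 0.06)       # uncertain discount rate
        npv = 1000.0                      # assumed initial cost
        for year in range(1, 31):         # assumed 30-year duration
            cash = rng.gauss(50.0, 10.0)  # uncertain annual cost
            npv += cash / (1.0 + r) ** year
        costs.append(npv)
    costs.sort()
    return costs[len(costs) // 2]

med = lcc_median()
print(round(med))
```

The same skeleton extends to the other financial variables the review discusses, e.g. by also randomising the timing of cash flows or the analysis duration.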
