Similar Documents
20 similar documents found (search time: 15 ms)
1.
Based on uncertainty propagation, analytical expressions are derived for seismic fragility functions that account for aleatory uncertainty and for those that account for epistemic uncertainty, together with methods for determining their parameters. Consistency between displacement-based fragility functions and those based on ground motion intensity is proved. The confidence level of the point-estimate fragility function under epistemic uncertainty is derived. Using the fact that seismic risk equals the convolution of the seismic hazard with the seismic fragility, and adopting a power-law hazard function, the analytical fragility functions are further extended to a closed-form probabilistic seismic risk function. The resulting analytical fragility and risk functions simplify fragility and risk assessment and facilitate probabilistic seismic safety evaluation of different structural systems.
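The closed-form risk result for a power-law hazard combined with a lognormal fragility can be checked numerically. The sketch below, with illustrative parameter values (k0, k, theta, beta are not taken from the paper), compares the standard closed form k0·θ^(−k)·exp(k²β²/2) against direct convolution of the fragility with the hazard derivative.

```python
import math
from statistics import NormalDist

def failure_rate_closed_form(k0, k, theta, beta):
    """Annual failure rate for a power-law hazard H(im) = k0 * im**(-k)
    convolved with a lognormal fragility (median theta, dispersion beta)."""
    return k0 * theta ** (-k) * math.exp(0.5 * (k * beta) ** 2)

def failure_rate_numeric(k0, k, theta, beta, n=100000, im_max=50.0):
    """Direct convolution: integral of P(F | im) * |dH/dim| over im."""
    phi = NormalDist()
    dim = im_max / n
    total = 0.0
    for i in range(1, n + 1):
        im = i * dim
        p_fail = phi.cdf(math.log(im / theta) / beta)  # fragility at im
        slope = k * k0 * im ** (-k - 1)                # |dH/dim|
        total += p_fail * slope * dim
    return total

cf = failure_rate_closed_form(1e-4, 2.0, 1.0, 0.4)
num = failure_rate_numeric(1e-4, 2.0, 1.0, 0.4)
```

The two agree to well under one percent, the residual coming from the truncated integration range and the rectangle rule.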

2.
The behaviour of structural systems can change during their service lives due to unexpected loadings, environmental effects and deterioration processes. In order to optimise maintenance interventions, the life cycle of a structure has to be properly assessed. Structural health monitoring (SHM) using collected experimental data provides a method for assessing structural behaviour over time. As the cost of SHM is substantial, monitoring is sometimes limited in space and time. However, modelling structural behaviour from experimental data sets is characterised by increased uncertainty, both in the choice of the appropriate model (epistemic uncertainty) and in parameter estimation (aleatory uncertainty). This paper provides an original procedure to support decisions in the presence of epistemic uncertainty. The procedure develops a credibility index that identifies which of two candidate models more reliably describes the evolution of the parameters of interest. The efficacy of the proposed model is assessed on a case study involving a foundation settlement in an arch bridge. The approach can be applied to other aspects of life-cycle assessment: the evolution of structural resistance, or the failure time of an element or of the whole system.

3.
Data collection is always associated with costs and investments of resources. Data collected by fire services, in particular, is often gathered unsystematically and without a clear purpose. A more systematic approach is achieved by identifying the stakeholders or decision-makers who are intended to use or benefit from the data and who bear the costs. Whether a survey is worthwhile can be treated as a decision problem by performing a pre-posterior decision analysis to assess the value of information. Potential stakeholders who can benefit from fire service data are identified, and an overview is provided of the types of information that can be obtained by a survey or derived from engineering knowledge. Both types of information are associated with uncertainties, which should be identified and quantified before the data is collected in order to limit the extent of the collection process. A hierarchical Bayesian probabilistic approach is proposed to combine both types of information by differentiating between aleatory and epistemic uncertainties. The data can then be used to update the epistemic uncertainties through Bayesian inference. An example estimating fire service intervention characteristics illustrates the approach and discusses how aleatory and epistemic uncertainties can be quantified.
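As a minimal illustration of updating epistemic uncertainty with Bayesian inference, the sketch below uses a Gamma-Poisson conjugate pair: the Poisson rate carries the aleatory variability of weekly intervention counts, while the Gamma prior on that rate encodes epistemic uncertainty from engineering judgement. The prior parameters and survey data are hypothetical, not from the paper.

```python
# Gamma-Poisson conjugate update: lambda (the Poisson rate) models the
# aleatory variability of intervention counts; the Gamma(alpha, beta)
# prior on lambda models epistemic uncertainty about that rate.
def update_gamma_poisson(alpha, beta, counts):
    """Return posterior (alpha, beta) after observing Poisson counts."""
    return alpha + sum(counts), beta + len(counts)

# Prior from engineering knowledge: mean alpha/beta = 5 calls per week.
alpha0, beta0 = 5.0, 1.0
observed = [3, 7, 4, 6, 5]          # hypothetical weekly survey data
alpha1, beta1 = update_gamma_poisson(alpha0, beta0, observed)
posterior_mean = alpha1 / beta1     # updated epistemic estimate of the rate
```

Each additional week of survey data tightens the Gamma posterior, which is exactly the mechanism by which collected data reduces the epistemic component while the Poisson scatter (aleatory) remains.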

4.
5.
Uncertainty propagation in probabilistic seismic loss estimation   (Cited by 2: 1 self-citation, 1 by others)
Probabilistic estimation of losses in a building due to earthquake damage is a topic of interest to decision makers and an area of active research. One promising approach, proposed by the Pacific Earthquake Engineering Research (PEER) Center, involves breaking the analysis into separate components associated with ground motion hazard, structural response, damage to components, and repair costs. Each stage of this method has both inherent (aleatory) randomness and (epistemic) model uncertainty, and these two sources of uncertainty must be propagated through the analysis in order to determine the total uncertainty in the resulting loss estimates. In this paper, the PEER framework for seismic loss estimation is reviewed and options for both characterizing and propagating the various sources of uncertainty are proposed. Models for correlations (among, e.g., element repair costs) are proposed that may be useful when empirical data are lacking. Several options are discussed for propagating uncertainty, ranging from flexible but expensive Monte Carlo simulation to closed-form solutions that require specific functional forms to be assumed for the relationships between variables. A procedure that falls between these two extremes is proposed, which integrates over the discrete element damage states and uses the first-order second-moment method to collapse several conditional random variables into a single conditional random variable representing the total repair cost given the ground motion intensity. Numerical integration is then used to incorporate the ground motion hazard. Studies attempting to characterize epistemic uncertainty or develop specific elements of the framework are referenced as an aid for users wishing to implement this loss-estimation procedure.
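A minimal sketch of the first-order second-moment bookkeeping used to collapse correlated component repair costs into a single conditional total-cost variable. The equicorrelation model and all numbers are illustrative, not taken from the paper.

```python
import math

def fosm_total_cost(means, sigmas, rho):
    """First-order second-moment combination of component repair costs
    (given a ground motion intensity) into mean and standard deviation
    of the total repair cost, assuming a common pairwise correlation."""
    mean_t = sum(means)
    var_t = 0.0
    n = len(means)
    for i in range(n):
        for j in range(n):
            r = 1.0 if i == j else rho   # equicorrelation assumption
            var_t += r * sigmas[i] * sigmas[j]
    return mean_t, math.sqrt(var_t)

# Hypothetical component repair costs (arbitrary currency units) at one
# intensity level, with 50% pairwise correlation.
mean_t, sigma_t = fosm_total_cost([10.0, 20.0, 5.0], [3.0, 6.0, 2.0], rho=0.5)
```

Positive correlation inflates the total-cost dispersion relative to the independent case, which is why the correlation models matter when empirical data are lacking.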

6.
A procedure for the optimum structural design of cable-stayed bridges is proposed based on minimum expected life-cycle cost (LCC); the procedure is illustrated with the optimum design of a cable-stayed bridge subjected to static and earthquake loads. Reliability analysis of the bridge is performed taking into account the two types of uncertainty in the capacity and loads. The capacity of the bridge is assumed to be determined by its critical members; this is tantamount to assuming that the capacities and load effects of the structural members are highly correlated. Various designs of a cable-stayed bridge are considered: a standard design, plus several that are weaker and several that are stronger than the standard design. For the alternative designs, the member sections are decreased or increased relative to those of the standard design. The LCC of a particular design is formulated assuming that the cost components (including the maintenance and social costs) are fractions of the initial cost. The reliability associated with the aleatory uncertainties is assessed for each design, and the corresponding expected LCC and safety index are evaluated. The results for the various designs provide the information, safety index versus expected LCC, for determining the design with the minimum expected LCC, which can be presented graphically. Because of the epistemic uncertainty, the LCC and safety index of the optimum design are random variables; the respective histograms are also generated, from which various percentile values can be obtained. In particular, the 75% and 90% values of the LCC may be specified to minimize the chance of underestimating the actual LCC of the optimum design; similarly, the 75% and 90% values of the safety index may be specified for a conservative design of the cable-stayed bridge.

7.
In the safety analysis of structures, classical probabilistic analysis has been a popular approach in engineering. However, it is not always possible to obtain sufficient information to model all uncertain parameters of a structural system by probability theory, especially at an early design stage. Under these circumstances, probability theory (used to model aleatory uncertainty) can be combined with evidence theory (used to model epistemic uncertainty) in the safety analysis of structures. This paper proposes a novel method for the safety analysis of structures based on probability and evidence theory. First, a Bayes conversion method is used to make the evidence bodies precise, and the mean and variance of the epistemic uncertain variables are defined. The epistemic uncertain variables are then transformed into normal random variables by a reflection transformation, and the checking point method (J-C method) is used to find the most probable point and the reliability. A numerical example and two engineering examples demonstrate the performance of the proposed method; the results show that both its precision and its computational efficiency are high. Moreover, the proposed method provides a basis for reliability-based optimization under hybrid uncertainties.
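Once the epistemic variables have been mapped to normal random variables, the reliability computation is a standard first-order problem. For a linear limit state g = R − S with independent normal variables, the checking-point iteration reduces to a closed form, which makes a compact sanity check; the values below are illustrative.

```python
import math
from statistics import NormalDist

# For a linear limit state g = R - S with independent normal resistance R
# and load effect S, the checking-point (FORM) reliability index has the
# closed form below; the iterative J-C algorithm converges to this value
# in the linear-normal case. Values are illustrative.
def reliability_index(mu_r, sd_r, mu_s, sd_s):
    return (mu_r - mu_s) / math.hypot(sd_r, sd_s)

beta = reliability_index(mu_r=60.0, sd_r=6.0, mu_s=40.0, sd_s=8.0)
pf = NormalDist().cdf(-beta)   # first-order failure probability estimate
```

For nonlinear limit states or non-normal variables the iteration (linearizing at the current checking point) is required, but the output is the same pair: a reliability index and its implied failure probability.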

8.
Many planning and production processes are characterized by uncertain data and information. Realistic modeling of such processes must take these uncertainties into account. The new approach presented in this paper addresses epistemic uncertainty, for which fuzzy set theory is applicable. In some cases it is possible, and useful, to reduce epistemic uncertainty by additional monetary investments. It is postulated that uncertain forecast values, e.g. the expected safety, quality, or completion date of a structure, can be improved or scheduled more precisely through higher investment. The aim of the presented fuzzy cost-effectiveness analysis is to evaluate how effectively monetary investments reduce the uncertainty of the analyzed forecast values.

9.
Previous design research has demonstrated how epistemic uncertainty engenders localised, creative reasoning, including analogising and mental simulation. We analysed not just the short-term, localised effects of epistemic uncertainty on creative processing and information selection, but also its long-term impact on downstream creative processes. Our hypothesis was that heightened uncertainty associated with a particular cognitive referent would engender: (1) immediate creative elaboration of that referent aimed at resolving uncertainty and determining information selection; and (2) subsequent attentive returns to that referent at later points in time, aimed at resolving lingering uncertainty and determining information selection. First, contrary to expectations, we observed that increased epistemic certainty (rather than increased epistemic uncertainty) in relation to cognitive referents triggered immediate creative reasoning and information elaboration. Second, epistemic uncertainty was, as predicted, found to engender subsequent attentive returns to cognitive referents. Third, although epistemic uncertainty did not predict the selection of information, both immediate creative elaboration and subsequent attentive returns did predict information selection, with subsequent attentive returns the stronger predictor. Our findings hold promise for identifying more global impacts of epistemic uncertainty on creative design cognition, possibly mediated through the establishment of lasting associations with cognitive referents.

10.
The purpose of the present paper is to develop a simple methodology for seismic life cycle cost (LCC) estimation for a steel jacket offshore platform structure. This methodology accounts for accuracy of LCC modelling as well as simplicity of application. Accuracy is maintained through incorporating the effect of aleatory and epistemic uncertainties in the LCC estimation framework. Simplicity is achieved by using equivalent single-degree-of-freedom (ESDOF) system instead of the full structure and by eliminating full incremental dynamic analysis and fragility analysis. Instead, an approximate fragility curve and a localised incremental dynamic analysis curve are used along with a probabilistic simple closed-form solution for loss estimation. In the design of model structures, different bracing systems are used for the seismic design of the offshore platform, such as conventional and buckling-restrained braces. The proposed LCC methodology is validated through comparison with the results from a more rigorous method. It is found that even though the proposed methodology results in a slightly different solution compared to the reference method, the method can be used as an efficient tool for preliminary LCC evaluation of structures.

11.
This study performs probabilistic safety assessment and optimal design of steel cable-stayed (SCS) bridges using stochastic finite element analysis (SFEA) and the expected life-cycle cost (LCC) concept. To that end, an advanced probabilistic finite element algorithm (APFEA) was developed that can execute static and dynamic SFEA considering the aleatory uncertainties contained in the random variables. APFEA is a useful analytical tool for conducting reliability assessment (RA) in a systematic way, based on SFEA results for the linear and nonlinear states before and after the initial tensile force is introduced. The appropriateness of APFEA was verified by comparing SFEA results for a simple structure with numerical results from a Monte Carlo Simulation (MCS) program. A probabilistic method for SCS bridges was then set up, taking the analytical parameters into account. The dynamic response characteristics were evaluated using APFEA, and RA was carried out on the analysis results, quantitatively calculating the probabilistic safety. The optimal design of the SCS bridge was determined from the expected LCC, according to the SFEA and RA results for the alternative designs. Moreover, given the potential epistemic uncertainty in the safety index, failure probability and minimum LCC, a sensitivity analysis was conducted and the critical distribution phase illustrated using cumulative percentiles.

12.
Based on the multiple stripes analysis method and the first-order second-moment method, a seismic collapse risk assessment considering the modeling uncertainty is carried out for a 118-story super high-rise building with a typical mega-frame/core-tube/outrigger resisting system. The sensitivity of the median collapse capacity of the building to eight main parameters is analyzed, and then the modeling uncertainty is determined. Both the effects of the characterization methods of bidirectional ground motion intensities and the selection of the ground motion intensity measure (IM) on the aleatory randomness are investigated. The mean estimates approach and the confidence interval method are used to incorporate both the modeling uncertainty and the aleatory randomness, and then the annual collapse probability, the collapse probability at the maximum considered earthquake (MCE) intensity level and the acceptable values of the collapse margin ratios (CMRs) with different confidence levels for the building are calculated. The results show that the influence of the modeling uncertainty on the collapse capacity of the super high-rise structure is negligible, the aleatory randomness caused by the record-to-record variability is significant, and an appropriate ground motion IM can significantly reduce the aleatory randomness.

13.
This paper examines how calibration performs under different levels of uncertainty in model input data. It specifically assesses the efficacy of Bayesian calibration to enhance the reliability of EnergyPlus model predictions. A Bayesian approach can be used to update uncertain values of parameters, given measured energy-use data, and to quantify the associated uncertainty. We assess the efficacy of Bayesian calibration under a controlled virtual-reality setup, which enables rigorous validation of the accuracy of calibration results in terms of both calibrated parameter values and model predictions. Case studies demonstrate the performance of Bayesian calibration of base models developed from audit data with differing levels of detail in building design, usage, and operation.
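A toy version of the Bayesian update of an uncertain model input, assuming a one-parameter linear surrogate in place of an EnergyPlus model and a Gaussian measurement likelihood. The parameter, the surrogate, and the metered data are all hypothetical; a real calibration would run the building simulator inside the likelihood.

```python
import math

# Stand-in for the building energy model: monthly energy use as a function
# of one uncertain input (a hypothetical infiltration multiplier x).
def surrogate_energy(x):
    return 100.0 + 40.0 * x              # kWh, illustrative linear surrogate

measured = [138.0, 141.0, 140.0]         # hypothetical metered data
sigma = 5.0                              # assumed measurement noise (kWh)
grid = [i / 100 for i in range(0, 201)]  # candidate values of x on [0, 2]
prior = [1.0 / len(grid)] * len(grid)    # flat prior over the grid

post = []
for x, p in zip(grid, prior):
    loglik = sum(-0.5 * ((y - surrogate_energy(x)) / sigma) ** 2
                 for y in measured)      # Gaussian log-likelihood
    post.append(p * math.exp(loglik))
z = sum(post)
post = [p / z for p in post]             # normalised posterior over the grid
x_map = grid[post.index(max(post))]      # calibrated point estimate
```

The full posterior, not just `x_map`, is the point of the Bayesian approach: it quantifies how much uncertainty in the parameter remains after calibration.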

14.
One of the objectives in performance-based earthquake engineering is to quantify the seismic reliability of a structure under future random earthquakes at a designated site. For that purpose, two performance evaluation processes that incorporate the effects of aleatory and epistemic uncertainties are illustrated and used to calculate the reliability of special moment resisting frames of different heights through two probabilistic measures: the confidence level of satisfying the desired performance levels at given hazard levels, and the mean annual frequency of exceeding a specified structural capacity. The analytical models include the panel zone and a comprehensive model for structural components that captures not only strength and stiffness degradation in the backbone curve but also gradual deterioration of strength and stiffness under cyclic loading. Incremental dynamic analysis is then used to assess the dynamic behaviour of the frames and to generate the data required for performance-based evaluation. This research is intended to contribute to improving performance knowledge for the seismic design and evaluation of special steel moment resisting frame structures.

15.
The main drawback of conventional braced frames is that they implicitly accept structural damage under the design earthquake load, which leads to considerable economic losses. The controlled rocking self-centering system, a modern low-damage system, can minimize these drawbacks. This paper quantifies the main limit states and investigates the seismic performance of self-centering braced frames using a probabilistic safety assessment procedure. The margin of safety, confidence level, and mean annual frequency of the self-centering archetypes for their main limit states, including PT yield, fuse fracture, and global collapse, are established and compared with their acceptance criteria. Incorporating both aleatory and epistemic uncertainties, the efficiency of the system is examined. The results indicate that the design of low- and mid-rise self-centering archetypes provides an adequate margin of safety against exceeding the undesirable limit states.

16.
Performance-based seismic design can generate predictable structural damage results for a given seismic hazard. However, multiple sources of uncertainty in the seismic design process can affect this predictability. This paper focuses on the effects of near-fault pulse-like ground motions and of uncertainties in bridge modeling on the seismic demands of regular continuous highway bridges. By modeling a regular continuous bridge with the OpenSees software, a series of nonlinear dynamic time-history analyses of the bridge at three different site conditions under near-fault pulse-like ground motions is carried out. The relationships between different intensity measure (IM) parameters and the engineering demand parameter (EDP) are discussed. After selecting the peak ground acceleration (PGA) as the most correlated IM and the drift ratio of the bridge column as the EDP, a probabilistic seismic demand model is developed for near-fault ground motions at the three site conditions. On this basis, an uncertainty analysis is conducted for the key sources of uncertainty in the finite element modeling. All results are quantified by the "swing" based on the specific distribution range of each uncertainty parameter, in both near-fault and far-fault cases. All ground motions are selected from the PEER database, while the case-study bridge is a typical regular highway bridge designed in accordance with the Chinese Guidelines for Seismic Design of Highway Bridges. The results show that PGA is a proper IM for setting up a linear probabilistic seismic demand model, and that the damping ratio, pier diameter and concrete strength are the main uncertain modeling parameters, which should be considered in both near-fault and far-fault ground motion cases.

17.
Life cycle costing (LCC) is commonly undertaken deterministically, even though uncertainty is acknowledged to be present and to warrant incorporation. However, no systematic review has explored how uncertainty might be dealt with in LCC. This paper reviews approaches that acknowledge uncertainty within LCC, and in particular how they treat uncertainty in the underlying financial variables: cash flows, interest rates, the timing of cash flows, and the LCC analysis duration. The approaches are categorised according to their use of, for example, frequency and probability distributions, fuzzy sets, moments, Markov chains, Bayesian thinking, artificial neural networks, pedigree matrices, and composite and specialist approaches to characterise the variables. It is seen that the most commonly used approaches for dealing with uncertainty in all LCC financial variables are probability distributions and fuzzy sets; some approaches are specific to one financial variable (the mechanistic-empirical approach for the timing of cash flows, and the gamma process for the duration); and two financial variables, namely cash flows and interest rates, attract the most attention from researchers. With no existing reviews solely exploring uncertainty within LCC or dealing with it systematically, the paper fills a knowledge gap. It will be of interest to those involved with infrastructure LCC.
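The probability-distribution treatment of LCC uncertainty can be sketched with a small Monte Carlo, assuming normally distributed annual cash flows and discount rate; all figures are illustrative.

```python
import random

def lcc_sample(rng, years=20):
    """One Monte Carlo realisation of discounted life cycle cost with an
    uncertain discount rate and uncertain annual maintenance cash flows."""
    initial = 1000.0                      # deterministic initial cost
    rate = rng.gauss(0.04, 0.01)          # uncertain discount rate
    total = initial
    for t in range(1, years + 1):
        cash = rng.gauss(50.0, 10.0)      # uncertain annual cash flow
        total += cash / (1.0 + rate) ** t
    return total

rng = random.Random(42)                   # fixed seed for reproducibility
samples = [lcc_sample(rng) for _ in range(5000)]
mean_lcc = sum(samples) / len(samples)
p90 = sorted(samples)[int(0.9 * len(samples))]   # 90th-percentile LCC
```

The deterministic calculation would report a single number near `mean_lcc`; the probabilistic one also yields percentiles such as `p90`, which is what a risk-averse decision would use.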

18.
Owners of housing stocks require reliable and flexible tools to assess the impact of retrofit technologies. Bottom-up, engineering-based housing stock models can serve this function. These models require calibration using micro-level energy measurements at the building level to improve their accuracy; however, the only publicly available data for the UK housing stock is at the macro level: the district, urban, or national scale. This paper outlines a method for using macro-level data to calibrate micro-level models. A hierarchical framework is proposed, combining regression analysis and Bayesian inference. The result is a Bayesian regression method that generates estimates of the average energy use for different dwelling types whilst quantifying uncertainty in both the empirical data and the generated energy estimates. Finally, the Bayesian regression method is validated and the use of the hierarchical Bayesian calibration framework is demonstrated.

19.
We propose a Bayesian hierarchical/multilevel ratio approach to estimate the annual riverine phosphorus loads in the Saginaw River, Michigan, from 1968 to 2008. The ratio estimator is known to be an unbiased, precise approach for differing flow-concentration relationships and sampling schemes. A Bayesian model can explicitly address the uncertainty in prediction by using a posterior predictive distribution, while a Bayesian hierarchical technique can overcome the limitation of interpreting annual loads inferred from small sample sizes by borrowing strength from the underlying population shared by the years of interest. Thus, by combining the ratio estimator with the Bayesian hierarchical modeling framework, long-term load estimation can be addressed with explicit quantification of uncertainty. Our results indicate a slight decrease in total phosphorus load early in the series. The estimated ratio parameter, which can be interpreted as a flow-weighted concentration, shows a clearer decrease, damping the noise that yearly flow variation adds to the load. Despite the reductions, it is not likely that Saginaw Bay meets its target phosphorus load of 440 tonnes/yr. Across the decades, the probabilities of Saginaw Bay not complying with the target load are estimated as 1.00, 0.50, 0.57 and 0.36 in 1977, 1987, 1997, and 2007, respectively. We show that the Bayesian hierarchical model yields reasonable goodness-of-fit to the observations whether or not individual loads are aggregated. This modeling approach can also substantially reduce the uncertainties, associated with small sample sizes, in both the estimated parameters and the loads.
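A non-hierarchical version of the ratio estimator conveys the core idea that the Bayesian hierarchical layer in the paper builds on: the flow-weighted concentration from the sampled days is scaled by the known total annual flow. Data and units below are hypothetical.

```python
def ratio_load_estimate(conc, flow_sampled, total_annual_flow):
    """Flow-weighted ratio estimator of annual riverine load: the ratio of
    sampled-day loads to sampled-day flows, scaled by the full annual flow."""
    loads = [c * q for c, q in zip(conc, flow_sampled)]
    ratio = sum(loads) / sum(flow_sampled)   # flow-weighted concentration
    return ratio * total_annual_flow

# Hypothetical sampled concentrations, matching daily flows (consistent
# units), and the known total annual discharge.
est = ratio_load_estimate([0.2, 0.3, 0.25], [100.0, 200.0, 150.0],
                          total_annual_flow=60000.0)
```

In the hierarchical setting, each year's ratio parameter is drawn from a shared population distribution, which is how years with few samples borrow strength from the rest of the record.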

20.
From a reliability viewpoint, the simple conventional procedure for establishing design SN curves from laboratory fatigue test data suffers from two important limitations. Firstly, the calculated fatigue reliability on the design curve reflects only the observed "physical" uncertainty associated with the fatigue process itself; "statistical" uncertainties, connected with estimating the parameters of the fatigue model, are not considered. Secondly, it is not possible to account for fatigue run-outs (non-failures) in a rational manner. A new procedure, based on Bayesian statistical inference, is presented which is capable of handling both of these problems. Its use is illustrated via the analysis of some typical fatigue data sets, and a number of general points arising from the results are discussed.
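The run-out problem amounts to a censored-data likelihood: failed specimens contribute a lognormal density term, while run-outs contribute a survival term. The sketch below, with a hypothetical test set and parameter values, shows the likelihood that a Bayesian (or maximum-likelihood) SN analysis would build on.

```python
import math
from statistics import NormalDist

def log_likelihood(a, b, sigma, data):
    """Log-likelihood of a log-linear SN model log10(N) = a - b*log10(S),
    treating run-outs (non-failures) as right-censored observations."""
    phi = NormalDist()
    ll = 0.0
    for stress, cycles, runout in data:
        mu = a - b * math.log10(stress)          # predicted mean log-life
        z = (math.log10(cycles) - mu) / sigma
        if runout:
            # Specimen survived: contribute P(log10 N > observed value).
            ll += math.log(1.0 - phi.cdf(z))
        else:
            # Failure: lognormal density of the observed life N.
            ll += math.log(phi.pdf(z) / (sigma * cycles * math.log(10)))
    return ll

# Hypothetical data: (stress range, cycles at failure or test stop, run-out?)
tests = [(200.0, 1.2e5, False), (150.0, 8.0e5, False), (100.0, 2.0e6, True)]
ll = log_likelihood(10.0, 2.5, 0.2, tests)
```

Combining this likelihood with priors on a, b and sigma yields a posterior that captures both the "physical" scatter (sigma) and the "statistical" parameter uncertainty that the conventional procedure ignores.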

