Similar Documents
20 similar documents found (search time: 31 ms)
1.
We consider the basic delay-time model, in which a system has three states: a perfect functioning state, a defective state and a failure state. The system deteriorates, and to reduce the number of failures, preventive replacements are carried out when the system is in the defective state. The time spent in the defective state is referred to as the delay time. Inspections are required to check whether the system is in the defective state. System failures are safety critical, and to control the risk, management considers two types of safety constraints: (i) the probability of at least one failure in the interval [0,A] should not exceed a fixed probability ω1, and (ii) the fraction of time the system is in the defective state should not exceed a fixed limit ω2. The problem is to determine optimal inspection intervals T that minimize the expected discounted costs under the safety constraints. Conditions are established for when the safety constraints affect the optimal inspection time and cause increased costs.
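The abstract fixes neither the lifetime distributions nor the cost parameters, so any concrete illustration must assume them. The Python sketch below estimates the two constrained quantities for a candidate inspection interval T, assuming exponential defect-arrival and delay times; the rates, horizon and interval grid are invented for the example.

```python
import math
import random

def delay_time_mc(T, A=10.0, lam=0.2, mu=1.0, n=20_000, seed=1):
    """Monte Carlo sketch of the basic delay-time model over [0, A].
    Illustrative assumptions (not from the paper): exponential time to
    defect (rate lam), exponential delay time (rate mu), and perfect
    inspections every T time units with immediate replacement.
    Returns estimates of the two safety-constraint quantities:
    P(at least one failure in [0, A]) and the fraction of time defective."""
    random.seed(seed)
    any_failure = 0
    defective_time = 0.0
    for _ in range(n):
        t, failed = 0.0, False
        while t < A:
            t_def = t + random.expovariate(lam)        # defect appears
            if t_def >= A:
                break
            d = random.expovariate(mu)                 # delay time
            t_insp = (math.floor(t_def / T) + 1) * T   # next inspection
            t_end = min(t_def + d, t_insp)             # failure or replacement
            if t_def + d < t_insp and t_end <= A:
                failed = True
            defective_time += min(t_end, A) - t_def
            t = t_end                                  # renewal
        any_failure += failed
    return any_failure / n, defective_time / (n * A)

# Sweep T and keep the cheapest interval whose estimates satisfy both
# constraints, e.g. p_fail <= omega_1 and frac_defective <= omega_2.
for T in (0.5, 1.0, 2.0, 4.0):
    print(T, delay_time_mc(T))
```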

2.
Environmental accounting techniques are intended to capture important environmental costs and benefits that are often overlooked in standard accounting practices. Environmental accounting methods themselves, however, often ignore or inadequately represent large but highly uncertain environmental costs, as well as costs conditioned on specific prior events. The use of a predictive Bayesian model is demonstrated for the assessment of such highly uncertain environmental and contingent costs. The predictive Bayesian approach presented generates probability distributions for the quantity of interest (rather than for its parameters). A spreadsheet implementation of a previously proposed predictive Bayesian model, extended to represent contingent costs, is described and used to evaluate whether a firm should undertake an accelerated phase-out of its PCB-containing transformers. Variability and uncertainty (due to lack of information) in transformer accident frequency and severity are assessed simultaneously using a combination of historical accident data, engineering model-based cost estimates, and subjective judgement. Model results are compared using several different risk measures. Use of the model for incorporating environmental risk management into a company's overall risk management strategy is discussed.
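As a hedged illustration of the "predictive" idea, a distribution for the quantity of interest itself rather than for its parameters, the sketch below mixes parameter uncertainty (a Gamma-distributed accident rate, a lognormal severity with uncertain location) into a single predictive distribution for next year's accident cost. All distributions and numbers are invented stand-ins, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)

def predictive_annual_cost(n=20_000):
    """Sketch of a predictive Bayesian distribution for next year's
    transformer-accident cost.  Every distribution and parameter here is an
    illustrative stand-in, not the paper's model: rate uncertainty as a
    Gamma posterior from historical counts, severity as a lognormal whose
    location is itself uncertain (subjective judgement)."""
    lam = rng.gamma(shape=2.0, scale=0.05, size=n)   # accidents/year, uncertain
    k = rng.poisson(lam)                             # accident count next year
    mu = rng.normal(13.0, 0.5, size=n)               # uncertain log-cost location
    return np.array([rng.lognormal(m, 1.0, ki).sum() for m, ki in zip(mu, k)])

costs = predictive_annual_cost()
print("mean cost       :", costs.mean())
print("95th percentile :", np.quantile(costs, 0.95))
print("P(cost > 1e7)   :", (costs > 1e7).mean())     # contingent tail risk
```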

3.
Work zones, especially long-term ones, increase traffic conflicts and cause safety problems. Proper casualty risk assessment for a work zone is important for both traffic safety engineers and travelers. This paper develops a novel probabilistic quantitative risk assessment (QRA) model to evaluate casualty risk, combining the frequency and consequence of all accident scenarios triggered by long-term work zone crashes. Casualty risk is measured by individual risk and societal risk. The individual risk can be interpreted as the frequency of a driver/passenger being killed or injured, while the societal risk describes the relation between frequency and the number of casualties. The proposed probabilistic QRA model consists of an estimate of work zone crash frequency, an event tree and consequence estimation models. There are seven intermediate events in the event tree: age (A), crash unit (CU), vehicle type (VT), alcohol (AL), light condition (LC), crash type (CT) and severity (S). Since the estimated probability of some intermediate events may carry large uncertainty, that uncertainty is characterized by a random variable. The consequence estimation model takes into account the combined effects of speed and emergency medical service response time (ERT) on the consequence of a work zone crash. Finally, a numerical example based on Southeast Michigan work zone crash data is carried out. The numerical results show a 62% decrease in individual fatality risk and a 44% reduction in individual injury risk if the mean travel speed is lowered by 20%, but only a 5% reduction in individual fatality risk and a 0.05% reduction in individual injury risk if ERT is reduced by 20%. In other words, reducing speed is more effective than reducing ERT for casualty risk mitigation.
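A minimal sketch of the event-tree mechanics may help: each path's frequency is the crash frequency multiplied by its branch probabilities, and risk measures are read off by summing over paths. Only three of the seven intermediate events are included, and every probability is invented, not a Southeast Michigan estimate.

```python
from itertools import product

# Illustrative branch probabilities for three of the paper's seven
# intermediate events; the values are invented for the sketch.
branches = {
    "crash_unit": [("single", 0.3), ("multi", 0.7)],
    "light":      [("day", 0.6), ("night", 0.4)],
    "severity":   [("fatal", 0.01), ("injury", 0.29), ("pdo", 0.70)],
}
crash_freq = 12.0   # assumed expected work-zone crashes per year

def scenarios():
    """Enumerate event-tree paths; each path's frequency is the crash
    frequency times the product of its branch probabilities."""
    names = list(branches)
    for combo in product(*branches.values()):
        p = 1.0
        for _, p_i in combo:
            p *= p_i
        yield dict(zip(names, (label for label, _ in combo))), crash_freq * p

fatal_freq = sum(f for sc, f in scenarios() if sc["severity"] == "fatal")
print("frequency of fatal outcomes per year:", fatal_freq)
# Dividing by the exposed population gives an individual risk; plotting
# cumulative frequency against casualty counts gives the societal (F-N) curve.
```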

4.
This paper aims at predicting cycling accident risk for an entire network and identifying how road infrastructure influences cycling safety in the Brussels-Capital Region (Belgium). A spatial Bayesian modelling approach is proposed using a binary dependent variable (accident or no accident at location i) constructed from a case–control strategy. Control sites are sampled along the ‘bikeable’ road network as a function of the potential bicycle traffic transiting each ward. Risk factors are limited to infrastructure, traffic and environmental characteristics.

5.
It is well known that the London model (valid for a hard type II superconductor) predicts that an externally applied magnetic field decays exponentially as a function of depth into the superconductor on a length scale λ. Direct measurements of the field profile using low-energy μSR on high-Tc superconductors such as YBa2Cu3Ox reveal deviations from a simple exponential decay. In particular, there is a short length scale δ close to the surface over which the magnetic field does not decay. It has been proposed that this is due to surface roughness, which leads to a suppression of the order parameter near the surface. A model of a sinusoidally modulated rough surface on an isotropic superconductor has been studied, showing that in some cases the resulting profiles may be qualitatively similar to the dead-layer phenomenon, in that the field decay rate may be slowed near the surface relative to a flat interface; for modest roughness, however, the length over which the decay is slowed is much smaller than experiments measure. In this paper, we extend this work in two directions: firstly, using atomic force microscopy data for YBa2Cu3Ox crystals, we predict the expected field profiles within the isotropic London model given the actual surface geometry; and secondly, we consider how surface roughness could affect experimental values for λ and δ. The main finding is that roughness within an isotropic model does not produce the dead layers found in experiments on YBa2Cu3Ox. However, we suggest that roughness in a highly anisotropic superconductor could account for the observed dead layer.
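For intuition, here is a rough numerical sketch of how roughness can masquerade as a dead layer: average the flat-surface London decay over a distribution of surface heights, then fit the result with the usual dead-layer form. Gaussian heights are a crude stand-in for the paper's sinusoidal-roughness and AFM-based geometry; the penetration depth and roughness scale are assumed.

```python
import numpy as np
from scipy.optimize import curve_fit

LAM_TRUE = 150.0   # nm, assumed London penetration depth

def rough_profile(z, sigma=10.0, n=4001, seed=0):
    """Field vs. nominal depth under surface roughness: average the flat-
    surface London decay exp(-(z - h)/lam) over Gaussian surface heights h.
    A crude stand-in for the paper's sinusoidal/AFM-geometry calculations."""
    h = np.random.default_rng(seed).normal(0.0, sigma, n)
    depth = np.maximum(z[:, None] - h[None, :], 0.0)   # no decay above surface
    return np.exp(-depth / LAM_TRUE).mean(axis=1)

def dead_layer_model(z, lam, delta):
    """Phenomenological fit used to report lambda and the dead layer delta."""
    return np.where(z < delta, 1.0, np.exp(-(z - delta) / lam))

z = np.linspace(0.0, 600.0, 200)
(lam_fit, delta_fit), _ = curve_fit(dead_layer_model, z, rough_profile(z),
                                    p0=[150.0, 1.0])
print(f"fitted lambda = {lam_fit:.1f} nm, dead layer delta = {delta_fit:.2f} nm")
```

Consistent with the abstract's conclusion, the fitted δ from roughness alone comes out small compared with the several-nanometre dead layers reported experimentally.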

6.
This paper focuses on how access to an insurance market should influence investments in safety measures under the ruling paradigm for decision-making under uncertainty, expected utility theory. We show that access to an insurance market will, in most situations, influence investments in safety measures. For an expected utility maximizer, overinvestment in safety measures is likely if access to an insurance market is ignored, while underinvestment is likely if insurance is purchased without paying attention to the possibility of reducing the probability and/or consequences of an accidental event through safety measures.
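The overinvestment result can be reproduced in a toy expected-utility calculation: compare the optimal safety spend of an agent who ignores the insurance market with that of one whose (loaded) premium is re-priced as safety improves. The utility function, wealth, loss, premium loading and probability-reduction curve are all assumed for the sketch.

```python
import numpy as np
from scipy.optimize import minimize_scalar

W0, LOSS = 100.0, 60.0
u = np.log                           # risk-averse utility (assumed)

def p_acc(s):                        # accident probability falls with safety spend
    return 0.1 * np.exp(-0.3 * s)

def eu_uninsured(s):
    return p_acc(s) * u(W0 - s - LOSS) + (1 - p_acc(s)) * u(W0 - s)

def eu_insured(s, load=1.2):
    premium = load * p_acc(s) * LOSS     # premium re-priced as safety improves
    return u(W0 - s - premium)           # full cover: wealth is certain

for name, eu in [("insurance ignored", eu_uninsured), ("insured", eu_insured)]:
    res = minimize_scalar(lambda s: -eu(s), bounds=(0.0, 10.0), method="bounded")
    print(f"{name}: optimal safety spend ~ {res.x:.2f}")
# The uninsured optimum exceeds the insured one here, illustrating the
# paper's overinvestment result when the insurance market is ignored.
```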

7.
Since safety professionals are the key decision makers dealing with project safety and risk assessment in the construction industry, their perceptions of safety risk directly affect the reliability of risk assessment. Safety professionals generally tend to rely heavily on their own past experience, making subjective decisions on risk assessment without systematic decision making. An understanding of the underlying principles of risk assessment is therefore important. Stage 1 of this study explores, through qualitative analysis, safety professionals’ beliefs about risk assessment and their perceptions towards it, including their recognition of possible accident causes, the degree to which they differentiate the risk levels of different trades of work, their recognition of the occurrence of different types of accidents, and the inter-relationships of these perceptions with safety performance in terms of accident rates.

8.
We present and analyze a model of an evolving sandpile surface in (2 + 1) dimensions where the dynamics of mobile grains (ρ(x, t)) and immobile clusters (h(x, t)) are coupled. Our coupling models the situation where the sandpile is flat on average, so that there is no bias due to gravity. We find anomalous scaling: the expected logarithmic smoothing at short length and time scales gives way to roughening in the asymptotic limit, where novel and non-trivial exponents are found.

9.
The primary objective of the present study was to investigate the predictability of crash risk models developed using high-resolution real-time traffic data. More specifically, the present study sought answers to the following questions: (a) how to evaluate the predictability of a real-time crash risk model; and (b) how to improve it. Predictability is defined as the crash probability given the crash precursor identified by the crash risk model. An equation was derived from Bayes’ theorem for approximately estimating the predictability of crash risk models. The estimated predictability was then used to quantitatively evaluate the effects of the threshold of crash precursors, the matched and unmatched case-control designs, and the control-to-case ratio on the predictability of crash risk models. It was found that: (a) the predictability of a crash risk model can be measured as the product of the prior crash probability and the ratio between sensitivity and false alarm rate; (b) there is a trade-off between the predictability and sensitivity of a real-time crash risk model; (c) for a given level of sensitivity, the predictability of a crash risk model developed using an unmatched case-controlled sample is always better than that of a model developed using a matched case-controlled sample; and (d) beyond a control-to-case ratio of 4:1, further increases in the ratio do not lead to clear improvements in predictability.
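Finding (a) follows directly from Bayes' theorem when crashes are rare, so that P(precursor) is approximately the false alarm rate. A one-line sketch, with illustrative numbers:

```python
def predictability(prior, sensitivity, false_alarm_rate):
    """Bayes-theorem estimate of P(crash | precursor) per the paper's result:
    prior crash probability times the sensitivity/false-alarm ratio.  Valid
    as an approximation when crashes are rare, so that P(precursor) is
    roughly the false alarm rate."""
    return prior * sensitivity / false_alarm_rate

# Illustrative numbers (assumed): 1 crash per 10,000 intervals, a model that
# flags 60% of crashes while alarming on 5% of normal traffic.
print(predictability(prior=1e-4, sensitivity=0.60, false_alarm_rate=0.05))
# -> 0.0012: even a "good" model yields a low absolute crash probability,
#    illustrating the predictability/sensitivity trade-off.
```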

10.
The mining trade involves many complicated and interrelated variables: a complex environment, abundant machinery, and a plethora of other contributors to accidents. In both developed and developing countries, mining accidents have caused many casualties. However, a universal risk assessment method for mining accidents has not yet been established. Among risk assessment methods, the bow-tie has been used in different industrial processes and systems and has proven effective. In this paper, the bow-tie model is used to investigate the relationship among mining accident risks, safety measures and possible consequences. The paper illustrates the hazards of mining accidents using US mine accident data, and shows how the consequences of mine accidents are addressed by the laws and regulations of different countries. It also introduces a series of safety measures from Chinese safety standards and explains how these measures prevent and mitigate risks. Finally, a case of mine water inrush is analysed using the bow-tie approach. The results show that the method is effective for analyzing mine safety.
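A minimal sketch of the bow-tie arithmetic, with every probability invented: basic causes combine through an OR gate into the top event (here water inrush), and a consequence is realized only if each safety barrier on the right-hand side fails.

```python
def or_gate(probs):
    """P(top event) for independent basic causes combined by an OR gate."""
    q = 1.0
    for p in probs:
        q *= 1.0 - p
    return 1.0 - q

# Fault-tree (left) side of the bow-tie: causes of "mine water inrush".
# All probabilities and barrier strengths below are invented for the sketch.
causes = {"old goaf water": 0.02, "water-bearing fault": 0.01,
          "drilling breach": 0.005}
p_top = or_gate(causes.values())

# Event-tree (right) side: the consequence escalates only if every barrier fails.
barriers = [("water-level monitoring", 0.90),
            ("drainage capacity",      0.80),
            ("emergency evacuation",   0.95)]
p_casualties = p_top
for name, p_works in barriers:
    p_casualties *= 1.0 - p_works

print(f"P(water inrush) = {p_top:.4f}")
print(f"P(inrush with casualties) = {p_casualties:.2e}")
```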

11.
This paper addresses the modeling of the probability of dangerous failure on demand (PFD) and the spurious trip rate (STR) of safety instrumented systems that include MooN voting redundancies in their architecture. MooN systems are a special case of k-out-of-n systems. The first part of the article is devoted to the development of a time-dependent PFD model capable of handling MooN systems. The model represents common cause failure and diagnostic coverage explicitly, as well as different test frequencies and strategies. It quantifies both detected and undetected failures, and puts emphasis on quantifying the contribution of common cause failure to the system PFD as an additional component. In order to accommodate changes in testing strategies, special treatment is devoted to the analysis of system reconfiguration (including common cause failure) during the test of one of its components, which is then included in the model. A model for the spurious trip rate is also analyzed and extended under the same methodology to give it similar capabilities. These two models are powerful yet simple enough to be suitable for handling dependability measures in the multi-objective optimization of both system design and test strategies for safety instrumented systems. The level of modeling detail permits compliance with the requirements of the standard IEC 61508. The two models are applied to brief case studies to demonstrate their effectiveness. The results show that the first model adequately quantifies the time-dependent PFD of MooN systems during different system states (i.e. full operation, test and repair) and different MooN configurations, whose values are averaged to obtain the PFDavg. The second model was likewise shown to adequately quantify the STR, including spurious trips induced by internal component failure and by the test itself. Both models were tested for different architectures with 1≤N≤5 and 2≤M≤5 subject to uniform staggered testing. The results also show the effects that modifying M and N has on both PFDavg and STR, and demonstrate the conflicting nature of these two measures with respect to one another.
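A stripped-down sketch of the first model's core idea, the time-dependent PFD of a MooN architecture averaged over the proof-test interval, is given below. It keeps only independent dangerous undetected failures plus a beta-factor common cause term; diagnostic coverage, staggered testing and the repair/reconfiguration states of the paper's model are deliberately omitted, and all failure-rate numbers are assumed.

```python
from math import comb
import numpy as np

def pfd_avg(M, N, lam_du=2e-6, tau=8760.0, beta=0.05, steps=2000):
    """Time-averaged PFD sketch for a MooN voted safety function (M of N
    channels must work; N - M + 1 dangerous failures defeat the voting).
    Channels are proof-tested every tau hours; a simple beta-factor term
    stands in for common cause failure.  Illustrative only."""
    t = np.linspace(0.0, tau, steps)
    q = 1.0 - np.exp(-(1.0 - beta) * lam_du * t)   # independent unavailability
    k_min = N - M + 1                              # failures that defeat voting
    pfd_ind = sum(comb(N, k) * q**k * (1.0 - q)**(N - k)
                  for k in range(k_min, N + 1))
    q_ccf = 1.0 - np.exp(-beta * lam_du * t)       # CCF disables all channels
    return float(np.mean(pfd_ind + q_ccf))

for M, N in [(1, 1), (1, 2), (2, 3)]:
    print(f"{M}oo{N}: PFDavg ~ {pfd_avg(M, N):.2e}")
```

Even this toy version reproduces the qualitative conflict the paper studies: adding channels (larger N at fixed M) lowers PFDavg, while in a fuller model it raises the spurious trip rate.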

12.
Optimal use of warning signs in traffic
The aim of the paper is to develop a model of drivers’ behaviour particularly designed to analyse the safety and total driving cost implications of warning sign installations. One special feature of the model is that it makes a clear distinction between drivers’ perceived risk values at given speeds and the corresponding objective values. Focusing on a single stretch of road, the paper concludes that warning signs will increase safety and probably reduce total objective driving costs, that is, the sum of time costs and objective expected accident costs. Since drivers’ speed will fall, implying higher time costs per unit distance, the reduction in total objective driving costs will be smaller than the reduction in accident costs. The analysis is then extended to the whole road system, using warning signs ahead of curves as an example. Besides the driving conditions in different curves, the analysis shows that the optimal number of signs depends on the road authorities’ objectives for road traffic and on how drivers form their risk perceptions. Generally speaking, simulations indicate that the safety and economic benefits of warning sign installation are not very high. When the whole road system is considered, however, warning signs seem to have a greater positive impact on total driving costs than on accident costs.
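The perceived-versus-objective distinction can be made concrete with a toy calculation: drivers choose the speed that minimizes their perceived generalized cost, while the analyst evaluates the objective cost at that speed. The cost shapes, value of time and perception factors below are invented for the sketch.

```python
from scipy.optimize import minimize_scalar

VOT = 0.2        # value of time per minute (assumed units)
LENGTH = 1.0     # km of road affected by the sign

def objective_accident_cost(v):
    """Expected accident cost per trip, rising steeply with speed (assumed)."""
    return 1e-6 * v**3

def perceived_accident_cost(v, sign):
    # Drivers under-perceive risk; a warning sign moves perception toward
    # the objective value (perception factors are invented for the sketch).
    return (0.8 if sign else 0.4) * objective_accident_cost(v)

def chosen_speed(sign):
    """Drivers pick the speed minimizing their *perceived* generalized cost."""
    cost = lambda v: VOT * LENGTH / v * 60.0 + perceived_accident_cost(v, sign)
    return minimize_scalar(cost, bounds=(30.0, 120.0), method="bounded").x

for sign in (False, True):
    v = chosen_speed(sign)
    total = VOT * LENGTH / v * 60.0 + objective_accident_cost(v)
    print(f"sign={sign}: speed ~ {v:.0f} km/h, objective total cost ~ {total:.3f}")
```

The sign lowers the chosen speed, cutting accident costs but raising time costs, so the drop in total objective cost is smaller than the drop in accident cost, exactly the pattern the abstract describes.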

13.
A methodology for predicting the probability of human task reliability during a task sequence is described. The method is based on a probabilistic performance requirement–resource consumption model. This enables error-promoting conditions in accident scenarios to be modelled explicitly and a time-dependent probability of error to be estimated. Particular attention is paid to modelling success arising from underlying human learning processes and the impact of limited resources. The paper describes the principles of the method together with an example related to the safety and risk of a diver in a wreck scenario.

14.
Least-squares analysis of data with error in both x and y is generally thought to yield the best results when the quantity minimized is the sum of the properly weighted squared residuals in x and in y. As an alternative to this “total variance” (TV) method, “effective variance” (EV) methods convert the uncertainty in x into an effective contribution to that in y; though easier to use, they are considered less reliable. There are at least two EV methods, differing in how the weights are treated in the optimization. One of these is identical to the TV method for fits to a straight line. The formal differences among these methods are clarified, and Monte Carlo simulations are used to examine the statistical properties of each on the widely used straight-line model of York, a quadratic variation on this, Orear's hyperbolic model, a nonlinear binding (Langmuir) model, and Wentworth's kinetics model. The simulations confirm that the EV and TV methods are statistically equivalent in the limit of small data error, where they yield unbiased, normally distributed parameter estimates with standard errors correctly predicted by the a priori covariance matrix. With increasing data error these properties fail to hold, and the TV method is not always statistically best. Nonetheless, the differences among methods should seldom be of practical significance, since they are likely to be small compared with the uncertainties arising from incomplete information about the data error in x and y.
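A sketch of an iterated EV fit for the straight-line case (where, as noted above, EV and TV coincide) might look as follows; the data, error levels and starting slope are illustrative.

```python
import numpy as np

def ev_line_fit(x, y, sx, sy, iters=25):
    """Effective-variance (EV) straight-line fit: the x uncertainty is
    folded into y as (slope * sx)**2 and the weighted least-squares line is
    re-solved until the slope settles.  For a straight line this coincides
    with the total-variance solution, as the paper notes."""
    b = np.polyfit(x, y, 1)[0]                   # unweighted starting slope
    for _ in range(iters):
        w = 1.0 / (sy**2 + (b * sx)**2)          # effective variances
        xb = (w * x).sum() / w.sum()
        yb = (w * y).sum() / w.sum()
        b = (w * (x - xb) * (y - yb)).sum() / (w * (x - xb)**2).sum()
    return yb - b * xb, b                        # intercept, slope

rng = np.random.default_rng(2)
x_true = np.linspace(0.0, 10.0, 12)
sx, sy = np.full(12, 0.3), np.full(12, 0.4)
x = x_true + rng.normal(0.0, 0.3, 12)            # error in x
y = 2.0 + 0.5 * x_true + rng.normal(0.0, 0.4, 12)
print(ev_line_fit(x, y, sx, sy))                 # ~ (2.0, 0.5)
```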

15.
Over the last two decades, growing interest in risk analysis has been noted across industry. The ARAMIS project has defined a methodology for risk assessment, built to help industrial operators demonstrate that they have sufficient risk control on their sites.

Risk analysis consists first of identifying all the major accidents, assuming that the safety functions in place are inefficient. This identification step uses bow-tie diagrams. Secondly, the safety barriers actually implemented on the site are taken into account. The barriers are identified on the bow-ties, and an evaluation of their performance (response time, efficiency, and level of confidence) is performed to validate that they are relevant for the expected safety function. Finally, evaluating their probability of failure makes it possible to assess the frequency of occurrence of the accident. Risk control can then be demonstrated for every accident scenario on the basis of a severity/frequency-of-occurrence pair.

During the risk analysis, a practical tool called a risk graph is used to assess whether the number and reliability of the safety functions addressing a given cause are sufficient to achieve good risk control.


16.
Current geometric design guides provide deterministic standards where the safety margin of the design output is generally unknown and there is little knowledge of the safety implications of deviating from these standards. Several studies have advocated probabilistic geometric design, where reliability analysis can be used to account for uncertainty in the design parameters and to provide a risk measure of the implications of deviating from design standards. However, there is currently no link between measures of design reliability and the quantification of safety using collision frequency. The analysis presented in this paper attempts to bridge this gap by incorporating a reliability-based quantitative risk measure, the probability of non-compliance (Pnc), in safety performance functions (SPFs). Establishing this link allows reliability-based design to be admitted into traditional benefit–cost analysis and should lead to wider application of the reliability technique in road design. The present application is concerned with the design of horizontal curves, where the limit state function is defined in terms of the available (supply) and stopping (demand) sight distances. A comprehensive collision and geometric design database of two-lane rural highways is used to investigate the effect of the probability of non-compliance on safety. The reliability analysis was carried out using the First Order Reliability Method (FORM). Two Negative Binomial (NB) SPFs were developed to compare models with and without the reliability-based risk measure. Models incorporating Pnc were found to fit the data better than traditional (without-risk) NB SPFs for total, injury and fatality (I + F), and property damage only (PDO) collisions.
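A crude Monte Carlo stand-in for the FORM computation of Pnc on a horizontal curve is sketched below: sample the sight-distance inputs and count how often demand exceeds supply. All input distributions and the available sight distance are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

def stopping_sight_distance(v_kmh, t_pr, f):
    """Demand: perception-reaction distance plus braking distance (m)."""
    v = v_kmh / 3.6
    return v * t_pr + v**2 / (2.0 * 9.81 * f)

def p_noncompliance(asd=160.0, n=200_000):
    """Pnc sketch by crude Monte Carlo in place of FORM: sample the design
    inputs (all distributions assumed for illustration) and count how often
    the limit state g = ASD - SSD goes negative."""
    v = rng.normal(90.0, 8.0, n)                   # operating speed, km/h
    t_pr = rng.lognormal(np.log(1.5), 0.25, n)     # perception-reaction, s
    f = rng.normal(0.33, 0.05, n)                  # friction coefficient
    g = asd - stopping_sight_distance(v, t_pr, f)
    return float((g < 0.0).mean())

pnc = p_noncompliance()
print("Pnc ~", pnc)
# A reliability-enriched NB SPF might then take a form such as (assumed):
#   E[collisions] = exp(b0) * AADT**b1 * exp(b2 * Pnc)
```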

17.
Defense-in-depth is a fundamental principle/strategy for achieving system safety. First conceptualized within the nuclear industry, defense-in-depth is the basis for risk-informed decisions by the U.S. Nuclear Regulatory Commission and is recognized under various names in other industries (e.g., layers of protection in the chemical industry). Accidents typically result from the absence or breach of defenses or from the violation of safety constraints. Defense-in-depth is realized by a diversity of safety barriers and a network of redundancies. However, this same redundancy and the intrinsic nature of defense-in-depth, the multiple lines of defense or “protective layers” along a potential accident sequence, may reinforce mechanisms that conceal the occurrence of incidents, the system's transition to a hazardous state (accident pathogens), and the fact that an accident is closer to being released. Consequently, the ability to operate the system safely may be hampered, and the efficiency of defense-in-depth may be degraded or, worse, may backfire. Several accident reports have identified hidden failures or degraded observability of accident pathogens as major contributing factors.

In this work, we begin to address this potential theoretical deficiency in defense-in-depth by bringing concepts from Control Theory and Discrete Event Systems to bear on issues of system safety and accident prevention. We introduce the concepts of controllability, observability, and diagnosability, and frame the current understanding of system safety as a “control problem” handled by defense-in-depth and safety barriers (or safety constraints). Observability and diagnosability are information-theoretic concepts, and they provide important complements to the energy model of accident causation from which the defense-in-depth principle derives. We formulate a new safety-diagnosability principle for supporting accident prevention, and propose that defense-in-depth be augmented with this principle, without which it can degenerate into a defense-blind safety strategy. Finally, we provide a detailed discussion and illustrative modeling of the sequence of events that led to the BP Texas City Refinery accident in 2005, emphasizing how a safety-diagnosable architecture of the refinery could have supported the prevention of this accident or mitigated its consequences. We hope the theoretical concepts introduced here and the safety-diagnosability principle become useful additions to the intellectual toolkit of risk analysts and safety professionals and stimulate further interaction/collaboration between the control and safety communities.

18.
Accurate estimation of the expected number of crashes at different severity levels, for entities with and without countermeasures, plays a vital role in selecting countermeasures within the safety management process. Current practice is to use the American Association of State Highway and Transportation Officials’ Highway Safety Manual crash prediction algorithms, which combine safety performance functions and crash modification factors, to estimate the effects of safety countermeasures on different highway and street facility types. Many of these crash prediction algorithms are based solely on crash frequency, or assume that severity outcomes are unchanged when planning for, or implementing, safety countermeasures. Failing to account for the uncertainty associated with crash severity outcomes, and assuming crash severity distributions remain unchanged in safety performance evaluations, limits the utility of the Highway Safety Manual crash prediction algorithms in assessing the effect of safety countermeasures on crash severity. This study demonstrates the application of a propensity scores-potential outcomes framework to estimate the probability distribution of different crash severity levels while accounting for the uncertainties associated with them. The probability of fatal and severe injury crash occurrence at lighted and unlighted intersections is estimated using data from Minnesota. The results show that the expected probability of a fatal or severe injury crash at a lighted intersection was 1 in 35 crashes, and the estimated risk ratio indicates that the corresponding probability at an unlighted intersection was 1.14 times higher. The results from the potential outcomes-propensity scores framework are compared with results obtained from traditional binary logit models without propensity score matching. The traditional binary logit analysis suggests that the probability of severe injury crashes is higher at lighted intersections than at unlighted ones, contradicting the findings from the propensity scores-potential outcomes framework. This underscores the importance of having comparable treated and untreated entities in traffic safety countermeasure evaluations.
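The framework's two key steps, propensity scores for the lighting "treatment" followed by matching before the outcome comparison, can be sketched as below. The data are simulated so the example runs end-to-end; the covariates, effect sizes and the use of scikit-learn matching are all assumptions, not the paper's Minnesota analysis.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

# Simulated intersections so the sketch runs end-to-end; covariates, effect
# sizes and the lighting-assignment rule are all invented.
rng = np.random.default_rng(3)
n = 5000
X = rng.normal(size=(n, 3))                                  # site covariates
lit = rng.binomial(1, 1 / (1 + np.exp(-(X @ [1.0, -0.5, 0.2]))))
logit_sev = -3.5 + X @ [0.8, 0.4, -0.3] - 0.13 * lit         # lighting helps
severe = rng.binomial(1, 1 / (1 + np.exp(-logit_sev)))       # fatal/severe crash

# 1. Propensity scores: P(lighted | covariates).
ps = LogisticRegression().fit(X, lit).predict_proba(X)[:, 1]

# 2. Match each unlighted site to the lighted site with the nearest score.
nn = NearestNeighbors(n_neighbors=1).fit(ps[lit == 1].reshape(-1, 1))
match = nn.kneighbors(ps[lit == 0].reshape(-1, 1))[1].ravel()

# 3. Risk ratio of severe outcomes: unlighted vs. matched lighted sites.
rr = severe[lit == 0].mean() / severe[lit == 1][match].mean()
print("risk ratio (unlighted / lighted) ~", round(rr, 2))
```

Because lighting is assigned in proportion to the covariates, a naive unmatched comparison here can flip the sign of the effect, mirroring the contradiction between the logit and matched analyses noted in the abstract.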

19.
20.
Thin Solid Films, 1986, 145(1): 89-97
Silicon oxide films (SiOx, 0 ≲ x ≲ 2) were deposited onto Inconel 617 alloy for the purpose of corrosion protection in an impure helium environment. The protective behaviour of the deposited films was examined as a function of their chemical composition. The hypostoichiometric SiOx (x < 2) coatings showed poor protective effects; rather, they enhanced carburization of the Inconel 617 substrate. This is because the interdiffusion of silicon and nickel is faster than the oxidation of SiOx to protective SiO2, the rate of oxygen supply in the helium environment used being quite low. Stoichiometric SiO2 coatings, however, showed good protective qualities: they protected the Inconel 617 substrate from carburization and from selective oxidation at 1170 and 1270 K for 200 h. However, some deterioration in the protective effect is expected for longer exposures to this environment at 1270 K.
