Similar Documents
20 similar documents retrieved.
1.
This paper studies the risk assessment problem of the direct delivery business from local oil refineries in Sinopec Group. A total of 23 risk factors associated with four segments of the direct delivery business are first identified. Through explaining their respective characteristics, the connotation of each risk factor is analysed in depth. Next, on the basis of the severity and possibility of each risk factor, a multistage risk assessment method based on the normal cloud model and an extended TOPSIS approach is developed and then applied to a real-world case. From the investigation, the weaknesses of the present risk assessment process are addressed from various aspects, including the risk factors, segments, and alternatives. Moreover, considering the possible correlation among risk factors, the proposed method is further extended by using the Choquet integral. Additional discussions and recommendations are provided for improving the risk management process of the direct delivery business from local oil refineries.
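The abstract does not include the cloud-model-extended TOPSIS formulation itself, so the following is only a minimal sketch of the classic TOPSIS ranking step applied to a severity/possibility risk matrix. The scores, weights, and number of risk factors are hypothetical, and the cloud-model and Choquet-integral extensions are not reproduced.

```python
import numpy as np

def topsis_rank(scores, weights, benefit):
    """Rank alternatives with classic TOPSIS.
    scores : (m, n) matrix, m alternatives x n criteria
    weights: criterion weights summing to 1
    benefit: True for larger-is-better criteria, False otherwise
    """
    scores = np.asarray(scores, dtype=float)
    # Vector-normalise each criterion column, then apply the weights.
    norm = scores / np.linalg.norm(scores, axis=0)
    v = norm * weights
    # Ideal and anti-ideal points depend on the criterion direction.
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_plus = np.linalg.norm(v - ideal, axis=1)
    d_minus = np.linalg.norm(v - anti, axis=1)
    closeness = d_minus / (d_plus + d_minus)
    return np.argsort(-closeness), closeness

# Hypothetical severity/possibility scores for four risk factors (rows).
scores = [[7, 0.6], [4, 0.9], [8, 0.3], [5, 0.5]]
order, cc = topsis_rank(scores, weights=np.array([0.6, 0.4]),
                        benefit=np.array([True, True]))
print(order, cc.round(3))   # riskiest factor first
```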

2.
A Decision Tree (DT) approach to building empirical models for use in Monte Carlo reliability evaluation is presented. The main idea is to develop an estimation algorithm by training a model on a restricted data set and replacing the Evaluation Function (EF) with a simpler calculation that provides reasonably accurate model outputs. The proposed approach is illustrated with two systems of different size, represented by their equivalent networks. The robustness of the DT approach as an approximated method to replace the EF is also analysed. Excellent system reliability results are obtained by training a DT with a small amount of information.
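A minimal sketch of the general idea, assuming a toy 4-out-of-5 system in place of the paper's network models: a decision tree is trained on a restricted set of exact Evaluation Function calls and then used as the cheap EF replacement inside the Monte Carlo loop.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
p_fail = np.array([0.05, 0.10, 0.08, 0.12, 0.07])   # component failure probabilities

def exact_ef(states):
    """Toy evaluation function: system works if at least 4 of 5 components work."""
    return (states.sum(axis=1) >= 4).astype(int)

# 1) Train the surrogate on a restricted training set of exact evaluations.
x_train = rng.random((500, 5)) > p_fail              # True = component works
y_train = exact_ef(x_train)
surrogate = DecisionTreeClassifier(max_depth=5).fit(x_train, y_train)

# 2) Monte Carlo with the cheap surrogate instead of the exact EF.
x_mc = rng.random((200_000, 5)) > p_fail
rel_surrogate = surrogate.predict(x_mc).mean()
rel_exact = exact_ef(x_mc).mean()                    # only to check the sketch
print(f"surrogate: {rel_surrogate:.4f}  exact: {rel_exact:.4f}")
```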

3.
Software reliability modeling is of great significance in improving software quality and managing the software development process. However, the existing methods are not able to accurately model software reliability improvement behavior, because single-model methods rely on restrictive assumptions and combination models cannot deal well with model uncertainties. In this article, we propose a Bayesian model averaging (BMA) method to model software reliability. First, existing reliability models are selected as the candidate models, and Bayesian theory is used to obtain the posterior probability of each reliability model. Then, the posterior probabilities are used as weights to average the candidate models. Both the Markov Chain Monte Carlo (MCMC) algorithm and the Expectation–Maximization (EM) algorithm are used to evaluate a candidate model's posterior probability and for comparison purposes. The results show that the BMA method has superior performance in software reliability modeling, and that the MCMC algorithm performs better than the EM algorithm when they are used to estimate the parameters of the BMA method.
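A minimal sketch of the model-averaging step, with two common NHPP mean-value functions (Goel-Okumoto and delayed S-shaped) standing in as candidate models and a BIC approximation standing in for the MCMC/EM estimation of posterior model probabilities. The failure-count data are made up.

```python
import numpy as np
from scipy.optimize import minimize

# Cumulative failure counts observed at the end of each test week (toy data).
t = np.arange(1, 13, dtype=float)
cum = np.array([8, 15, 22, 27, 31, 35, 38, 40, 42, 43, 44, 45], dtype=float)
inc = np.diff(np.concatenate(([0.0], cum)))           # failures per interval

def m_go(t, a, b):      # Goel-Okumoto mean value function
    return a * (1.0 - np.exp(-b * t))

def m_dss(t, a, b):     # delayed S-shaped mean value function
    return a * (1.0 - (1.0 + b * t) * np.exp(-b * t))

def neg_loglik(params, mvf):
    a, b = params
    if a <= 0 or b <= 0:
        return np.inf
    m = mvf(t, a, b)
    lam = np.clip(np.diff(np.concatenate(([0.0], m))), 1e-12, None)
    return -(inc * np.log(lam) - lam).sum()            # Poisson log-likelihood (no constant)

fits = {}
for name, mvf in [("GO", m_go), ("DSS", m_dss)]:
    res = minimize(neg_loglik, x0=[60.0, 0.2], args=(mvf,), method="Nelder-Mead")
    fits[name] = (mvf, res.x, res.fun)

# BIC-based approximation of the posterior model probabilities (equal priors).
n, p = len(inc), 2
bic = {k: 2 * f + p * np.log(n) for k, (_, _, f) in fits.items()}
dmin = min(bic.values())
w = {k: np.exp(-0.5 * (v - dmin)) for k, v in bic.items()}
z = sum(w.values())
weights = {k: v / z for k, v in w.items()}

# BMA prediction of the cumulative failures expected by week 16.
t_pred = 16.0
pred = sum(weights[k] * fits[k][0](t_pred, *fits[k][1]) for k in fits)
print(weights, round(float(pred), 1))
```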

4.
Various adaptive reliability analysis methods based on surrogate models have recently been developed. A multi-mode failure boundary exploration and exploitation framework (MFBEEF) was proposed for system reliability assessment using an adaptive kriging model based on sample-space partitioning, in order to reduce computational cost and exploit the characteristics of the failure boundary in systems with multiple failure modes. The efficiency of the adaptive construction of the kriging model is improved by letting the centre sample of each small subspace represent the characteristics of all samples in that subspace. The method introduces a failure boundary exploration and exploitation strategy and a convergence criterion based on the maximum failure probability error, so as to adaptively approximate the failure boundary of a system with multiple failure modes. A multiple-failure-mode learning function was used to identify the optimal training sample and gradually update the kriging model during the failure boundary exploration and exploitation stages. In addition, a complex failure-boundary-oriented adaptive hybrid importance sampling method was developed to improve the applicability of the MFBEEF method to small failure probability assessments. Finally, the MFBEEF method was shown to be effective on five system reliability analysis examples: a series system, a parallel system, a series–parallel hybrid system, a multi-dimensional series system with multiple failure modes, and an engineering problem with multiple implicit performance functions.
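The MFBEEF framework itself cannot be reproduced from the abstract; the sketch below shows only the underlying active-learning kriging loop for a single performance function, using the common U learning function and its usual stopping threshold. The sample-space partitioning, multi-mode learning function, and hybrid importance sampling are omitted, and the performance function is a toy example.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(1)

def g(x):
    """Toy performance function: failure when g(x) < 0."""
    return 3.0 - x[:, 0] ** 2 / 5.0 - x[:, 1]

# Candidate population for Monte Carlo and learning (standard normal inputs).
pool = rng.standard_normal((20_000, 2))

# Initial design of experiments.
idx = rng.choice(len(pool), size=12, replace=False)
X, y = pool[idx], g(pool[idx])

kernel = ConstantKernel(1.0) * RBF(length_scale=1.0)
for it in range(60):
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
    mu, sd = gp.predict(pool, return_std=True)
    u = np.abs(mu) / np.maximum(sd, 1e-12)       # U learning function
    if u.min() >= 2.0:                           # common stopping criterion
        break
    best = np.argmin(u)                          # most ambiguous candidate
    X = np.vstack([X, pool[best]])
    y = np.append(y, g(pool[best:best + 1]))

pf = (mu < 0).mean()                             # failure probability estimate
print(f"iterations: {it}, model calls: {len(y)}, Pf = {pf:.4f}")
```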

5.
Life data from systems of components are often analysed to estimate the reliability of the individual components. These estimates are useful since they reflect the reliability of the components under actual operating conditions. However, owing to the cost or time involved with failure analysis, the exact component causing system failure may be unknown or ‘masked’. That is, the cause may only be isolated to some subset of the system's components. We present an iterative approach for obtaining component reliability estimates from such data for series systems. The approach is analogous to traditional probability plotting. That is, it involves the fitting of a parametric reliability function to a set of nonparametric reliability estimates (plotting points). We present a numerical example assuming Weibull component life distributions and a two-component series system. In this example we find estimates with only 4 per cent of the computation time required to find comparable MLEs.
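A minimal sketch of the probability-plotting building block the approach is analogous to: median-rank regression of complete (unmasked) Weibull data. The iterative treatment of masked series-system data described in the paper is not reproduced, and the lifetimes are hypothetical.

```python
import numpy as np

def weibull_plot_fit(failure_times):
    """Median-rank regression: fit Weibull shape/scale by a straight line
    in ln(t) versus ln(-ln(1 - F)) coordinates."""
    t = np.sort(np.asarray(failure_times, dtype=float))
    n = len(t)
    i = np.arange(1, n + 1)
    f = (i - 0.3) / (n + 0.4)                  # Bernard's median-rank plotting points
    x = np.log(t)
    y = np.log(-np.log(1.0 - f))
    beta, c = np.polyfit(x, y, 1)              # slope = shape parameter
    eta = np.exp(-c / beta)                    # scale parameter
    return beta, eta

times = [72, 110, 145, 196, 240, 310, 410, 530]    # hypothetical lifetimes (h)
beta, eta = weibull_plot_fit(times)
print(f"shape = {beta:.2f}, scale = {eta:.0f} h")
```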

6.
This paper proposes a different likelihood formulation within the Bayesian paradigm for parameter estimation of reliability models. Moreover, the assessment of the uncertainties associated with parameters, the goodness of fit, and the model prediction of reliability are included in a systematic framework for better aiding the model selection procedure. Two case studies are appraised to highlight the contributions of the proposed method and demonstrate the differences between the proposed Bayesian formulation and an existing Bayesian formulation.

7.
Quantifying uncertainty during risk analysis has become an important part of effective decision-making and health risk assessment. However, most risk assessment studies struggle with uncertainty analysis, even though uncertainty with respect to model parameter values is of primary importance. Capturing uncertainty in risk assessment is vital in order to perform a sound risk analysis. In this paper, an approach to uncertainty analysis based on fuzzy set theory and Monte Carlo simulation is proposed, addressing the question of how these two modes of representing uncertainty can be combined for the purpose of estimating risk. The proposed method is applied to a propylene oxide polymerisation reactor and takes into account both stochastic and epistemic uncertainties in the risk calculation. This study explores areas where random and fuzzy logic models may be applied to improve risk assessment in industrial plants with dynamic systems (i.e. systems that change over time), and discusses the methodology and process involved in using random and fuzzy logic systems for risk management.
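A minimal sketch of one common way to combine the two representations, assuming a toy risk function rather than the reactor model of the case study: the aleatory variable is sampled by Monte Carlo while the epistemic parameter is a triangular fuzzy number, so at each alpha-cut the risk percentile becomes an interval.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 20_000

# Aleatory (stochastic) input sampled by Monte Carlo.
release_rate = rng.lognormal(mean=0.0, sigma=0.4, size=N)   # toy units

# Epistemic input: a triangular fuzzy number for a toxicity factor.
low, mode, high = 0.8, 1.0, 1.5

def alpha_cut(alpha):
    """Interval of the triangular fuzzy number at membership level alpha."""
    return low + alpha * (mode - low), high - alpha * (high - mode)

def risk(release, tox):
    return 1.0 - np.exp(-0.05 * release * tox)              # toy risk model

for alpha in (0.0, 0.5, 1.0):
    t_lo, t_hi = alpha_cut(alpha)
    # Risk is monotone in the toxicity factor, so the cut endpoints bound it.
    r_lo = np.percentile(risk(release_rate, t_lo), 95)
    r_hi = np.percentile(risk(release_rate, t_hi), 95)
    print(f"alpha={alpha:.1f}: 95th-percentile risk in [{r_lo:.3f}, {r_hi:.3f}]")
```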

8.
In this paper, a competing risk model is proposed to describe the reliability of the cylinder liners of a marine Diesel engine. Cylinder liners present two dominant failure modes: wear degradation and thermal cracking. The wear process is described by a stochastic process, whereas the failure time due to thermal cracking is described by a Weibull distribution. The proposed model allows goodness-of-fit testing and parameter estimation on the basis of both wear and failure data. Moreover, it enables reliability estimates of the state of the liners to be obtained and the hierarchy of the failure mechanisms to be determined for any given age and wear level of the liner. The model has been applied to a real data set: 33 cylinder liners of Sulzer RTA 58 engines, which equip twin ships of the Grimaldi Group. Estimates of the liner reliability and of other quantities of interest under the competing risk model are obtained, as well as the conditional failure probability and mean residual lifetime given the survival age and the accumulated wear. Furthermore, the model is used to estimate the probability that a liner fails due to each of the failure modes when both modes act.
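A sketch of the competing-risk structure only: liner reliability is the product of survival against wear (here a gamma degradation process crossing a threshold, as one possible stochastic-process choice) and survival against thermal cracking (Weibull). All parameter values are illustrative, not the fitted values from the 33-liner data set.

```python
import numpy as np
from scipy.stats import gamma, weibull_min

# Wear: gamma process with shape c*t and scale b; failure when wear > wear_limit.
c, b = 0.08, 0.5            # toy values, wear in mm, time in 1000 h
wear_limit = 4.0            # mm

# Thermal cracking: Weibull time to failure.
beta, eta = 2.2, 60.0       # shape, scale (time in 1000 h), toy values

def reliability(t):
    """Probability the liner survives both failure modes up to time t (in 1000 h)."""
    r_wear = gamma.cdf(wear_limit, a=c * t, scale=b)   # P(wear(t) < limit)
    r_crack = weibull_min.sf(t, beta, scale=eta)       # P(no thermal crack by t)
    return r_wear * r_crack

for t in (10, 20, 40):
    print(f"t = {t * 1000:>6.0f} h  R = {reliability(t):.3f}")
```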

9.
Conventional vehicle headlamps have a fixed illumination range and cannot adjust their beam angle when the vehicle turns at night, which often leaves a blind zone on the inside of the curve. To address this, a linear two-degree-of-freedom vehicle model, a model for the horizontal adjustment of the headlamp optical axis, and a headlamp stepper-motor model are first established. An adaptive front-lighting control algorithm based on one-dimensional cloud model control is then proposed, a model of the adaptive headlamp control system for curve driving is built, and it is simulated in MATLAB. The simulation...
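Although the study itself uses MATLAB, a minimal Python sketch of the one-dimensional forward normal cloud generator underlying this kind of cloud-model controller is given below (expectation Ex, entropy En, hyper-entropy He). The vehicle, optical-axis, and stepper-motor models and the controller logic are not reproduced, and the example numbers are made up.

```python
import numpy as np

def normal_cloud(ex, en, he, n=1000, rng=None):
    """One-dimensional forward normal cloud generator.
    Returns n cloud drops (x_i, mu_i) for a concept with
    expectation ex, entropy en and hyper-entropy he."""
    rng = np.random.default_rng() if rng is None else rng
    en_prime = rng.normal(en, he, size=n)                 # randomised entropy
    x = rng.normal(ex, np.abs(en_prime))                  # drop positions
    mu = np.exp(-(x - ex) ** 2 / (2.0 * en_prime ** 2))   # certainty degree
    return x, mu

# Example: a cloud describing "moderate steering angle" around 10 degrees.
x, mu = normal_cloud(ex=10.0, en=2.0, he=0.2, n=5, rng=np.random.default_rng(3))
for xi, mi in zip(x, mu):
    print(f"drop x = {xi:6.2f}  certainty = {mi:.3f}")
```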

10.
Failure modes and effects analysis is a framework that has been widely used to improve reliability by prioritizing failure modes using the so-called risk priority number. However, the risk priority number has some problems frequently pointed out in the literature, namely its non-injectivity, its non-surjectivity, and the impossibility of giving weights to the risk variables. Despite these disadvantages, the risk priority number continues to be widely used because of its simplicity compared with other alternatives found in the literature. In this paper, we propose a novel risk prioritization model to overcome the major drawbacks of the risk priority number. The model contains two functions: the risk isosurface function, which prioritizes three risk variables considering their order of importance in a given risk scenario, and the risk prioritization index function, which prioritizes three risk variables considering their weights. The novelty of the proposed model lies in its injectivity, surjectivity, and ease of use in failure mode prioritization. The performance of the proposed model was analyzed using examples typically used to discuss the shortcomings of the conventional risk priority number. The model was applied to a case study and its results were compared with those of other risk prioritization models. The results show that the failure mode prioritization obtained with the proposed model agrees with the expectations for the risk scenario.
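The risk isosurface and risk prioritization index functions are not given in the abstract, so the sketch below only reproduces the conventional RPN and illustrates the non-injectivity and non-surjectivity it is criticised for, which is what motivates the proposed model.

```python
from itertools import product

def rpn(s, o, d):
    """Conventional risk priority number: severity x occurrence x detection."""
    return s * o * d

# Count how many distinct (S, O, D) triples collide on the same RPN value.
values = {}
for s, o, d in product(range(1, 11), repeat=3):
    values.setdefault(rpn(s, o, d), []).append((s, o, d))

print(f"{len(values)} distinct RPN values for 1000 triples")   # far fewer than 1000
print("e.g. RPN = 60 is produced by", len(values[60]), "different triples")
```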

11.
This article presents a new approach to production regularity assessment in the oil and chemical industries. The production regularity is measured by the throughput capacity distribution. A brief survey of some existing techniques is presented, and the structure of the new approach is introduced. The proposed approach is based on analytical methods, i.e. no simulation is necessary. The system modeling is split into three levels: components, basic subsystems, and merged subsystems, and two modeling methods are utilized: Markov modeling and a rule-based method. The main features of the approach are as follows: (1) short calculation time; (2) systems of dependent components can be modeled; (3) maintenance strategies can be modeled; and (4) a variety of system configurations can be modeled. A simple case study is used to demonstrate how the proposed approach can be applied.

12.
Human error is one of the largest contributing factors to unsafe operation and accidents in high-speed train operation. As a well-known second-generation human reliability analysis (HRA) technique, the cognitive reliability and error analysis method (CREAM) has been introduced to address HRA problems in various fields. Nevertheless, current CREAM models are insufficient for HRA problems that require considering the interdependencies between the Common Performance Conditions (CPCs) while simultaneously determining the weights of these CPCs. Hence, the purpose of this paper is to develop a hybrid HRA model that integrates CREAM, interval type-2 fuzzy sets, and the analytic network process (ANP) to overcome this drawback. First, interval type-2 fuzzy sets are utilized to express the highly uncertain information of the CPCs. Second, the ANP is incorporated into CREAM to depict the interdependencies between the CPCs and determine their weights. The human error probability (HEP) is then calculated based on the obtained weights. Finally, an illustrative example of an HRA problem in high-speed train operation is presented to demonstrate the application and validity of the proposed model. The results indicate that experts prefer to express their preferences with fuzzy sets rather than crisp values, and that the interdependencies between the CPCs are better depicted in the proposed model.
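The interval type-2 fuzzy and ANP machinery cannot be reconstructed from the abstract alone; the sketch below only illustrates the final step in a simplified, hypothetical form: a nominal cognitive failure probability is adjusted by CPC effect multipliers raised to CPC weights (such as weights an ANP might supply). All numbers are made up and this is not the paper's model.

```python
import numpy as np

# Nominal cognitive failure probability for the cognitive function considered
# (hypothetical value, of the order CREAM tabulates for execution-type errors).
cfp_nominal = 3.0e-3

# Nine CPCs: an effect multiplier per CPC (>1 degrades, <1 improves performance)
# and a weight per CPC, e.g. as obtained from an ANP supermatrix (made up here).
multipliers = np.array([1.0, 2.0, 1.0, 0.8, 1.0, 1.2, 1.0, 2.0, 1.0])
weights = np.array([0.10, 0.15, 0.08, 0.12, 0.10, 0.12, 0.08, 0.15, 0.10])
weights /= weights.sum()

# Weighted-product context adjustment; scaling by the number of CPCs makes the
# expression reduce to the plain product when all weights are equal.
context_factor = np.prod(multipliers ** (weights * len(weights)))
hep = cfp_nominal * context_factor
print(f"context factor = {context_factor:.2f}, adjusted HEP = {hep:.2e}")
```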

13.
This paper proposes a three-phase approach for supplier selection based on the Kano model and fuzzy multi-criteria decision-making. Since the supplier selection problem involves different criteria, quality attributes are assumed to denote the importance weights of the criteria for supplier selection. Furthermore, fuzzy logic is utilised to account for the inherent vagueness of human judgement. Initially, the importance weights of the criteria are calculated using a fuzzy Kano questionnaire and the fuzzy analytic hierarchy process. In the second phase, the fuzzy TOPSIS technique is used to screen out incapable suppliers. Finally, in the third phase, the qualified suppliers that pass the screening are evaluated once again by the same approach for the final ranking. The proposed approach is also examined in a case study.

14.
On the basis of identifying the risk factors of R&D human capital investment in military research institutes, a risk evaluation index system is established. Because the indicators are found to influence and feed back on one another, a fuzzy evaluation model of R&D human capital investment risk for military research institutes is proposed, based on the analytic network process while also taking the fuzziness of the evaluation opinions into account, and the corresponding algorithm is given. Finally, a worked example shows that the analysis results of this method are more reasonable and intuitive, and that it can provide decision makers with more detailed decision information.

15.
A new technique for the reliability and quality optimization of electronic components and assemblies, the so-called in situ accelerated ageing technique with electrical testing, is presented. This technique is extremely useful for the building-in approach to quality and reliability. First, it can be used to optimize an electronic component or assembly with respect to its quality and reliability performance at a very early stage, i.e. at the design level, at the level of materials selection, and at the level of identifying production techniques and defining production parameters. The typical test time is of the order of 24 hours, which is short enough to allow a design-of-experiments approach to quality and reliability optimization. Furthermore, the technique is also very useful for obtaining a deeper understanding of the physico-chemical processes that lead to failure. A number of practical examples where the technique has been successfully applied are discussed.

16.
The duration of freeway traffic accidents is an important factor affecting traffic congestion, environmental pollution, and secondary accidents. Among previous studies, the M5P algorithm has been shown to be an effective tool for predicting incident duration. M5P builds a tree-based model, like the traditional classification and regression tree (CART) method, but with multiple linear regression models as its leaves. The problem with M5P for accident duration prediction, however, is that whereas linear regression assumes that the conditional distribution of accident durations is normal, the distribution of a “time to an event” is almost certainly non-symmetrical. A hazard-based duration model (HBDM) is a better choice for this kind of time-to-event modeling, and HBDMs have accordingly been applied to analyze and predict traffic accident duration. Previous research, however, has not applied HBDMs to accident duration prediction in association with clustering or classification of the dataset to minimize data heterogeneity. The current paper proposes a novel approach for accident duration prediction that improves on the original M5P tree algorithm through the construction of an M5P-HBDM model, in which the leaves of the M5P tree are HBDMs instead of linear regression models. Such a model offers the advantage of minimizing data heterogeneity through dataset classification, and avoids the incorrect assumption of normality for traffic accident durations. The proposed model was tested on two freeway accident datasets. For each dataset, the first 500 records were used to train three models: (1) an M5P tree; (2) an HBDM; and (3) the proposed M5P-HBDM; the remainder of the data was used for testing. The results show that the proposed M5P-HBDM identified more significant and meaningful variables than either the M5P tree or the HBDM, and had the lowest overall mean absolute percentage error (MAPE).
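M5P itself is a Weka algorithm and the paper's HBDM specification is not given in the abstract; the sketch below only approximates the idea with a shallow scikit-learn regression tree for the partitioning step and a Weibull duration model fitted in each leaf, scored by MAPE on synthetic accident records.

```python
import numpy as np
from scipy.stats import weibull_min
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(4)

# Toy accident records: features (lanes blocked, injury flag) and duration (min).
n = 800
X = np.column_stack([rng.integers(1, 4, n), rng.integers(0, 2, n)])
true_scale = 30 + 20 * X[:, 0] + 40 * X[:, 1]
durations = weibull_min.rvs(1.6, scale=true_scale, random_state=5)

X_train, y_train = X[:500], durations[:500]
X_test, y_test = X[500:], durations[500:]

# 1) Partition the data with a shallow tree to reduce heterogeneity.
tree = DecisionTreeRegressor(max_depth=2, min_samples_leaf=50).fit(X_train, y_train)
leaf_train = tree.apply(X_train)

# 2) Fit a Weibull duration model (a simple parametric stand-in for the HBDM) per leaf.
leaf_models = {}
for leaf in np.unique(leaf_train):
    y_leaf = y_train[leaf_train == leaf]
    shape, _, scale = weibull_min.fit(y_leaf, floc=0)
    leaf_models[leaf] = (shape, scale)

# 3) Predict the median duration of the leaf each test record falls into.
leaf_test = tree.apply(X_test)
pred = np.array([leaf_models[l][1] * np.log(2) ** (1 / leaf_models[l][0])
                 for l in leaf_test])
mape = np.mean(np.abs(pred - y_test) / y_test) * 100
print(f"MAPE = {mape:.1f}%")
```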

17.
This paper discusses an acceptable approach that the US Nuclear Regulatory Commission staff has proposed for using Probabilistic Risk Assessment in making decisions on changes to the licensing basis of a nuclear power plant. First, the overall philosophy of risk-informed decision-making, and the process framework are described. The philosophy is encapsulated in five principles, one of which states that, if the proposed change leads to an increase in core damage frequency or risk, the increases must be small and consistent with the intent of the Nuclear Regulatory Commission's Safety Goal Policy Statement. The second part of the paper discusses the use of PRA to demonstrate that this principle has been met. The discussion focuses on the acceptance guidelines, and on comparison of the PRA results with those guidelines. The difficulties that arise because of limitations in scope and analytical uncertainties are discussed and approaches to accommodate these difficulties in the decision-making are described.

18.
Operators in nuclear power plants have to acquire information from human system interfaces (HSIs) and the environment in order to create, update, and confirm their understanding of the plant state, as failures of situation assessment may cause wrong decisions for process control and, ultimately, errors of commission in nuclear power plants. A few computational models that can be used to predict and quantify the situation awareness of operators have been suggested. However, these models do not sufficiently consider the human characteristics of nuclear power plant operators. In this paper, we propose a computational model for the situation assessment of nuclear power plant operators using a Bayesian network. This model incorporates human factors that significantly affect operators' situation assessment, such as attention, working memory decay, and the mental model. As the proposed model provides quantitative results of situation assessment and diagnostic performance, we expect that it can be used in the design and evaluation of human system interfaces as well as in the prediction of situation awareness errors in human reliability analysis.

19.
This paper presents a similarity-based approach for prognostics of the Remaining Useful Life (RUL) of a system, i.e. the lifetime remaining between the present and the instant when the system can no longer perform its function. Data from failure dynamic scenarios of the system are used to create a library of reference trajectory patterns to failure. Given a failure scenario developing in the system, the remaining time before failure is predicted by comparing its evolution data, through fuzzy similarity analysis, to the reference trajectory patterns, and by aggregating their times to failure in a weighted sum that accounts for their similarity to the developing pattern. The prediction of the failure time is dynamically updated as time goes by and measurements of signals representative of the system state are collected. The approach thus allows on-line estimation of the RUL. For illustration, a case study is considered regarding the estimation of the RUL in failure scenarios of the Lead Bismuth Eutectic eXperimental Accelerator Driven System (LBE-XADS).
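A minimal sketch of the similarity-weighted aggregation idea, assuming a simple Gaussian membership of the trajectory distance as the fuzzy similarity score: the residual lives of the reference trajectories are combined in a weighted sum. The reference and developing trajectories are synthetic, not LBE-XADS data.

```python
import numpy as np

rng = np.random.default_rng(6)

# Library of reference run-to-failure trajectories (one monitored signal each).
ref_lengths = [90, 110, 130]
references = [np.linspace(0, 1, L) ** 1.5 + rng.normal(0, 0.02, L)
              for L in ref_lengths]

def fuzzy_similarity_rul(observed, references, sigma=0.05):
    """Similarity-weighted RUL estimate from reference trajectory patterns."""
    k = len(observed)
    weights, ruls = [], []
    for ref in references:
        if len(ref) <= k:
            continue                                      # reference failed earlier than this
        d = np.sqrt(np.mean((ref[:k] - observed) ** 2))   # distance over the common window
        mu = np.exp(-(d / sigma) ** 2)                    # fuzzy similarity score
        weights.append(mu)
        ruls.append(len(ref) - k)                         # residual life of the reference
    weights = np.asarray(weights)
    return float(np.dot(weights, ruls) / weights.sum())

# Developing failure scenario observed for 60 time steps.
observed = np.linspace(0, 60 / 120, 60) ** 1.5 + rng.normal(0, 0.02, 60)
print(f"estimated RUL = {fuzzy_similarity_rul(observed, references):.1f} steps")
```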

20.
The most commonly used dose–response models implicitly assume that accumulation of dose is a time-independent process where each pathogen has a fixed risk of initiating infection. Immune particle neutralization of pathogens, however, may create strong time dependence; i.e. temporally clustered pathogens have a better chance of overwhelming the immune particles than pathogen exposures that occur at lower levels for longer periods of time. In environmental transmission systems, we expect different routes of transmission to elicit different dose–timing patterns and thus potentially different realizations of risk. We present a dose–response model that captures time dependence in a manner that incorporates the dynamics of the initial immune response. We then demonstrate the parameter estimation of our model in a dose–response survival analysis using empirical time-series data of inhalational anthrax in monkeys, in which we find slight dose–timing effects. Future dose–response experiments should include varying the time pattern of exposure in addition to varying the total doses delivered. Ultimately, the dynamic dose–response paradigm presented here will improve modelling of environmental transmission systems where different systems have different time patterns of exposure.
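The time-dependent formulation of the paper is not given in the abstract; for background, the sketch below fits the classic time-independent exponential dose–response model (the kind of model being generalised) to hypothetical dose-group data by binomial maximum likelihood.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical dose groups: administered dose, animals exposed, animals infected.
dose = np.array([50.0, 200.0, 1000.0, 5000.0])
exposed = np.array([10, 10, 10, 10])
infected = np.array([1, 3, 7, 10])

def p_inf(dose, r):
    """Exponential dose-response model: each pathogen independently
    initiates infection with probability r."""
    return 1.0 - np.exp(-r * dose)

def neg_loglik(log_r):
    p = np.clip(p_inf(dose, np.exp(log_r)), 1e-12, 1 - 1e-12)
    return -np.sum(infected * np.log(p) + (exposed - infected) * np.log(1 - p))

res = minimize_scalar(neg_loglik, bounds=(-20, 0), method="bounded")
r_hat = np.exp(res.x)
print(f"fitted r = {r_hat:.2e},  ID50 = {np.log(2) / r_hat:.0f} organisms")
```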
