Similar Documents
20 similar documents found (search time: 31 ms)
1.
Asset managers in electricity distribution companies generally recognize the need, and the challenge, of adding structure and a higher degree of formal analysis to increasingly complex asset management decisions. This implies improving present asset management practice by making the best use of available data and expert knowledge, by adopting new methods for risk analysis and decision support, and by finding better ways to document the decisions made. This paper discusses methods for integrating risk analysis and multi-criteria decision support under uncertainty in electricity distribution system asset management. The focus is on how to include the different company objectives and risk analyses in a structured decision framework when deciding how to handle the physical assets of the electricity distribution network. The paper presents an illustrative example of decision support for maintenance and reinvestment strategies based on expert knowledge, simplified risk analyses, and multi-criteria decision analysis under uncertainty.
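The multi-criteria evaluation under uncertainty described above can be illustrated with a small Monte Carlo ranking. This is a hypothetical setup, not the paper's model: each strategy's score on each criterion is elicited from experts as an interval, scores are sampled uniformly, and a weighted sum decides the winner per sample.

```python
import random

def rank_strategies(strategies, weights, n=2000, seed=0):
    """strategies: name -> {criterion: (lo, hi)} interval scores (illustrative
    structure). Returns the probability each strategy has the best weighted-sum
    score when scores are sampled uniformly from their intervals."""
    rng = random.Random(seed)
    wins = {s: 0 for s in strategies}
    for _ in range(n):
        scored = {s: sum(weights[c] * rng.uniform(*iv) for c, iv in crit.items())
                  for s, crit in strategies.items()}
        wins[max(scored, key=scored.get)] += 1
    return {s: w / n for s, w in wins.items()}
```

A strategy that wins in (say) 95% of samples is robustly preferred despite the interval uncertainty; closer splits signal that more elicitation effort is needed.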

2.
Optimization of the testing and maintenance activities performed in the different systems of a complex industrial plant is of great interest, as plant availability and economy strongly depend on the maintenance activities planned. Traditionally, two types of models, i.e. deterministic and probabilistic, have been considered to simulate the impact of testing and maintenance activities on equipment unavailability and the cost involved. Both models present uncertainties that are often categorized as either aleatory or epistemic. The latter apply when there is limited knowledge of the proper model to represent a problem and/or of the values associated with the model parameters, so the results of calculations performed with them incorporate uncertainty. This paper addresses the problem of testing and maintenance optimization based on unavailability and cost criteria, considering epistemic uncertainty in the imperfect maintenance modelling. It is framed as a multiple criteria decision making problem where unavailability and cost act as uncertain and conflicting decision criteria. A tolerance-interval-based approach is used to address uncertainty with regard to the effectiveness parameter and the imperfect maintenance model, embedded within a multiple-objective genetic algorithm. A case of application for a stand-by safety-related system of a nuclear power plant is presented. The results obtained in this application show the importance of considering uncertainties in the modelling of imperfect maintenance, as the optimal solutions found are associated with a large uncertainty that influences the final decision making depending on, for example, whether the decision maker is risk averse or risk neutral.
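Two ingredients of the approach above can be sketched compactly: a tolerance interval computed from samples of an unavailability model with an epistemically uncertain parameter, and a Pareto (non-dominance) filter over (unavailability, cost) candidates. The unavailability model here is a deliberately toy stand-in, not the paper's imperfect-maintenance model.

```python
import random

def unavailability_samples(T, lam_interval, n=1000, seed=0):
    # Epistemic uncertainty: the failure rate lam is only known to lie in an
    # interval, so sample it and propagate it through a toy unavailability
    # model u(T) = lam * T / 2 (illustrative stand-in only).
    rng = random.Random(seed)
    lo, hi = lam_interval
    return [rng.uniform(lo, hi) * T / 2 for _ in range(n)]

def tolerance_interval(samples, coverage=0.90):
    # Empirical two-sided tolerance interval from sorted samples.
    s = sorted(samples)
    k = int((1 - coverage) / 2 * len(s))
    return s[k], s[-k - 1]

def pareto_front(points):
    # Keep non-dominated (unavailability, cost) pairs: smaller is better in both.
    return [p for p in points
            if not any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points)]
```

In a genetic algorithm, the Pareto filter plays the role of the selection step, while the tolerance bounds replace point estimates as the objective values.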

3.
Computational simulation methods have advanced to a point where simulation can contribute substantially in many areas of systems analysis. One research challenge that has accompanied this transition involves the characterization of uncertainty in both computer model inputs and the resulting system response. This article addresses a subset of the ‘challenge problems’ posed in [Challenge problems: uncertainty in system response given uncertain parameters, 2001], where uncertainty or information is specified over intervals of the input parameters and inferences based on the response are required. The emphasis of the article is to describe and illustrate a method for performing tasks associated with this type of modeling ‘economically’, requiring relatively few evaluations of the system to get a precise estimate of the response. This ‘response-modeling approach’ is used to approximate a probability distribution for the system response. The distribution is then used: (1) to make inferences concerning probabilities associated with response intervals and (2) to guide in determining further, informative, system evaluations to perform.
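The inference step, probabilities over response intervals from interval-specified inputs, can be sketched with plain sampling. One reading of "uncertainty specified over intervals" is uniform sampling within each interval; that is an assumption here, not the article's response-modeling surrogate, which is specifically designed to need far fewer model evaluations.

```python
import bisect
import random

def response_cdf(f, intervals, n=5000, seed=0):
    # Sample each input uniformly over its interval (an assumption; other
    # sampling schemes are possible) and collect the sorted model responses
    # as an empirical distribution.
    rng = random.Random(seed)
    return sorted(f(*[rng.uniform(a, b) for a, b in intervals])
                  for _ in range(n))

def prob_in_interval(sorted_responses, lo, hi):
    # P(lo <= response <= hi) read off the empirical CDF.
    left = bisect.bisect_left(sorted_responses, lo)
    right = bisect.bisect_right(sorted_responses, hi)
    return (right - left) / len(sorted_responses)
```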

4.
5.
The challenge problems for the Epistemic Uncertainty Workshop at Sandia National Laboratories provide common ground for comparing different mathematical theories of uncertainty, referred to as General Information Theories (GITs). These problems also present the opportunity to discuss the use of expert knowledge as an important constituent of uncertainty quantification. More specifically, how do the principles and methods of eliciting and analyzing expert knowledge apply to these problems and similar ones encountered in complex technical problem solving and decision making? We will address this question, demonstrating how the elicitation issues and the knowledge that experts provide can be used to assess the uncertainty in outputs that emerge from a black box model or computational code represented by the challenge problems. In our experience, the rich collection of GITs provides an opportunity to capture the experts' knowledge and associated uncertainties consistent with their thinking, problem solving, and problem representation. The elicitation process is rightly treated as part of an overall analytical approach, and the information elicited is not simply a source of data. In this paper, we detail how the elicitation process itself impacts the analyst's ability to represent, aggregate, and propagate uncertainty, as well as how to interpret uncertainties in outputs. While this approach does not advocate a specific GIT, answers under uncertainty do result from the elicitation.

6.
A D-S evidence decision method based on set attributes and priority degree
To reduce the risk of making decisions from combined D-S evidence, an evidence decision method based on set attributes and priority degree is proposed, decomposing the decision problem into two levels: constructing refined belief intervals and comparing them. At the construction level, an uncertainty measure over sets and an attribute support degree between focal elements are introduced to obtain refined belief interval values for the elementary propositions; at the comparison level, a priority degree is introduced to evaluate the refined belief interval of each proposition, and an evidence decision model is then built on the resulting priority ranking. The distinctive feature of the method is that interval values serve as the basis for decisions, making full use of the information contained in the belief intervals. Experimental analysis shows that the method is reasonable and effective, and that it overcomes the wrong-decision and no-decision problems of other single-point-value D-S evidence decision methods.
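The machinery this abstract builds on, Dempster's rule of combination plus belief/plausibility bounds, can be sketched as follows. This is the standard D-S construction only; the paper's refined intervals, attribute support degrees, and priority-degree ranking are not reproduced here.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule for two mass functions over frozenset focal elements.
    Assumes the two bodies of evidence are not totally conflicting."""
    combined, conflict = {}, 0.0
    for (a, p), (b, q) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + p * q
        else:
            conflict += p * q  # mass assigned to the empty set, renormalized away
    return {s: v / (1 - conflict) for s, v in combined.items()}

def belief_interval(m, hypothesis):
    # Bel: mass of focal sets entirely inside the hypothesis.
    # Pl:  mass of focal sets that merely intersect it.
    bel = sum(v for s, v in m.items() if s <= hypothesis)
    pl = sum(v for s, v in m.items() if s & hypothesis)
    return bel, pl
```

Deciding on [Bel, Pl] intervals rather than a single fused value is exactly what lets a method refuse to decide when the intervals of rival propositions overlap too much.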

7.
In practice, risk and uncertainty are essentially unavoidable in many regulation processes. Regulators frequently face a risk–benefit trade-off, since zero risk is neither practicable nor affordable. Although it is accepted that cost–benefit analysis is important in many scenarios of risk management, what role it should play in a decision process is still controversial. One criticism of cost–benefit analysis is that decision makers should consider marginal benefits and costs, not present ones, in their decision making. In this paper, we investigate the problem of regulatory decision making under risk by applying expected utility theory and present a new approach to cost–benefit analysis. By directly taking into consideration the reduction of risk, this approach achieves a marginal cost–benefit analysis. Applying it, the optimal regulatory decision that maximizes the marginal benefit of risk reduction can be identified. This provides a transparent and reasonable criterion for stakeholders involved in the regulatory activity. An example of evaluating seismic retrofitting alternatives demonstrates the potential of the proposed approach.
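The core selection rule, pick the alternative maximizing the marginal benefit of risk reduction net of cost, can be sketched in a few lines. The linear monetization of risk reduction and the data layout are illustrative assumptions, not the paper's expected-utility formulation.

```python
def best_alternative(alternatives, value_per_unit_risk):
    # alternatives: (name, cost, residual_risk) tuples; the do-nothing option
    # (cost 0, highest residual risk) is assumed to be included. The criterion
    # is marginal: monetized risk reduction relative to doing nothing, minus cost.
    baseline = max(r for _, _, r in alternatives)
    def net_marginal_benefit(alt):
        _, cost, residual = alt
        return value_per_unit_risk * (baseline - residual) - cost
    return max(alternatives, key=net_marginal_benefit)
```

Note how the preferred choice shifts with the value placed on risk reduction, which is precisely the transparency the marginal framing is meant to buy.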

8.
The decision as to whether a contaminated site poses a threat to human health and should be cleaned up relies increasingly upon the use of risk assessment models. However, the more sophisticated risk assessment models become, the greater the concern with the uncertainty in, and thus the credibility of, risk assessment. In particular, when there are several equally plausible models, decision makers are confused by model uncertainty and perplexed as to which model should be chosen for making decisions objectively. When the correctness of different models is not easily judged after objective analysis has been conducted, the cost incurred during the processes of risk assessment has to be considered in order to make an efficient decision. In order to support an efficient and objective remediation decision, this study develops a methodology to cost the least required reduction of uncertainty and to use the cost measure in the selection of candidate models. The focus is on identifying the efforts involved in reducing the input uncertainty to the point at which the uncertainty would not hinder the decision in each equally plausible model. First, this methodology combines a nested Monte Carlo simulation, rank correlation coefficients, and explicit decision criteria to identify key uncertain inputs that would influence the decision in order to reduce input uncertainty. This methodology then calculates the cost of required reduction of input uncertainty in each model by convergence ratio, which measures the needed convergence level of each key input's spread. Finally, the most appropriate model can be selected based on the convergence ratio and cost. A case of a contaminated site is used to demonstrate the methodology.
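The rank-correlation screening step, ranking inputs by how strongly their sampled values track the model response, can be sketched with a plain Spearman coefficient. This minimal version assumes no tied values (no midrank handling) and is not the paper's nested Monte Carlo procedure.

```python
def ranks(xs):
    # Position of each value in sorted order (assumes no ties).
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for rank, i in enumerate(order):
        r[i] = rank
    return r

def spearman(xs, ys):
    # Spearman rho = Pearson correlation of the ranks; for untied data the
    # two rank vectors have equal variance, so rho = cov / var.
    rx, ry = ranks(xs), ranks(ys)
    mean = (len(xs) - 1) / 2
    cov = sum((a - mean) * (b - mean) for a, b in zip(rx, ry))
    var = sum((a - mean) ** 2 for a in rx)
    return cov / var

def key_inputs(samples, response, top=3):
    # samples: input name -> list of sampled values; rank inputs by |rho|.
    scored = sorted(((abs(spearman(v, response)), k)
                     for k, v in samples.items()), reverse=True)
    return [k for _, k in scored[:top]]
```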

9.
This paper addresses the concept of model uncertainty within the context of risk analysis. Though model uncertainty is a topic widely discussed in the risk analysis literature, no consensus seems to exist on its meaning, how it should be measured, or its impact on the application of analysis results in decision processes. The purpose of this paper is to contribute to clarification. The first parts of the paper look into the contents of the two terms ‘model’ and ‘uncertainty’. On this platform it is discussed how focus on model uncertainty merely leads to muddling up the message of the analysis, if risk is interpreted as a true, inherent property of the system, to be estimated in the risk analysis. An alternative approach is to see the models as means for expressing uncertainty regarding the system performance. In this case, it is argued, the term ‘model uncertainty’ loses its meaning.

10.
This paper focuses on manufacturing environments where job processing times are uncertain. In these settings, scheduling decision makers are exposed to the risk that an optimal schedule with respect to a deterministic or stochastic model will perform poorly when evaluated relative to actual processing times. Since the quality of scheduling decisions is frequently judged as if processing times were known a priori, robust scheduling, i.e., determining a schedule whose performance (compared to the associated optimal schedule) is relatively insensitive to the potential realizations of job processing times, provides a reasonable mechanism for hedging against the prevailing processing time uncertainty. In this paper we focus on a two-machine flow shop environment in which the processing times of jobs are uncertain and the performance measure of interest is system makespan. We present a measure of schedule robustness that explicitly considers the risk of poor system performance over all potential realizations of job processing times. We discuss two alternative frameworks for structuring processing time uncertainty. For each case, we define the robust scheduling problem, establish problem complexity, discuss properties of robust schedules, and develop exact and heuristic solution approaches. Computational results indicate that robust schedules provide effective hedges against processing time uncertainty while maintaining excellent expected makespan performance.
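The robustness measure described, performance relative to each scenario's optimum across realizations, corresponds to a min–max regret criterion, which can be sketched for tiny instances by enumeration. The scenario-set structure and brute-force search are illustrative assumptions; the paper develops exact and heuristic approaches rather than enumeration (for a single deterministic scenario, Johnson's rule would solve the two-machine case directly).

```python
from itertools import permutations

def makespan(seq, p):
    # Two-machine permutation flow shop: p[j] = (time on M1, time on M2).
    t1 = t2 = 0
    for j in seq:
        t1 += p[j][0]
        t2 = max(t2, t1) + p[j][1]
    return t2

def minmax_regret_schedule(n, scenarios):
    """scenarios: list of processing-time dicts j -> (p1, p2). Returns the
    sequence minimizing the worst-case regret (makespan minus that scenario's
    optimal makespan) over all scenarios."""
    seqs = list(permutations(range(n)))
    opt = [min(makespan(s, sc) for s in seqs) for sc in scenarios]
    def worst_regret(s):
        return max(makespan(s, sc) - o for sc, o in zip(scenarios, opt))
    return min(seqs, key=worst_regret)
```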

11.
This paper discusses several quantitative issues that arise in the analysis of health risks, beginning with principles such as de minimis and zero risk. The paper also provides a probabilistic definition of risk in terms of hazard, context, consequence, magnitude, and uncertainty. The example relies on this definition to investigate, through sensitivity analysis, the effect that uncertainty has on the results obtained. The results, from a case study based on waterborne total arsenic, show that the choice of dose–response function causes more uncertainty than any other component of risk analysis. Chemical carcinogenesis provides the basis for discussing inability to know as well as uncertainty. The conclusion is that risk analysis keeps uncertainty and inability to know separate; through this function, it provides a much-needed method for presenting information to decision makers and the public.

12.
This paper compares Evidence Theory (ET) and Bayesian Theory (BT) for uncertainty modeling and decision under uncertainty, when the evidence about uncertainty is imprecise. The basic concepts of ET and BT are introduced and the ways these theories model uncertainties, propagate them through systems and assess the safety of these systems are presented. ET and BT approaches are demonstrated and compared on challenge problems involving an algebraic function whose input variables are uncertain. The evidence about the input variables consists of intervals provided by experts. It is recommended that a decision-maker compute both the Bayesian probabilities of the outcomes of alternative actions and their plausibility and belief measures when evidence about uncertainty is imprecise, because this helps assess the importance of imprecision and the value of additional information. Finally, the paper presents and demonstrates a method for testing approaches for decision under uncertainty in terms of their effectiveness in making decisions.  相似文献   

13.
赵涛, 宗玛利. 《工业工程》 (Industrial Engineering), 2012, 15(5): 105-111
Supply chain option contracts are an important way to cope with market demand uncertainty, yet the option price itself introduces a new risk into the supply chain. To address risk sharing in supply chain option contracts, a method is proposed in which the option price is negotiated according to bargaining power so as to share the risk. Under market uncertainty, for a supply chain consisting of a single manufacturer and a single retailer, a bargaining-power-based risk-sharing model for the supply chain option contract is established, and the influence of bargaining power on the supply chain's order quantity, production planning, and expected profit is analyzed. The study finds that an option contract can raise the expected profit of every supply chain member; as the manufacturer's bargaining power grows, the retailer's firm-order quantity increases and the number of options decreases, and the manufacturer's bargaining power lowers the total expected profit of the supply chain. Numerical simulation further verifies the effectiveness of sharing option-contract risk through negotiation and yields conclusions that offer guidance for setting supply chain option prices.
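The retailer's side of such a contract can be sketched as a newsvendor-style expected-profit calculation. All notation here (firm order, option reservation, exercise price) is an assumed generic option-contract structure, not the paper's exact model or its bargaining-power terms.

```python
import random

def retailer_expected_profit(q_firm, q_opt, price, wholesale, opt_price,
                             exec_price, demand_sampler, n=4000, seed=0):
    """Sketch of a retailer under an option contract: buy q_firm units firm at
    the wholesale price, reserve q_opt options at opt_price each, and exercise
    only what realized demand requires at exec_price per unit."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        d = demand_sampler(rng)
        exercised = min(max(d - q_firm, 0), q_opt)  # options cover excess demand
        sales = min(d, q_firm + exercised)
        total += (price * sales - wholesale * q_firm
                  - opt_price * q_opt - exec_price * exercised)
    return total / n
```

Sweeping opt_price in such a simulation is one way to see how the negotiated option price shifts expected profit between the two parties.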

14.
The Epistemic Uncertainty Project of Sandia National Laboratories (NM, USA) proposed two challenge problems intended to assess the applicability and the relevant merits of modern mathematical theories of uncertainty in reliability engineering and risk analysis. This paper proposes a solution to Problem B: the response of a mechanical system with uncertain parameters. Random Set Theory is used to cope with both imprecision and dissonance affecting the available information. Imprecision results in an envelope of CDFs of the system response bounded by an upper CDF and a lower CDF. Different types of parameter discretizations are introduced. It is shown that: (i) when the system response presents extrema in the range of parameters considered, it is better to increase the fineness of the discretization than to invoke a global optimization tool; (ii) the response expectation differed by less than 0.5% when the number of function calls was increased 15.7 times; (iii) larger differences (4–5%) were obtained for the lower tails of the CDFs of the response. Further research is necessary to investigate (i) parameter discretizations aimed at increasing the accuracy of the CDFs (lower) tails; (ii) the role of correlation in combining information.
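The upper/lower CDF envelope that random-set imprecision produces can be sketched directly: given focal intervals of the response with their masses, the upper CDF counts mass that could fall below a threshold, and the lower CDF counts mass that must. The interval/mass data layout is an illustrative assumption.

```python
def cdf_bounds(focal, x):
    """focal: list of ((lo, hi), mass) response intervals. Returns the
    (lower CDF, upper CDF) pair at x: lower counts focal intervals entirely
    at or below x (hi <= x); upper counts those that could be (lo <= x)."""
    upper = sum(m for (lo, hi), m in focal if lo <= x)
    lower = sum(m for (lo, hi), m in focal if hi <= x)
    return lower, upper
```

Plotting both bounds over a grid of x values yields the CDF envelope (a p-box); the gap between the curves is the imprecision the paper's discretization refinement tries to tighten, especially in the lower tail.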

15.
Probabilistic risk analysis (PRA) methods have proven valuable in risk and reliability analysis. However, a weak link seems to exist between methods for analysing risks and those for making rational decisions. The integrated decision support system (IDSS) methodology presented in this paper attempts to address this issue in a practical manner. It consists of three phases: a PRA phase, a risk sensitivity analysis (SA) phase and an optimisation phase, which are implemented through an integrated computer software system. In the risk analysis phase the problem is analysed by the Boolean representation method (BRM), a PRA method that can deal with systems with multiple state variables and feedback loops. In the second phase the results obtained from the BRM are used directly to perform importance and risk sensitivity analysis. In the third phase the problem is formulated as a multiple-objective decision making problem in the form of multiple-objective reliability optimisation. An industrial example is included, and the resultant solutions of a five-objective reliability optimisation are presented, on the basis of which rational decision making can be explored.

16.
A robust design optimization (RDO) approach is proposed for minimum-weight, safe shell composite structures whose design constraints exhibit minimal variability under uncertainty. A new concept of feasibility robustness, associated with the variability of the design constraints, is considered: feasibility robustness is defined through the determinant of the variance–covariance matrix of the constraint functions, thereby capturing the joint effect of uncertainty propagation on the structural response. A new framework that introduces aleatory uncertainty into the RDO of composite structures is proposed, in which three classes of variables and parameters are identified: deterministic design variables, random design variables, and random parameters. The bi-objective optimization search is performed using a new approach based on two levels of dominance, the Co-Dominance-based Genetic Algorithm (CoDGA); the combination of evolutionary concepts with sensitivity analysis based on the adjoint variable method is also a new proposal. Examples with different sources of uncertainty show that the Pareto front depends on which random design variables and/or random parameters are considered in the RDO, and they demonstrate the importance of controlling the effect of uncertainties on the feasibility of the constraints. The CoDGA approach is a powerful tool for helping designers make decisions by establishing priorities between performance and robustness.
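The feasibility-robustness measure named above, the determinant of the variance–covariance matrix of the constraint functions, can be estimated by Monte Carlo. This sketch handles the 2x2 case for brevity, with assumed toy constraint functions and samplers rather than a composite-shell model.

```python
import random

def constraint_covariance_det(constraints, param_samplers, n=3000, seed=0):
    """Monte Carlo estimate of det(cov) for two constraint functions g1, g2
    evaluated at sampled random parameters (2x2 case only)."""
    rng = random.Random(seed)
    g1s, g2s = [], []
    for _ in range(n):
        x = [sampler(rng) for sampler in param_samplers]
        g1s.append(constraints[0](x))
        g2s.append(constraints[1](x))
    m1, m2 = sum(g1s) / n, sum(g2s) / n
    v1 = sum((a - m1) ** 2 for a in g1s) / n
    v2 = sum((b - m2) ** 2 for b in g2s) / n
    cov = sum((a - m1) * (b - m2) for a, b in zip(g1s, g2s)) / n
    return v1 * v2 - cov * cov  # determinant of the 2x2 covariance matrix
```

A smaller determinant means the constraints vary less (jointly) under uncertainty, which is what the robustness objective rewards.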

17.
The industrial product-service system for Computer Numerical Control machine tool (mt-iPSS) has drawn much interest. Under the new paradigm of functional result-oriented mt-iPSS, mt-iPSS customer (i.e. owner of the workshop) pays for time or results of mt-iPSS providers. The present problem for mt-iPSS customer is how to timely identify the optimal machine tools, sequence and cutting parameters of operation to finish the jobs while mt-iPSS providers try to maximise their benefit in a non-cooperative game structure. In this paper, a Stackelberg game model is put forward to solve the coordination problem based on the costing of different job shop scheduling solutions under the result-oriented mt-iPSS paradigm. Then, to solve the established bi-level programming model of the Stackelberg game, a solution procedure based on hierarchical particle swarm optimisation is proposed. Finally, a case from a printing machinery enterprise is analysed to validate the proposed model. This research is expected to improve the quality and effectiveness of coordination for scheduling and process planning decision between mt-iPSS customer and multi-providers.  相似文献   

18.
Uncertainty modeling and decision support
We first formulate the problem of decision making under uncertainty. The importance of the representation of our knowledge about the uncertainty in formulating a decision process is pointed out. We begin with a brief discussion of the case of probabilistic uncertainty. Next, in considerable detail, we discuss the case of decision making under ignorance. For this case the fundamental role of the attitude of the decision maker is noted and its subjective nature is emphasized. Next the case in which a Dempster–Shafer belief structure is used to model our knowledge of the uncertainty is considered. Here we also emphasize the subjective choices the decision maker must make in formulating a decision function. The case in which the uncertainty is represented by a fuzzy measure (monotonic set function) is then investigated. We then return to the Dempster–Shafer belief structure and show its relationship to the fuzzy measure. This relationship allows us to get a deeper understanding of the formulation of the decision function used in the Dempster–Shafer framework. We discuss how this deeper understanding allows a decision analyst to better make the subjective choices needed in the formulation of the decision function.
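The role of the decision maker's attitude under ignorance can be made concrete with the classical criteria: maximin (pessimistic), maximax (optimistic), and the Hurwicz criterion that interpolates between them. These standard criteria illustrate the subjectivity the abstract emphasizes; they are not the paper's fuzzy-measure formulation.

```python
def maximin(payoffs):
    # Pessimistic: pick the action whose worst outcome is best.
    return max(payoffs, key=lambda a: min(payoffs[a]))

def maximax(payoffs):
    # Optimistic: pick the action whose best outcome is best.
    return max(payoffs, key=lambda a: max(payoffs[a]))

def hurwicz(payoffs, alpha):
    # alpha in [0, 1] encodes the decision maker's attitude:
    # 1 recovers maximax, 0 recovers maximin.
    return max(payoffs, key=lambda a: alpha * max(payoffs[a])
                                      + (1 - alpha) * min(payoffs[a]))
```

That the recommended action flips with alpha, with no change in the payoff data, is exactly the subjective choice the decision analyst must elicit.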

19.
Strategic technology investment under uncertainty
In this paper the technology investment decision of a firm is analyzed, while competition on the output market is explicitly taken into account. Technology choice is irreversible, and the firms face a stochastic innovation process with uncertainty about the speed of arrival of new technologies. The innovation process is exogenous to the firms. Owing to market saturation and the fact that more modern technologies are invented as time passes, the demand for a given technology decreases over time; this implies that the sunk investment cost of each technology also decreases over time. The investment decision problem is transformed into a timing game, in which the waiting curve is introduced as a new concept. An algorithm is designed for solving this more general timing game and is applied to an information technology investment problem. The most likely outcome exhibits diffusion with equal payoffs for the firms. Received: December 16, 1999 / Accepted: February 7, 2001

20.
Enterprise risk management (ERM) has emerged as the new paradigm in risk management, with the goal of holistically managing all risks facing an enterprise. Yet organizations still manage risks in a piecemeal fashion and struggle to effectively implement ERM and manage complex strategic risks. This article proposes a solution to this problem: ERM implementation using a system dynamics approach, which enables integrating risks in a causal modeling environment that includes feedback and delays. The methodology is then described using the ISO 31000 Risk Management Standard and illustrated using an example.
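The feedback-and-delay modeling the article describes can be sketched as a tiny stock-and-flow simulation: a "risk exposure" stock drained by mitigation and fed by a delayed reinforcing loop. All structure and parameter values here are illustrative toys, not taken from the article or from ISO 31000.

```python
def simulate_risk(steps, dt=1.0, exposure0=10.0, mitigation_rate=0.3,
                  feedback_gain=0.05, delay=3):
    """Euler-integrate one stock: exposure is reduced by mitigation
    (proportional outflow) and increased by incidents driven by *past*
    exposure (delayed reinforcing feedback)."""
    exposure, history = exposure0, []
    for t in range(steps):
        history.append(exposure)
        lagged = history[t - delay] if t >= delay else exposure0
        inflow = feedback_gain * lagged      # delayed feedback loop
        outflow = mitigation_rate * exposure  # mitigation effort
        exposure += dt * (inflow - outflow)
    return history
```

Even this toy shows the qualitative point: because the feedback acts with a delay, the effect of a mitigation policy only becomes visible in the exposure trajectory several periods later.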


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号