Similar Documents
20 similar documents found
1.
Karl Mosler, OR Spectrum, 1991, 13(2): 87-94
Summary The expected utility analysis of decision under risk needs information on the alternatives and on the decision maker's preferences, which in many practical situations is difficult to obtain. This paper presents a procedure for choosing between multiattribute risky alternatives when the probabilities of outcomes are known, the utility function is general multilinear (i.e., can be decomposed into sums and products of univariate utility functions), and there is some partial information on the univariate utilities (viz. increasingness) and arbitrary partial information on the scaling coefficients. Pairwise comparisons in the set of alternatives yield a subset which is efficient under the given partial information. Additive and multiplicative utility functions are special cases of the multilinear one. The paper gives particular attention to linear partial information (LPI) on coefficients, which is obtained by standard assessment procedures. The approach can be combined with dominance procedures that use other partial information, such as LPI on probabilities.
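For intuition, the efficiency check under one common LPI structure (ranked scaling coefficients in an additive utility, a special case of the paper's general multilinear setting) reduces to evaluating the utility difference at the extreme points of the weight set. A minimal sketch with hypothetical utility vectors, not the paper's procedure:

```python
# Sketch: pairwise dominance check for additive utilities under the ordering
# constraint k1 >= k2 >= ... >= kn >= 0, sum(k) = 1 -- a simple instance of
# linear partial information (LPI). The extreme points of this weight set are
# (1,0,...,0), (1/2,1/2,0,...), ..., (1/n,...,1/n), so A dominates B iff the
# utility difference is nonnegative at every extreme point.
def lpi_dominates(uA, uB, tol=1e-12):
    n = len(uA)
    diff = [a - b for a, b in zip(uA, uB)]
    for m in range(1, n + 1):
        # extreme point: first m weights equal 1/m, the rest zero
        if sum(diff[:m]) / m < -tol:
            return False
    return True

# A beats B on the most important attribute and ties elsewhere: dominance holds.
print(lpi_dominates([0.9, 0.5, 0.4], [0.6, 0.5, 0.4]))  # True
# Here the answer depends on the unknown weights, so no dominance.
print(lpi_dominates([0.6, 0.9, 0.4], [0.7, 0.5, 0.4]))  # False
```

Alternatives surviving all such pairwise checks form the efficient subset under the partial information.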

2.
This paper evaluates the Pugh Controlled Convergence method and its relationship to recent developments in design theory. Computer-executable models are proposed simulating a team of people involved in iterated cycles of evaluation, ideation, and investigation. The models suggest that: (1) convergence of the set of design concepts is facilitated by the selection of a strong datum concept; (2) iterated use of an evaluation matrix can facilitate convergence of expert opinion, especially if used to plan investigations conducted between matrix runs; and (3) ideation stimulated by the Pugh matrices can provide large benefits both by improving the set of alternatives and by facilitating convergence. As a basis of comparison, alternatives to Pugh's methods, such as using a single summary criterion or a Borda count, were also assessed. These models suggest that Pugh's method, under a substantial range of assumptions, results in better design outcomes than those alternative procedures.
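The Borda count used as a comparison baseline can be sketched in a few lines (a generic implementation, not the authors' simulation code):

```python
def borda_count(rankings):
    """Aggregate ranked lists of alternatives: with n alternatives, a first
    place is worth n-1 points, second place n-2, and so on; alternatives are
    returned ordered by total points, best first."""
    n = len(rankings[0])
    scores = {alt: 0 for alt in rankings[0]}
    for ranking in rankings:
        for place, alt in enumerate(ranking):
            scores[alt] += n - 1 - place
    return sorted(scores, key=scores.get, reverse=True)

# Three evaluators rank design concepts A, B, C from best to worst.
result = borda_count([["A", "B", "C"], ["B", "A", "C"], ["A", "C", "B"]])
print(result)  # ['A', 'B', 'C']
```

Unlike the Pugh matrix, the Borda count collapses all criteria into a single ranking per evaluator before aggregating, which is one reason the two methods can converge to different concepts.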

3.
In this paper a practical approach to design, based on the concept of selection, is presented. The approach involves: first, the generation of alternative concepts using ideation techniques; second, the selection of the ‘most-likely-to-succeed’ concepts for further development into feasible alternatives; third, the formulation and solution of selection-decision-support problems to rank the feasible alternatives in order of preference using multiple attributes. The method presented in this paper is a combination of the methods proposed by Pugh and by Mistree and Muster. The former method is appropriate for use in concept selection, which is characterized by many alternatives and essentially insight-based ‘soft’ information. The latter method is appropriate when there are few alternatives and a mix of science-based ‘hard’ and insight-based ‘soft’ information. The method presented by Mistree and Muster is therefore used to formulate and solve the selection-decision-support problem. The design example used in this paper is a modified version of that used by Pugh.

4.
The evaluation of the probabilistic constraints in reliability-based design optimization (RBDO) problems has always been significant and challenging work, which strongly affects the performance of RBDO methods. This article deals with RBDO problems using a recently developed generalized subset simulation (GSS) method and a posterior approximation approach. The posterior approximation approach is used to transform all the probabilistic constraints into ordinary constraints as in deterministic optimization. The assessment of multiple failure probabilities required by the posterior approximation approach is achieved by GSS in a single run at all supporting points, which are selected by a proper experimental design scheme combining Sobol’ sequences and Bucher’s design. Subsequently, the transformed deterministic design optimization problem can be solved by optimization algorithms, for example the sequential quadratic programming method. Three optimization problems are used to demonstrate the efficiency and accuracy of the proposed method.

5.
Random set theory provides a convenient mechanism for representing uncertain knowledge including probabilistic and set-based information, and extending it through a function. This paper focuses upon the situation when the available information is in terms of coherent lower and upper probabilities, which are encountered, for example, when a probability distribution is specified by interval parameters. We propose an Iterative Rescaling Method (IRM) for constructing a random set with corresponding belief and plausibility measures that are a close outer approximation to the lower and upper probabilities. The approach is compared with the discrete approximation method of Williamson and Downs (sometimes referred to as the p-box), which generates a closer approximation to lower and upper cumulative probability distributions but in most cases a less accurate approximation to the lower and upper probabilities on the remainder of the power set. Four combination methods are compared by application to example random sets generated using the IRM.

6.
TEST, 1996, 5(1): 1-60
Summary In Bayesian inference and decision analysis, inferences and predictions are inherently probabilistic in nature. Scoring rules, which involve the computation of a score based on probability forecasts and what actually occurs, can be used to evaluate probabilities and to provide appropriate incentives for “good” probabilities. This paper reviews scoring rules and some related measures for evaluating probabilities, including decompositions of scoring rules, attributes of “goodness” of probabilities, comparability of scores, and the design of scoring rules for specific inferential and decision-making problems. Read before the Spanish Statistical Society at a meeting organized by the Universitat de València on Tuesday, April 23, 1996.
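As an example of such a rule, the quadratic (Brier) score for binary events is one of the most common strictly proper scoring rules:

```python
def brier_score(forecast, outcome):
    """Quadratic (Brier) scoring rule for a probability forecast of a binary
    event (outcome 1 if it occurred, 0 otherwise). Lower is better; the rule
    is strictly proper, so the expected score is minimized by reporting one's
    true probability."""
    return (forecast - outcome) ** 2

# An 0.8 forecast is rewarded when the event occurs and penalized when not.
print(round(brier_score(0.8, 1), 4))  # 0.04
print(round(brier_score(0.8, 0), 4))  # 0.64
```

Properness is the "appropriate incentive" the abstract refers to: a forecaster who believes the probability is 0.8 cannot improve the expected score by reporting any other number.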

7.
8.
This article proposes a reliability‐based design optimization methodology that incorporates probabilistic degradation of the fatigue resistance of the material. The probabilistic damage accumulation is treated as a measure of degradation in the fatigue resistance of the material and modeled as a nonstationary probabilistic process to capture the time‐dependent distribution parameters of damage accumulation. The proposed probabilistic damage accumulation model is then incorporated into a reliability‐based design optimization model by building a dynamic reliability model inferred from the stress–strength interference model. The proposed approach makes it possible to capture the dynamic degradation behavior while optimizing design variables at an early design stage to improve the overall reliability of the product. The applicability of the proposed approach is demonstrated using suitable examples. Copyright © 2012 John Wiley & Sons, Ltd.

9.
A very general and robust approach to solving optimization problems involving probabilistic uncertainty is the use of Probabilistic Ordinal Optimization. At each step in the optimization problem, improvement is based only on a relative ranking of the probabilistic merits of local design alternatives, rather than on precise quantification of the alternatives. Thus, we simply ask the question “Is that alternative better or worse than this one?” to some required level of statistical confidence, not “How much better or worse is that alternative than this one?”. In this paper we illustrate an elementary application of probabilistic ordinal concepts in a 2-D optimization problem. Two uncertain variables contribute to uncertainty in the response function. We use a simple Coordinate Pattern Search non-gradient-based optimizer to step toward the statistical optimum in the design space. We also discuss more sophisticated implementations, and some of the advantages and disadvantages relative to other approaches to optimization under uncertainty.
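The ordinal question "is A better than B, to some confidence?" can be illustrated with a simple bootstrap comparison of two response samples. This is a stand-in for the idea, not the authors' procedure; the `confidence` and `trials` parameters and the minimize-the-response convention are assumptions:

```python
import random

def ordinally_better(sample_a, sample_b, confidence=0.95, trials=1000):
    """Sketch of a probabilistic ordinal comparison: resample the two response
    samples with replacement and report whether A's mean beats B's mean (lower
    is better) in at least `confidence` of the resampled trials. Only the
    relative ranking matters, not the size of the gap."""
    wins = 0
    for _ in range(trials):
        mean_a = sum(random.choices(sample_a, k=len(sample_a))) / len(sample_a)
        mean_b = sum(random.choices(sample_b, k=len(sample_b))) / len(sample_b)
        if mean_a < mean_b:
            wins += 1
    return wins / trials >= confidence

random.seed(0)
a = [random.gauss(1.0, 0.2) for _ in range(50)]  # responses of design A
b = [random.gauss(2.0, 0.2) for _ in range(50)]  # responses of design B
print(ordinally_better(a, b))  # True: A is clearly the better (lower) design
```

A pattern-search optimizer can then accept a candidate step only when this ordinal test favors it, avoiding expensive precise estimation of each alternative's statistics.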

10.
Great efforts have recently been made to increase the environmental performance of industrial products. However, to obtain more sustainable solutions, environmental properties must be reconciled with customer requirements. This paper introduces a novel approach for identifying environmental improvement options that takes customer preferences into account. The Life Cycle Assessment methodology is applied to evaluate the environmental profile of a product, using Eco-indicator 99 as the impact assessment method. A fuzzy approach based on the House of Quality in the Quality Function Deployment methodology provides a more quantitative way of handling the imprecision of customer preferences. Experimental membership functions have been obtained directly from users and design team members through an opinion poll. The proposed methodology has been applied to an office table, identifying three components where the environmental performance of the table could be improved while taking customer opinions into account.

11.
An integrated group decision-making approach to quality function deployment
Quality Function Deployment (QFD) is a multi-disciplinary team process in which team member preferences are often in conflict with respect to varied individual objectives. Successful applications of QFD thus rely on: (1) effective communication among team members to reach a consensus; (2) assigning importance levels that reflect each individual member's preferences; and (3) mutual interaction of these two factors. No previous paper in the QFD literature has attempted to aggregate team members' opinions in the case where each individual has his or her own criteria. In this study, we consider both agreed criteria, if any, and individual criteria simultaneously, whereas AHP, MAUT, and others are based only on an agreed set of criteria. Specifically, we modify the nominal group technique to obtain customer requirements, and integrate agreed- and individual-criteria methods to assign customers' importance levels in general situations where some members of a team share an agreed criteria set while others prefer individual criteria sets. By using voting and linear programming techniques, the proposed approaches consolidate individual preferences into a group consensus in situations starting with or without (partially) agreed criteria sets. This integrated group decision-making system minimizes inconsistency between group and individual preferences and provides a preference ordering of the alternatives through iterative communication and the resolution of any inconsistencies that exist between the group and individuals, and among the individuals themselves.

12.
The selection of the best compromise alternative for treating a product at its end of life (EOL) is presented. Each EOL alternative has its own consequences from an economic, environmental and social point of view. The criteria used to determine these consequences are often contradictory and not equally important. In the presence of multiple conflicting criteria, an optimal EOL alternative rarely exists. Hence, the decision-maker should seek the best compromise EOL alternative. The present paper proposes a multicriteria decision-aid (MCDA) approach to aid the decision-maker in selecting the best compromise EOL alternative on the basis of his/her preferences and the performances of the EOL alternatives with respect to the relevant environmental, social and economic criteria. This approach is important because it allows the user to consider various conflicting criteria simultaneously and takes his/her preferences into account. The paper analyses the most important aspects of this approach, such as the constitution of a set of EOL alternatives, the selection of a list of relevant criteria to evaluate the EOL alternatives, and the choice of an appropriate multicriteria decision-aid method. A case study is provided to illustrate how the proposed approach can be used for product EOL alternative selection in real-world applications.

13.
In this paper, a new layered cellular manufacturing system is proposed that forms dedicated, shared and remainder cells to deal with probabilistic demand; its performance is then compared with the classical cellular manufacturing system. In the layered cellular design, each family may need more than one cell to cover its capacity requirements. The proposed approach for layered cellular design involves five stages: (1) product clustering; (2) identifying the number of cells and demand coverage probabilities; (3) determining cell types using the proposed heuristic procedure; (4) performing simulation to determine operating conditions; and (5) statistical analysis to pick the best configuration among the layered cellular designs. Simulation and statistical analysis help identify the best design within and between the layered and classical cellular designs. It was observed that as the number of part families increased, the number of machines needed to process the parts first decreased, then began to increase again as the number of part families continued to grow. Another observation was that the average flow time and total WIP were not always lowest when the system used additional machines. The last and most important observation was that the layered cellular system provided much better results than the classical cellular system when demand fluctuation was high.

14.
An entropy-based method of analysis is proposed for the heart rhythm diagram, consisting of a transition from probabilistic categories to the diagram's macro-parameters. These macro-parameters include the quantity of information, the standard deviation, the information entropy, and the excess of entropy production. It is found that the heart rhythm diagram accumulates information linearly. A distribution diagram of states and a distribution diagram of the scale of states are constructed. The regulation–heart–adaptation system includes mechanisms of control, regulation, and management.
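One plausible reading of the "information entropy" macro-parameter is the Shannon entropy of a binned RR-interval histogram; the 50 ms bin width and the interval values below are illustrative assumptions, not taken from the paper:

```python
import math
from collections import Counter

def rhythm_entropy(rr_intervals, bin_ms=50):
    """Shannon entropy (in bits) of an RR-interval histogram. Intervals in
    milliseconds are binned to width bin_ms to estimate the probability of
    each rhythm 'state'; a perfectly regular rhythm has zero entropy, and
    beat-to-beat variability adds information."""
    counts = Counter(int(rr // bin_ms) for rr in rr_intervals)
    n = len(rr_intervals)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

print(rhythm_entropy([800] * 10))            # 0.0 -- no variability
print(rhythm_entropy([750, 800, 850, 900]))  # 2.0 -- four equiprobable states
```

The other macro-parameters (standard deviation, quantity of information) can be computed from the same binned distribution.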

15.
Bayesian calculation of the shear strength of interior joints in reinforced concrete frames
In China's current design codes, the shear strength of interior beam–column joints in reinforced concrete frames is computed with semi-empirical, semi-theoretical formulas; owing to the limited test data and the scatter of reinforced concrete materials, the code formulas lack a clear theoretical model. Following the idea of Bayesian dynamic information updating, this paper selects a prior model from subjective experience and uses the test results of 101 interior-joint specimens of reinforced concrete frames, completed in China and abroad, as the database. Bayesian parameter estimation combines the two sources of information to establish a multivariate linear probabilistic model of interior-joint shear strength based on Bayesian theory. Bayesian posterior model-parameter elimination is then applied to remove secondary factors influencing joint shear strength and to update the probabilistic model dynamically, yielding a simplified interior-joint shear-strength model. The model's predictions are compared with test values and with the shear-strength formula for reinforced concrete joints in the American ACI 352R-02. The study shows that the Bayesian method inherits both the completeness of the prior information and the accuracy of the large body of test data; the proposed Bayesian probabilistic shear model agrees well with the test values and predicts the shear strength of interior reinforced concrete joints more accurately than the code values, so it can be used to compute the shear strength of this type of joint.
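The fusion of a subjective prior with a test database follows the standard conjugate Bayesian update for a linear model. A generic sketch with synthetic data, not the paper's actual shear-strength model or database:

```python
import numpy as np

def bayes_linear_posterior(X, y, prior_mean, prior_prec, noise_prec=1.0):
    """Conjugate Bayesian update for a linear model y = X @ theta + noise:
    the posterior precision adds the data precision to the prior precision,
    and the posterior mean is a precision-weighted blend of the prior mean
    and the least-squares information -- mirroring the combination of prior
    engineering judgment with experimental results."""
    post_prec = prior_prec + noise_prec * X.T @ X
    post_mean = np.linalg.solve(post_prec,
                                prior_prec @ prior_mean + noise_prec * X.T @ y)
    return post_mean, post_prec

# Weak prior centered at zero; synthetic data generated from theta = [2, -1].
rng = np.random.default_rng(1)
X = rng.normal(size=(101, 2))          # e.g., 101 hypothetical test records
y = X @ np.array([2.0, -1.0]) + 0.1 * rng.normal(size=101)
mean, _ = bayes_linear_posterior(X, y, np.zeros(2), 0.01 * np.eye(2), 100.0)
print(np.round(mean, 1))  # approximately [2., -1.]
```

With many informative tests the data dominate the weak prior, which is the behavior the abstract describes when the 101-specimen database updates the subjective prior model.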

16.
The linear programming technique for multidimensional analysis of preferences (LINMAP) is the most representative method for handling multiple criteria decision making (MCDM) problems with respect to preference information over alternatives. This paper utilizes the main structure of LINMAP to develop a novel hesitant fuzzy mathematical programming technique to handle MCDM problems within the decision environment of hesitant fuzzy elements (HFEs). Considering the hesitancy of the decision maker, both the pair-wise comparison preference information over alternatives and the evaluation information of alternatives against criteria are represented by HFEs. Based on the incomplete pair-wise preference judgments over alternatives, we propose the concepts of hesitant fuzzy consistency and inconsistency indices. Furthermore, we construct a hesitant fuzzy mathematical programming model to derive the weights of criteria and the positive-ideal solution. In this hesitant fuzzy programming model both the objective function and some of the constraints' coefficients take the form of HFEs, and an effective approach based on the ranking method of HFEs is further developed to solve the new model. To address incomplete and inconsistent preference structures of criteria weights, we introduce several deviation variables and establish a bi-objective nonlinear programming model. Finally, we employ a green supplier selection problem to illustrate the feasibility and applicability of the proposed technique and conduct a comparison analysis to validate its effectiveness.

17.
International Journal of Production Research, 2012, 50(1): 177-190
We introduce an innovative information granulation entropy method to evaluate third-party logistics providers. Conventional fuzzy evaluation methods are valuable but at times biased. Objective measurements are rational; however, their results are often difficult to explain. To take advantage of the strengths of both, we propose a comprehensive evaluation framework that allows subjective judgment on alternatives while deriving criteria weights objectively. In the proposed model, experts input fuzzy language to form an evaluation matrix. After defuzzifying the matrix, the K-means clustering method is applied to discretise it. An information granulation entropy approach, based on information science theory and data mining techniques, is then developed to determine the weights of the criteria. Finally, the TOPSIS closeness rating method is applied to derive the priorities of the alternatives. To demonstrate its validity, we present a real-world application to the selection of a third-party logistics provider. The proposed evaluation framework is particularly beneficial when dealing with large-scale, diverse criteria and alternatives.
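The entropy-based weighting step can be illustrated with the classic entropy weight method: criteria whose scores vary more across alternatives carry more information and receive larger weights. This is a simplified stand-in for the article's information-granulation entropy approach, with made-up scores:

```python
import math

def entropy_weights(matrix):
    """Entropy weighting over a decision matrix (rows = alternatives,
    columns = criteria). Each column is normalized to a probability
    distribution; low-entropy (high-variation) columns get high weights."""
    m, n = len(matrix), len(matrix[0])
    weights = []
    for j in range(n):
        col = [row[j] for row in matrix]
        total = sum(col)
        probs = [x / total for x in col]
        entropy = -sum(p * math.log(p) for p in probs if p > 0) / math.log(m)
        weights.append(1 - entropy)  # divergence degree of criterion j
    s = sum(weights)
    return [w / s for w in weights]

# Criterion 1 scores every provider identically, so it carries no information;
# criterion 2 discriminates strongly and therefore dominates the weights.
scores = [[0.5, 0.9], [0.5, 0.1], [0.5, 0.5]]
w = entropy_weights(scores)
print(w)
```

The derived weights would then feed the TOPSIS closeness calculation over the defuzzified matrix.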

18.
This study presents an efficient methodology that derives design alternatives and performance criteria for safety functions/systems in commercial nuclear power plants. Determination of the design alternatives and intermediate-level performance criteria is posed as a reliability allocation problem. The reliability allocation is performed in a single step by means of the concept of two-tier noninferior solutions in the objective and risk spaces within the top-level probabilistic safety criteria (PSC). Two kinds of two-tier noninferior solutions are obtained: desirable design alternatives and intolerable intermediate-level PSC of safety functions/systems. The weighted Chebyshev norm (WCN) approach with an improved Metropolis algorithm in simulated annealing is used to find the two-tier noninferior solutions. This is very efficient in searching for the global minimum of the difficult multiobjective optimization problem (MOP) which results from the strong nonlinearity of a probabilistic safety assessment (PSA) model and the nonconvexity of the problem. The methodology developed in this study can be used as an efficient design tool for desirable safety function/system alternatives and for the determination of intermediate-level performance criteria. The methodology is applied to a realistic streamlined PSA model that is developed based on the PSA results of the Surry Unit 1 nuclear power plant, and proves very efficient in providing the intolerable intermediate-level PSC and desirable design alternatives of safety functions/systems.

19.
Design decisions often require input from multiple stakeholders or require balancing multiple design requirements. However, leading axiomatic approaches to decision-based design suggest that combining preferences across these elements is virtually guaranteed to result in irrational outcomes. This has led some to conclude that a single “dictator” is required to make design decisions. In contrast, proponents of heuristic approaches observe that aggregate decisions are frequently made in practice, and argue that this widespread usage justifies the value of these heuristics to the engineering design community. This paper demonstrates that these approaches need not be mutually exclusive. Axiomatic approaches can be informed by empirically motivated restrictions on the way that individuals can order their preferences. These restrictions are represented using “anigrafs”—structured relationships between alternatives that are represented using a graph-theoretic formalism. This formalism allows for a computational assessment of the likelihood of irrational outcomes. Simulation results show that even minimal amounts of structure can vastly reduce the likelihood of irrational outcomes at the level of the group, and that slightly stronger restrictions yield probabilities of irrational preferences that never exceed 5%. Next, an empirical case study demonstrates how anigrafs may be extracted from survey data, and a model selection technique is introduced to examine the goodness-of-fit of these anigrafs to preference data. Taken together, these results show how axiomatic consistency can be combined with empirical correspondence to determine the circumstances under which “dictators” are necessary in design decisions.

20.
In engineering design, material alternatives are evaluated according to different criteria depending on the objectives of the problem. Performance ratings for different criteria are measured in different units, but all elements of the decision matrix must be dimensionless for a valid comparison. Many normalization methods have been developed for cost and benefit criteria; however, engineering design situations in which approaching a target value is desirable have received little attention, and the available methods have shortcomings. A new version of the VIKOR method, which covers all types of criteria with an emphasis on the compromise solution, is proposed in this paper. The proposed comprehensive version of VIKOR also overcomes the main error of the traditional VIKOR by a simpler approach. The suggested method can enhance the exactness of material selection results in different applications, especially biomedical applications, where implant materials should possess properties similar to those of human tissues. Five examples are included to illustrate and justify the suggested method.
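The traditional VIKOR procedure that the paper extends can be sketched for benefit-type criteria as follows; target-value criteria, the paper's contribution, are not covered here, and the scores and weights are made up:

```python
def vikor(matrix, weights, v=0.5):
    """Classical VIKOR for benefit criteria (higher raw score is better).
    Computes the group utility S, the individual regret R, and the compromise
    index Q = v*S' + (1-v)*R' per alternative; smaller Q is better. Assumes S
    and R are not constant across alternatives (no degenerate normalization)."""
    n_crit = len(weights)
    f_best = [max(row[j] for row in matrix) for j in range(n_crit)]
    f_worst = [min(row[j] for row in matrix) for j in range(n_crit)]
    S, R = [], []
    for row in matrix:
        d = [weights[j] * (f_best[j] - row[j]) / (f_best[j] - f_worst[j])
             for j in range(n_crit)]
        S.append(sum(d))  # weighted sum of normalized distances to the ideal
        R.append(max(d))  # worst single-criterion regret
    s_min, s_max, r_min, r_max = min(S), max(S), min(R), max(R)
    Q = [v * (S[i] - s_min) / (s_max - s_min)
         + (1 - v) * (R[i] - r_min) / (r_max - r_min)
         for i in range(len(matrix))]
    return Q

# Three candidate materials scored on two benefit criteria, equal weights.
Q = vikor([[8, 7], [6, 9], [3, 4]], [0.5, 0.5])
print(Q.index(min(Q)))  # index of the compromise (best) alternative
```

The parameter `v` weighs group utility against individual regret; `v = 0.5` is the conventional balanced choice.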
