Similar Documents
20 similar documents found.
1.
The ABC method is a well-known approach for classifying inventory items into ordered categories such as A, B and C. As emphasized in the literature, it is reasonable to treat inventory classification as a multi-criteria problem. From this point of view, it corresponds to a sorting problem in which the categories are ordered. One important issue here is that the weights of the criteria and the categorization preferences can change from industry to industry. This requires analyzing the problem within a specific framework in which the decision maker's (expert's) preferences are considered. In this study, the preferences of the decision maker are incorporated into the decision-making process in the form of reference items assigned to each class. We apply two utility-function-based sorting methods to the problem, perform an experiment, and compare the results with other algorithms from the literature.
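A minimal sketch of the multi-criteria view of ABC classification described in this abstract, assuming illustrative criteria (annual dollar usage, lead time, criticality), hypothetical weights, and hypothetical A/B/C cut-off fractions; it is not a reconstruction of the paper's utility-function-based sorting methods.

```python
# Hypothetical multi-criteria ABC classification: score each item with a
# weighted additive value, rank, and cut the ranking into A/B/C classes.
from dataclasses import dataclass

@dataclass
class Item:
    name: str
    usage: float        # annual dollar usage
    lead_time: float    # replenishment lead time in days
    criticality: float  # expert rating in [0, 1]

def score(item, weights=(0.6, 0.2, 0.2), maxima=(100_000.0, 60.0, 1.0)):
    """Weighted additive score with each criterion scaled to [0, 1] by its maximum."""
    values = (item.usage, item.lead_time, item.criticality)
    return sum(w * v / m for w, v, m in zip(weights, values, maxima))

def abc_classify(items, a_frac=0.2, b_frac=0.3):
    """Rank items by score and split the ranking into A/B/C by fraction."""
    ranked = sorted(items, key=score, reverse=True)
    a_end = int(a_frac * len(ranked))
    b_end = int((a_frac + b_frac) * len(ranked))
    return {"A": ranked[:a_end], "B": ranked[a_end:b_end], "C": ranked[b_end:]}

items = [Item("i1", 80_000, 10, 0.9), Item("i2", 5_000, 45, 0.3),
         Item("i3", 20_000, 30, 0.7), Item("i4", 60_000, 5, 0.2),
         Item("i5", 1_000, 50, 0.8)]
print({c: [i.name for i in group] for c, group in abc_classify(items).items()})
```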

2.
This study presents a new solution procedure for multiple objective programming (MOP). It applies the concept of the normal boundary intersection (NBI) within the framework of the interactive weighted Tchebycheff procedure (IWTP). The proposed procedure is a collaborative approach that overcomes the weak points inherent in both the NBI method and the IWTP. In order to properly control the process of pinpointing a final solution, we parameterized the Pareto frontier via a set of reference point vectors based on the convex hull of individual maxima (CHIM) instead of using varying weights for each objective. Using well-distributed reference point vectors, we could identify well-distributed Pareto-optimal solutions, thereby eliminating the IWTP filtering stages and reducing the chance of missing the best compromise solution from the utility point of view of the decision maker (DM). Moreover, by working with a sequence of progressively smaller subsets of reference point vectors, the DM can identify a final solution at earlier stages than with the IWTP.
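A minimal sketch of the weighted Tchebycheff scalarization that both the NBI concept and the IWTP build on; the reference (utopian) point, weights, and candidate objective vectors are illustrative stand-ins, not the paper's CHIM-based reference point vectors.

```python
# Augmented weighted Tchebycheff scalarization for a minimization problem:
# smaller scalarized values identify (weakly) Pareto-optimal candidates.
import numpy as np

def weighted_tchebycheff(f, z_star, w, rho=1e-4):
    """max_i w_i * (f_i - z*_i) plus a small augmentation term."""
    f, z_star, w = map(np.asarray, (f, z_star, w))
    diff = f - z_star
    return np.max(w * diff) + rho * np.sum(diff)

candidates = np.array([[1.0, 4.0], [2.0, 2.5], [3.5, 1.0]])  # objective vectors
z_star = candidates.min(axis=0) - 1e-3                       # utopian reference point
weights = np.array([0.5, 0.5])
best = min(candidates, key=lambda f: weighted_tchebycheff(f, z_star, weights))
print(best)   # the candidate this scalarization selects for these weights
```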

3.
In this paper, we consider the problem of placing alternatives that are defined by multiple criteria into preference-ordered categories. We consider a method that estimates an additive utility function and demonstrate that it may misclassify many alternatives even when substantial preference information is obtained from the decision maker (DM) to estimate the function. To resolve this difficulty, we develop an interactive approach. Our approach occasionally requires the DM to place some reference alternatives into categories during the solution process and uses this information to categorize other alternatives. The approach is guaranteed to place all alternatives correctly for a DM whose preferences are consistent with any additive utility function. We demonstrate that the approach works well using data derived from ranking global MBA programs as well as on several randomly generated problems.
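A minimal sketch of sorting alternatives into preference-ordered categories with an additive (here linear) value function; the criterion weights and category thresholds are hypothetical placeholders for what the interactive approach would elicit from the DM.

```python
# Assign each alternative to the highest ordered category whose threshold
# its additive value reaches (hypothetical weights and thresholds).
import numpy as np

def assign(alternatives, weights, thresholds, labels):
    values = alternatives @ weights                           # additive (linear) values
    categories = []
    for v in values:
        reached = int(np.sum(v >= np.asarray(thresholds)))    # thresholds met
        categories.append(labels[reached])
    return list(zip(np.round(values, 3), categories))

alternatives = np.array([[0.9, 0.7, 0.8],   # criteria already scaled to [0, 1]
                         [0.4, 0.5, 0.6],
                         [0.2, 0.3, 0.1]])
weights = np.array([0.5, 0.3, 0.2])          # hypothetical criterion weights
thresholds = [0.35, 0.65]                    # hypothetical cut-offs between categories
labels = ["C3 (worst)", "C2", "C1 (best)"]   # ordered from lowest to highest
print(assign(alternatives, weights, thresholds, labels))
```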

4.
This paper presents a new multiobjective genetic algorithm based on the Tchebycheff scalarizing function, which aims to generate a good approximation of the nondominated solution set of the multiobjective problem. The algorithm performs several stages, each one intended to search for potentially nondominated solutions in a different part of the Pareto front. Pre-defined weight vectors act as pivots to define the weighted Tchebycheff scalarizing functions used in each stage. Therefore, each stage focuses the search on a specific region, leading to an iterative approximation of the entire nondominated set.
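A minimal sketch of the archive bookkeeping such a staged multiobjective GA relies on: a newly generated solution is kept only if no archived solution Pareto-dominates it, and archived solutions it dominates are dropped. The points and the minimization convention are illustrative only.

```python
# Maintain an approximation of the nondominated set (minimization assumed).
import numpy as np

def dominates(a, b):
    """True if objective vector a Pareto-dominates b."""
    a, b = np.asarray(a), np.asarray(b)
    return bool(np.all(a <= b) and np.any(a < b))

def update_archive(archive, candidate):
    """Insert candidate unless dominated; drop archive members it dominates."""
    if any(dominates(a, candidate) for a in archive):
        return archive
    return [a for a in archive if not dominates(candidate, a)] + [candidate]

archive = []
for point in [(3, 4), (2, 5), (4, 1), (2, 4), (5, 5)]:
    archive = update_archive(archive, point)
print(archive)   # the nondominated points among those seen so far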

5.
This paper is concerned with an external sorting algorithm that requires no additional disk space. The proposed algorithm is a hybrid that uses Quicksort and a special merging process in two distinct phases. The algorithm excels at sorting huge files that are many times larger than the available memory of the computer. It creates no extra backup file for manipulating the records, and therefore saves the large amount of disk space that would otherwise be needed to hold a copy of the file. In addition, the algorithm switches to the special merging process after the first, Quicksort-based phase, which reduces the time complexity and makes the algorithm faster.
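For context, a minimal sketch of the conventional two-phase external sort that the abstract's hybrid improves upon: memory-sized runs are sorted in phase one (Python's built-in sort stands in for Quicksort) and merged in phase two. Unlike the algorithm described above, this sketch writes temporary run files, so it does not achieve the zero-extra-disk-space property.

```python
# Conventional external sort on a line-oriented file: sort runs, then k-way merge.
import heapq, os, tempfile

def external_sort(in_path, out_path, buffer_lines=100_000):
    run_paths = []
    with open(in_path) as src:                 # phase 1: run formation
        while True:
            chunk = [line for _, line in zip(range(buffer_lines), src)]
            if not chunk:
                break
            chunk.sort()                       # in-memory sort of one run
            fd, path = tempfile.mkstemp(text=True)
            with os.fdopen(fd, "w") as run:
                run.writelines(chunk)
            run_paths.append(path)
    runs = [open(p) for p in run_paths]        # phase 2: k-way merge of sorted runs
    with open(out_path, "w") as out:
        out.writelines(heapq.merge(*runs))
    for handle, path in zip(runs, run_paths):
        handle.close()
        os.remove(path)
```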

6.
A new kind of multiple criteria decision aid (MCDA) problem, multiple criteria classification (MCC), is studied in this paper. Traditional classification methods in MCDA focus on sorting alternatives into groups ordered by preference. MCC is the classification of alternatives into nominal groups, structured by the decision maker (DM), who specifies multiple characteristics for each group. Starting with illustrative examples, the features, definition and structures of MCC are presented, emphasizing criterion and alternative flexibility. Then an analysis procedure is proposed to solve MCC problems systematically. Assuming additive value functions, an optimization model with constraints that incorporate various classification strategies is constructed to solve MCC problems. An application of MCC in water resources planning is carried out and some future extensions are suggested.

7.
We introduce the concept of a representative value function in robust ordinal regression applied to multiple criteria sorting problems. The proposed approach can be seen as an extension of UTADISGMS, a new multiple criteria sorting method that aims at assigning actions to p pre-defined and ordered classes. The preference information supplied by the decision maker (DM) is composed of desired assignments of some reference actions to one or several contiguous classes—they are called assignment examples. The robust ordinal regression builds a set of general additive value functions compatible with the assignment examples and results in two assignments: necessary and possible. The necessary assignment specifies the range of classes to which the action can be assigned considering all compatible value functions simultaneously. The possible assignment specifies, in turn, the range of classes to which the action can be assigned considering any compatible value function individually. In this paper, we propose a way of selecting a representative value function among the set of compatible ones. We identify a few targets which build on results of the robust ordinal regression and could be attained by a representative value function. They concern enhancement of differences between possible assignments of two actions. In this way, the selected function highlights the most stable part of the robust sorting, and can be perceived as representative in the sense of robustness preoccupation. We envisage two possible uses of the representative value function in decision support systems. The first one is an explicit exhibition of the function along with the results of the UTADISGMS method, in order to help the DM to understand the robust sorting. The other is an autonomous use, in order to supply the DM with sorting obtained by an example-based procedure driven by the chosen function. Three case studies illustrating the use of a representative value function in real-world decision problems are presented. One of those studies is devoted to the comparison of the introduced concept of representativeness with alternative procedures for determining a single value function, which we adapted to sorting problems, because they were originally proposed for ranking problems.

8.
An intelligent system for sorting pistachio nut varieties
An intelligent pistachio nut sorting system combining acoustic emission analysis, Principal Component Analysis (PCA) and a Multilayer Feedforward Neural Network (MFNN) classifier was developed and tested. To evaluate the performance of the system, 3200 pistachio nuts from four native Iranian pistachio varieties were used. Each variety consisted of 400 split-shell and 400 closed-shell nuts. The nuts were randomly selected and slid down a chute inclined 60° above the horizontal to impact a steel plate, and the acoustic signals produced by the impact were recorded. The time-domain sound signals were saved for subsequent analysis. The method is based on feature generation by the Fast Fourier Transform (FFT), feature reduction by PCA, and classification by the MFNN. Features such as the amplitude, phase and power spectrum of the sound signals are computed via a 1024-point FFT. Using PCA, a reduction of more than 98% in the dimension of the feature vector is achieved. To find the optimal MFNN classifier, various topologies, each with a different number of neurons in the hidden layer, were designed and evaluated. The best MFNN model had a 40–12–4 structure, that is, a network with 40 neurons at its input, 12 neurons in a single hidden layer and 4 neurons (the pistachio varieties) in the output layer. The selection of the optimal model was based on the mean square error, the correlation coefficient and the correct separation rate (CSR). The CSR, the total weighted average of system accuracy, was 97.5% for the 40–12–4 structure; that is, only 2.5% of the nuts were misclassified.
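A minimal sketch of the described processing chain (FFT features, PCA reduction, MLP classification) on synthetic stand-in signals; the signal shapes, the 98% variance target, and the single 12-neuron hidden layer echo the abstract but are illustrative choices, and real recordings would be needed for meaningful accuracy.

```python
# FFT feature generation -> PCA reduction -> multilayer feedforward classifier.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
signals = rng.normal(size=(800, 1024))      # stand-in impact sound recordings
labels = rng.integers(0, 4, size=800)       # four (synthetic) variety labels

spectra = np.abs(np.fft.rfft(signals, axis=1))            # magnitude spectra as features
features = PCA(n_components=0.98).fit_transform(spectra)  # keep ~98% of the variance

X_tr, X_te, y_tr, y_te = train_test_split(features, labels, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(12,), max_iter=500).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))        # near chance on random data
```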

9.
The area-time complexity of VLSI computations is constrained by the flow and the storage of information in the two-dimensional chip. We study here the information exchanged across the boundary of the cells of a square tessellation of the layout. When the information exchange is due to the functional dependence between variables respectively input and output on opposite sides of a cell boundary, lower bounds are obtained on the AT^2 measure (which subsume bisection bounds as a special case). When information exchange is due to the storage saturation of the tessellation cells, a new type of lower bound is obtained on the AT measure. In the above arguments, information is essentially viewed as a fluid whose flow is uniquely constrained by the available bandwidth. However, in some computations, the flow is kept below capacity by the necessity to transform information before an output is produced. We call this mechanism computational friction and show that it implies lower bounds on the AT/log A measure. Regimes corresponding to each of the three mechanisms described above can appear by varying the problem parameters, as we shall illustrate by analyzing the problem of sorting n keys each of k bits, for which AT^2, AT, and AT/log A bounds are derived. Each bound is interesting, since it dominates the other two in a suitable range of key lengths and computation times. This work was supported in part by the National Science Foundation under grant ECS-84-10902, by an IBM predoctoral fellowship, and by the Joint Services Electronics Program under Contract N00014-84-C-0149. A preliminary version was presented at the 19th Conference on Information Sciences and Systems.

10.
An efficient external sorting algorithm with minimal space requirement is presented in this article. The average number of passes over the data is approximately 1 + ln(N + 1)/4B, where N is the number of records in the file to be sorted and B is the buffer size. The external storage requirement is only the file itself; no additional disk space is required. The internal storage requirement is four buffers: two for input and two for output. The buffer size can be adjusted to the available memory space. A stack of size log_2 N is also required. This work was partially supported by a fellowship and grant from Western Michigan University.

11.
We describe a new algorithm for the problem of perfect sorting of a signed permutation by reversals. The worst-case time complexity of this algorithm is parameterized by the maximum prime degree d of the strong interval tree, i.e., it is of the form f(d)·n^O(1). This improves on the best known algorithm, whose complexity was based on a parameter that is always larger than or equal to d.

12.
This paper presents an efficient parallel algorithm for sorting N data items on two-dimensional mesh-connected computers with multiple broadcasting (2-MCCMB). The algorithm uses N × N^(2/3) processors and takes O(N^(1/3)) time, whereas the previous algorithm by Chung-Horng Lung [3] uses N × N processors and takes O(N^(1/2)) time on a 2-MCCMB.

13.
This paper addresses the situation where a group wishes to cooperatively develop a common multicriteria evaluation model to sort actions (projects, candidates) into classes. It is based on an aggregation/disaggregation approach for the ELECTRE TRI method, implemented on the Decision Support System IRIS. We provide a methodology in which the group discusses how to sort some exemplary actions (possibly fictitious ones), instead of discussing what values the model parameters should take. This paper shows how IRIS may be used to help the group to iteratively reach an agreement on how to sort one or a few actions at a time, preserving the consistency of these sorting examples both at the individual level and at the collective level. The computation of information that may guide the discussion among the group members is also suggested. We provide an illustrative example and discuss some paths for future research motivated by this work.

14.
This paper presents a useful method of relating optimism and pessimism to multiple criteria decision analysis in the context of intuitionistic fuzzy sets based on the unipolar bivariate model. We use eight point operators to estimate the adaptational outcome expectations of optimism and/or pessimism and then determine the net predisposition, the aggregated effect of positive and negative evaluations. A series of new net predispositions for bivariate evaluations are proposed for neutrality; for complete, moderate, and rational optimism; for complete, moderate, and rational pessimism; and for complete and moderate optimism-pessimism. The suitability function, which measures the overall evaluation of each alternative, is then presented. Because positive or negative leniency may exist, such that most of the criteria may be assigned unduly high or low ratings, respectively, we introduce deviation variables to mitigate the effects of such ratings on the apparent importance of various criteria. Based on the two objectives of maximal weighted suitability and minimal deviation values, an integrated programming model is used to compute the optimal weights for the criteria and the corresponding degrees of suitability of the alternative rankings. We establish flexible algorithms that incorporate both objective and subjective information to compute the optimal optimistic and pessimistic decisions. The proposed methods are illustrated and discussed using a numerical example, a multi-criteria supplier selection problem. Finally, an empirical study of job choices is conducted to establish the feasibility and applicability of the current method.

15.
16.
In this article we identify a class of two-dimensional knapsack problems with binary weights and related three-criteria unconstrained combinatorial optimization problems that can be solved in polynomial time by greedy algorithms. Starting from the knapsack problem with two equality constraints, we show that this problem can be solved efficiently by using an appropriate partitioning of the items with respect to their binary weights. Based on the results for this problem, we derive an algorithm for the three-criteria unconstrained combinatorial optimization problem with two binary objectives that explores the connectedness of the set of efficient knapsacks with respect to a combinatorial definition of adjacency. Furthermore, we prove that our approach is asymptotically optimal and provide extensive computational experiments showing that we can solve the three-criteria problem with up to one million items in less than half an hour. Finally, we derive an efficient algorithm for the two-dimensional knapsack problem with binary constraints that takes into account only the results we obtained for the unconstrained three-criteria problem with binary weights.

17.
In this paper, a modified particle swarm optimization (PSO) algorithm is developed for solving multimodal function optimization problems. The difference between the proposed method and standard PSO is that the original single population is split into several subpopulations according to the order of the particles. The best particle within each subpopulation is recorded and then used in the velocity-updating formula in place of the global best particle of the whole population. The modified velocity formula is used to update all particles in each subpopulation. Based on the idea of multiple subpopulations, several optima of a multimodal function, including the global and local solutions, may be found separately by these best particles. To show the efficiency of the proposed method, two kinds of function optimization are provided: a unimodal function optimization and a complex multimodal function optimization. Simulation results demonstrate the convergence behavior of the particles over the iterations and show that the global and local solutions are found by the best particles of the subpopulations.
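A minimal sketch of the subpopulation idea described above: the swarm is split by particle order and each subpopulation uses its own best particle in the velocity update instead of a single global best. The test function, coefficients, and split rule are illustrative assumptions, not the paper's exact settings.

```python
# PSO with subpopulations, each guided by its own subpopulation-best particle.
import numpy as np

def multimodal(x):                              # simple 1-D multimodal test function
    return np.sin(3 * x) + 0.1 * x ** 2

def pso_subpops(n_particles=40, n_subpops=4, iters=200, w=0.7, c1=1.5, c2=1.5):
    rng = np.random.default_rng(1)
    x = rng.uniform(-5.0, 5.0, n_particles)             # positions
    v = np.zeros(n_particles)                           # velocities
    pbest_x, pbest_f = x.copy(), multimodal(x)          # personal bests
    groups = np.array_split(np.arange(n_particles), n_subpops)  # split by particle order
    for _ in range(iters):
        for idx in groups:
            sbest = pbest_x[idx][np.argmin(pbest_f[idx])]       # subpopulation best
            r1, r2 = rng.random(idx.size), rng.random(idx.size)
            v[idx] = (w * v[idx] + c1 * r1 * (pbest_x[idx] - x[idx])
                      + c2 * r2 * (sbest - x[idx]))
            x[idx] += v[idx]
        f = multimodal(x)
        better = f < pbest_f
        pbest_x[better], pbest_f[better] = x[better], f[better]
    # each subpopulation may settle on a different (local or global) minimum
    return [float(pbest_x[idx][np.argmin(pbest_f[idx])]) for idx in groups]

print(pso_subpops())
```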

18.
In this study, we develop an interactive algorithm for the multiple criteria selection problem, which aims to find the most preferred alternative among a set of known alternatives evaluated on multiple criteria. We assume the decision maker (DM) has a quasi-concave value function that represents his/her preferences. The interactive algorithm selects the pairs of alternatives to be presented to the DM based on the estimated likelihood that one alternative is preferred to the other. After the DM selects the preferred alternative, a convex cone is generated based on this preference information and the alternatives dominated by the cone are eliminated. The algorithm then updates the likelihood information for the pairwise questions not yet asked. The aim of the algorithm is to detect the most preferred alternative by performing as few pairwise comparisons as possible. We present the algorithm on an illustrative example problem. We also develop a mathematical model that finds the minimum number of questions that must be asked to the DM to determine the most preferred alternative under perfect information. We use this minimum number of questions to develop strategies for the interactive algorithm and to measure its performance.

19.
An effective incident information management system needs to deal with several challenges. It must support heterogeneous distributed incident data, allow decision makers (DMs) to detect anomalies and extract useful knowledge, assist DMs in evaluating the risks and selecting an appropriate alternative during an incident, and provide differentiated services to satisfy the requirements of different incident management phases. To address these challenges, this paper proposes an incident information management framework that consists of three major components. The first component is a high-level data integration module in which heterogeneous data sources are integrated and presented in a uniform format. The second component is a data mining module that uses data mining methods to identify useful patterns and presents a process to provide differentiated services for pre-incident and post-incident information management. The third component is a multi-criteria decision-making (MCDM) module that utilizes MCDM methods to assess the current situation, find the satisfactory solutions, and take appropriate responses in a timely manner. To validate the proposed framework, this paper conducts a case study on agrometeorological disasters that occurred in China between 1997 and 2001. The case study demonstrates that the combination of data mining and MCDM methods can provide objective and comprehensive assessments of incident risks.

20.
In this article we present a multiple criteria methodology for supporting decisions concerning the selection of equities on the basis of financial analysis. The ELECTRE Tri outranking classification method is employed to select the attractive equities through an evaluation of the overall corporate performance of the corresponding firms. The crucially important issue of industry/sectoral accounting particularities is explicitly taken into account. An elaborate review of relevant research studies is also provided. Finally, the validity of the proposed methodology is tested through a large-scale application on the Athens Stock Exchange.

