Similar Documents
20 similar documents found (search time: 31 ms)
1.
We consider a stochastic version of the classical multi-item Capacitated Lot-Sizing Problem (CLSP). Demand uncertainty is explicitly modeled through a scenario tree, resulting in a multi-stage mixed-integer stochastic programming model with recourse. We propose a plant-location-based model formulation and a heuristic solution approach based on a fix-and-relax strategy. We report computational experiments to assess not only the viability of the heuristic, but also the advantage (if any) of the stochastic programming model with respect to the considerably simpler deterministic model based on the expected value of demand. To this end we use a simulation architecture, whereby the production plan obtained from the optimization models is applied in a realistic rolling horizon framework, allowing for out-of-sample scenarios and errors in the model of demand uncertainty. We also experiment with different approaches to generate the scenario tree. The results suggest that there is an interplay between different managerial levers to hedge against demand uncertainty, i.e. reactive capacity buffers and safety stocks. When there is enough reactive capacity, the ability of the stochastic model to build safety stocks is of little value. When capacity is tightly constrained and the impact of setup times is large, remarkable advantages are obtained by modeling uncertainty explicitly.
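The comparison above between the stochastic model and the expected-value model can be illustrated with a minimal single-period sketch (all demand scenarios and cost rates below are invented for illustration; the paper's actual model is a multi-stage, multi-item CLSP):

```python
# Compare a plan built from expected demand with one optimized over
# explicit demand scenarios (hypothetical single-period instance).
scenarios = [(0.3, 80), (0.4, 100), (0.3, 140)]  # (probability, demand)
h, b = 1.0, 4.0  # holding and backorder cost per unit (assumed)

def expected_cost(q):
    """Expected holding-plus-backorder cost of producing q units."""
    return sum(p * (h * max(q - d, 0) + b * max(d - q, 0))
               for p, d in scenarios)

# Deterministic plan: produce the expected demand.
q_det = sum(p * d for p, d in scenarios)
# Stochastic plan: brute-force the quantity with minimal expected cost.
q_sto = min(range(60, 161), key=expected_cost)

print(q_det, expected_cost(q_det))  # expected-value plan and its cost
print(q_sto, expected_cost(q_sto))  # stochastic plan is never worse
```

With these numbers the stochastic plan carries extra stock (a safety stock) because backorders cost four times as much as holding, mirroring the interplay of hedging levers discussed in the abstract.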

2.
We consider structural optimization (SO) under uncertainty formulated as a mathematical game between two players: a “designer” and “nature”. The first player wants to design a structure that performs optimally, whereas the second player tries to find the worst possible conditions to impose on the structure. Several solution concepts exist for such games, including Stackelberg and Nash equilibria and Pareto optima. Pareto optimality is shown not to be a useful solution concept. Stackelberg and Nash games are, however, both of potential interest, but these concepts are hardly ever discussed in the literature on SO under uncertainty. Based on concrete examples of topology optimization of trusses and finite element-discretized continua under worst-case load uncertainty, we therefore analyze and compare the two solution concepts. In all examples, Stackelberg equilibria exist and can be found numerically, but for some cases we demonstrate nonexistence of Nash equilibria. This motivates a view of the Stackelberg solution concept as the correct one. However, we also demonstrate that existing Nash equilibria can be found using a simple so-called decomposition algorithm, which could be of interest for other instances of SO under uncertainty, where it is difficult to find a numerically efficient Stackelberg formulation.
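The contrast between the two solution concepts shows up already in a tiny zero-sum matrix game (the payoff matrix below is invented; the paper's games are truss and continuum topology design problems under load uncertainty). The Stackelberg value with the designer leading always exists, while a pure-strategy Nash equilibrium, i.e. a saddle point, may not:

```python
# Rows: designs chosen by the "designer" (wants compliance low).
# Columns: load cases chosen by "nature" (wants compliance high).
A = [[3, 6],
     [5, 2]]  # invented compliance values

# Stackelberg with the designer leading: best worst-case design.
stackelberg = min(max(row) for row in A)

# A pure Nash equilibrium of a zero-sum game is a saddle point: the
# entry is the maximum of its row (nature cannot improve) and the
# minimum of its column (the designer cannot improve).
saddle = [(i, j)
          for i, row in enumerate(A) for j, v in enumerate(row)
          if v == max(row) and v == min(A[k][j] for k in range(len(A)))]

print(stackelberg, saddle)  # no saddle point exists for this matrix
```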

3.
This paper presents an exact algorithm for the single machine total tardiness problem (1 ∥ Σ T_i). We present a new synthesis of various results from the literature which leads to a compact and concise representation of job precedences, a simple optimality check, new decomposition theory, a new lower bound, and a check for presolved subproblems. These are integrated through an equivalence concept that permits a continuous reformulation of the data, enabling early detection of optimality at the nodes of an enumeration tree. The overall effect is a significant reduction in the size of the search tree, CPU times, and storage requirements. The algorithm is capable of handling much larger problems (e.g., 500 jobs) than its predecessors in the literature (≤ 150 jobs). In addition, a simple modification of the algorithm gives a new heuristic which significantly outperforms the best known heuristics in the literature.
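The objective 1 ∥ Σ T_i can be made concrete with a small sketch that evaluates total tardiness for a sequence and solves a tiny instance by brute-force enumeration (illustrative data; this is not the paper's branch-and-bound algorithm):

```python
from itertools import permutations

def total_tardiness(seq, p, d):
    """Total tardiness of jobs processed in the order given by seq."""
    t, total = 0, 0
    for j in seq:
        t += p[j]                  # completion time C_j
        total += max(0, t - d[j])  # tardiness T_j = max(0, C_j - d_j)
    return total

p = [4, 2, 6, 3]  # processing times (illustrative)
d = [5, 3, 8, 7]  # due dates
best = min(permutations(range(4)), key=lambda s: total_tardiness(s, p, d))
print(best, total_tardiness(best, p, d))
```

Enumeration works only for toy instances; the point of the paper's decomposition theory and bounds is precisely to avoid exploring all n! sequences.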

4.
We propose an interactive method for decision making under uncertainty, where uncertainty is related to the lack of understanding about consequences of actions. Such situations are typical, for example, in design problems, where a decision maker has to make a decision about a design at a certain moment of time even though the actual consequences of this decision can be possibly seen only many years later. To overcome the difficulty of predicting future events when no probabilities of events are available, our method utilizes groupings of objectives or scenarios to capture different types of future events. Each scenario is modeled as a multiobjective optimization problem to represent different and conflicting objectives associated with the scenarios. We utilize the interactive classification-based multiobjective optimization method NIMBUS for assessing the relative optimality of the current solution in different scenarios. This information can be utilized when considering the next step of the overall solution process. Decision making is performed by giving special attention to individual scenarios. We demonstrate our method with an example in portfolio optimization.

5.
In this paper, we deal with the problem of tactical capacitated production planning with demand uncertainty modelled by closed intervals. We propose a single-item model with backordering under small uncertainty in the cumulative demand for the Master Production Scheduling (MPS) problem with different lot-sizing rules, namely the Lot For Lot rule and the Periodic Order Quantity rule. Then we study a general multilevel, multi-item, multi-resource model with backordering and external demand on components for the Material Requirements Planning (MRP) problem under uncertainty in the cumulative demand. In order to choose robust production plans for the above problems that hedge against uncertainty, we adopt the well-known minmax criterion. We propose polynomial methods for evaluating the impact of uncertainty on a given production plan in terms of its cost and for computing optimal robust production plans for both problems (MPS/MRP) under the assumed interval uncertainty representation. We show in this way that the robust problems (MPS/MRP) under this uncertainty representation are not much harder computationally than their deterministic counterparts.
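The evaluation step can be sketched for a fixed plan under interval cumulative demand (all data invented; because the per-period cost is convex in the cumulative demand, each period's maximum is attained at an interval endpoint, and maximizing the periods independently yields an upper bound on the true minmax cost):

```python
def worst_case_cost(X, D_lo, D_hi, h, b):
    """Upper bound on the worst-case cost of cumulative plan X when
    the cumulative demand of period t lies in [D_lo[t], D_hi[t]]."""
    def period_cost(x, dem):
        # holding cost on surplus, backorder cost on shortage
        return h * max(x - dem, 0) + b * max(dem - x, 0)
    return sum(max(period_cost(x, lo), period_cost(x, hi))
               for x, lo, hi in zip(X, D_lo, D_hi))

X    = [10, 20, 32]  # cumulative production plan (illustrative)
D_lo = [8, 18, 30]   # lower cumulative demand bounds
D_hi = [12, 24, 36]  # upper cumulative demand bounds
print(worst_case_cost(X, D_lo, D_hi, h=1.0, b=3.0))
```

The paper's polynomial methods evaluate the exact worst case over consistent demand trajectories; this independent per-period bound is only a simple stand-in.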

6.
Recently, we have seen several attempts to establish adequate risk and vulnerability analysis tools and related management frameworks dealing not only with accidental events but also with security problems. These attempts have been based on different analysis approaches and alternative building blocks. In this paper, we discuss some of these and show how a unified framework for such analysis and management tasks can be developed. The framework is based on the use of probability as a measure of uncertainty, as seen through the eyes of the assessor, and defines risk as the combination of possible consequences and related uncertainties. Risk and vulnerability characterizations are introduced, incorporating ideas both from the vulnerability analysis literature and from the risk classification scheme introduced by Renn and Klinke.

7.
Stochastic inventory control in multi-echelon systems poses hard problems in optimisation under uncertainty. Stochastic programming can solve small instances optimally, and approximately solve larger instances via scenario reduction techniques, but it cannot handle arbitrary nonlinear constraints or other non-standard features. Simulation optimisation is an alternative approach that has recently been applied to such problems, using policies that require only a few decision variables to be determined. However, to find optimal or near-optimal solutions we must consider exponentially large scenario trees with a corresponding number of decision variables. We propose instead a neuroevolutionary approach: using an artificial neural network to compactly represent the scenario tree, and training the network by a simulation-based evolutionary algorithm. We show experimentally that this method can quickly find high-quality plans using networks of a very simple form.
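The idea of a compact policy trained by simulation-based evolution can be sketched with a one-neuron ordering policy and a (1+1) evolution strategy (all parameters, demand distributions and cost rates are invented; the paper uses richer networks and multi-echelon systems):

```python
import random

def simulate(w, demands):
    """Cost of a one-neuron ordering policy order = max(0, w0 + w1*inv)
    on a single-stage inventory system (cost rates are invented)."""
    w0, w1 = w
    inv, cost = 0.0, 0.0
    for d in demands:
        inv += max(0.0, w0 + w1 * inv) - d
        cost += 1.0 * max(inv, 0.0) + 5.0 * max(-inv, 0.0)
    return cost

random.seed(0)
demands = [random.uniform(5, 15) for _ in range(50)]  # fixed scenarios

# (1+1) evolution strategy: mutate the weights, keep improvements.
best_w = [10.0, -1.0]
best_cost = initial_cost = simulate(best_w, demands)
for _ in range(500):
    cand = [wi + random.gauss(0, 0.3) for wi in best_w]
    c = simulate(cand, demands)
    if c < best_cost:
        best_w, best_cost = cand, c
print(best_cost <= initial_cost)  # True: only improvements are accepted
```

Evaluating every candidate on the same fixed demand scenarios (common random numbers) makes the search deterministic and the comparison between candidates fair.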

8.
Designing chemical processes for the environment requires consideration of several indexes of environmental impact, including ozone depletion, global warming potentials, human and aquatic toxicity, photochemical oxidation, and acid rain potentials. Current methodologies, such as the generalized waste reduction algorithm (WAR), provide a first step towards evaluating these impacts. However, to address the issues of accuracy and the relative weights of these impact indexes, one must consider the problem of uncertainties. Environmental impacts must also be weighted and balanced against other concerns, such as cost and long-term sustainability. These multiple, often conflicting, goals pose a challenging and complex optimization problem, requiring multi-objective optimization under uncertainty. This paper addresses the problem of quantifying and analyzing the various objectives involved in process design for the environment. Towards this goal, we propose a novel multi-objective optimization framework under uncertainty. This framework is based on new and efficient algorithms for multi-objective optimization and for uncertainty analysis. This approach finds a set of potentially optimal designs where trade-offs can be explicitly identified, unlike cost-benefit analysis, which deals with multiple objectives by identifying a single fundamental objective and then converting all the other objectives into this single currency. A benchmark process for hydrodealkylation (HDA) of toluene to produce benzene, modeled in the ASPEN simulator, is used to illustrate the usefulness of the approach in finding environmentally friendly and cost-effective designs under uncertainty. Received: 8 February 2000 / Accepted: 10 March 2000

9.
The assessment and management of exploited fish and invertebrate populations is subject to several types of uncertainty. This uncertainty translates into risk to the population in the development and implementation of fishery management advice. Here, we define risk as the probability that exploitation rates will exceed a threshold level where long-term sustainability of the stock is threatened. We distinguish among several sources of error or uncertainty due to (a) stochasticity in demographic rates and processes, particularly in survival rates during the early life stages; (b) measurement error resulting from sampling variation in the determination of population parameters or in model estimation; and (c) the lack of complete information on population and ecosystem dynamics. The first represents a form of aleatory uncertainty while the latter two factors represent forms of epistemic uncertainty. To illustrate these points, we evaluate the recent status of the Georges Bank cod stock in a risk assessment framework. Short term stochastic projections are made accounting for uncertainty in population size and for random variability in the number of young surviving to enter the fishery. We show that recent declines in this cod stock can be attributed to exploitation rates that have substantially exceeded sustainable levels.

10.
The literature in economics, finance, operations research, engineering and, in general, mathematics is first reviewed on the subject of defining uncertainty and risk. The review goes back to 1901. Different perspectives on uncertainty and risk are examined, and a new paradigm to model uncertainty and risk is proposed using relevant ideas from this study. This new paradigm is used to represent, aggregate and propagate uncertainty and to interpret the resulting variability in a challenge problem developed by Oberkampf et al. [Challenge problems: uncertainty in system response given uncertain parameters. Reliab Eng Syst Saf 2004;85(1):11–19]. The challenge problem is further extended into a decision problem that is treated within a multicriteria decision making framework to illustrate how the new paradigm yields optimal decisions under uncertainty. The accompanying risk is defined as the probability of an unsatisfactory system response quantified by a random function of the uncertainty.

11.
This paper describes a solution technique for a general class of problems referred to as aggregate planning and master scheduling problems. The technique is also applicable to multi-item, single-level capacitated lot sizing problems. The solution technique presented here is a heuristic that is practical for large problems, e.g. 9 products and 36 periods. We have tested it on problems with varying numbers of time periods and products, setup costs, holding costs, overtime costs and capacity levels. For those problems that we could solve exactly using a branch and bound algorithm, the solutions produced by the heuristic were all within 1% of optimality. For problems that we could not solve exactly, we are able to compute a lower bound on the optimal cost. Using the bound we are able to show that our heuristic solutions were within 2.93% of optimality on average. Except for problems having very high setup costs or extreme seasonality, the algorithm produced solutions that were within 1% of optimality on average.

12.
Incorporating outsourcing into scheduling has recently been addressed by several researchers. However, this topic has not been investigated thoroughly, particularly in the job shop environment. In this paper, a new job shop scheduling problem is studied with the option of outsourcing jobs. The objective is to minimise a weighted sum of makespan and total outsourcing cost. With the aim of solving this problem optimally, two solution approaches for combinatorial optimisation problems, i.e. mathematical programming and constraint programming, are examined. Furthermore, two problem relaxation approaches are developed to obtain strong lower bounds for some large-scale problems for which optimality is not proven by the applied solution techniques. Using extensive numerical experiments, the performance of the solution approaches is evaluated. Moreover, the effect of the objectives' weights in the objective function on the performance of the solution approaches is also investigated. It is concluded that constraint programming significantly outperforms mathematical programming in proving solution optimality, as it can solve small and medium size problems optimally. Moreover, by solving the relaxed problems, one can obtain good lower bounds on the optimal solutions even for some large-scale problems.

13.
This paper describes a cardinality constrained network flow structure whose special characteristics are used to analyse different risk aspects under uncertainty. The network structure developed is a suitable alternative to support financial planning and many other decision-making problems with limited resources. By setting a diversification level, we can manage systematic and non-systematic risks under a stochastic mixed integer linear programming framework. A dual decomposition method, Progressive Hedging (PH), is applied to accommodate instances with large numbers of scenarios more efficiently. We study the impact of the diversification level on transaction costs and consider different factors that influence the performance of the algorithm. In particular, a Lagrangian bound is embedded to enhance the capacity of the method. Numerical results show the effectiveness of the proposed decision support approach.
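Progressive Hedging can be sketched on a toy convex problem with scalar decisions and closed-form scenario subproblems (the scenario data and penalty rho are invented; the paper's actual model is a cardinality-constrained network flow):

```python
# Toy two-stage problem: choose one here-and-now decision x to
# minimize E[(x - a_s)^2] over scenarios s; the optimum is the
# probability-weighted mean of the a_s. All data invented.
scenarios = [(0.2, 1.0), (0.5, 4.0), (0.3, 10.0)]  # (prob, a_s)
rho = 1.0                                          # PH penalty
w  = {s: 0.0 for s in range(len(scenarios))}       # dual weights
xs = {s: a for s, (_, a) in enumerate(scenarios)}  # scenario solutions

for _ in range(100):
    xbar = sum(p * xs[s] for s, (p, _) in enumerate(scenarios))
    for s, (p, a) in enumerate(scenarios):
        w[s] += rho * (xs[s] - xbar)
        # scenario subproblem: min_x (x-a)^2 + w*x + (rho/2)(x-xbar)^2
        xs[s] = (2 * a - w[s] + rho * xbar) / (2 + rho)

xbar = sum(p * xs[s] for s, (p, _) in enumerate(scenarios))
print(round(xbar, 4))  # converges to the weighted mean 5.2
```

Each PH iteration decomposes by scenario: subproblems are solved independently and coupled only through the consensus value xbar and the dual weights, which is what makes the method attractive for large scenario sets.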

14.
We present a methodical procedure for topology optimization under uncertainty with multiresolution finite element (FE) models. We use our framework in a bifidelity setting where a coarse and a fine mesh corresponding to low- and high-resolution models are available. The inexpensive low-resolution model is used to explore the parameter space and approximate the parameterized high-resolution model and its sensitivity, where parameters are considered in both structural load and stiffness. We provide error bounds for bifidelity FE approximations and their sensitivities and conduct numerical studies to verify these theoretical estimates. We demonstrate our approach on benchmark compliance minimization problems, where we show significant reduction in computational cost for expensive problems such as topology optimization under manufacturing variability, reliability-based topology optimization, and three-dimensional topology optimization while generating almost identical designs to those obtained with a single-resolution mesh. We also compute the parametric von Mises stress for the generated designs via our bifidelity FE approximation and compare them with standard Monte Carlo simulations. The implementation of our algorithm, which extends the well-known 88-line topology optimization code in MATLAB, is provided.

15.
The relative performance of color constancy algorithms is evaluated. We highlight some problems with previous algorithm evaluation and define more appropriate testing procedures. We discuss how best to measure algorithm accuracy on a single image as well as suitable methods for summarizing errors over a set of images. We also discuss how the relative performance of two or more algorithms should best be compared, and we define an experimental framework for testing algorithms. We reevaluate the performance of six color constancy algorithms using the procedures that we set out and show that this leads to a significant change in the conclusions that we draw about relative algorithm performance as compared with those from previous work.

16.
The paper suggests a possible cooperation between stochastic programming and optimal control for the solution of multistage stochastic optimization problems. We propose a decomposition approach for a class of multistage stochastic programming problems in arborescent form (i.e. formulated with implicit non-anticipativity constraints on a scenario tree). The objective function of the problem can be either linear or nonlinear, while we require that the constraints are linear and involve only variables from two adjacent periods (current and lag 1). The approach is built on the following steps. First, reformulate the stochastic programming problem into an optimal control one. Second, apply a discrete version of the Pontryagin maximum principle to obtain optimality conditions. Third, discuss and rearrange these conditions to obtain a decomposition that acts both at a time stage level and at a nodal level. To obtain the solution of the original problem we aggregate the solutions of subproblems through an enhanced mean-value fixed-point iterative scheme.

17.
We present a robust optimization framework that is applicable to general nonlinear programs (NLP) with uncertain parameters. We focus on design problems with partial differential equations (PDE), which involve high computational cost. Our framework addresses the uncertainty with a deterministic worst-case approach. Since the resulting min–max problem is computationally intractable, we propose an approximate robust formulation that employs quadratic models of the involved functions that can be handled efficiently with standard NLP solvers. We outline numerical methods to build the quadratic models, compute their derivatives, and deal with high-dimensional uncertainties. We apply the presented approach to the parametrized shape optimization of systems that are governed by different kinds of PDE and present numerical results.

18.
In this paper, data uncertainty in a specific scenario is modeled with Berkson and classical error models. Taking this uncertainty into account, three methods within a Bayesian framework are presented to update the failure probability with a Beta-Binomial model. We show that the three methods yield posteriors of the same form, namely weighted Beta distributions, but with different weights for each method. An approximation to the mixed posteriors is proposed and demonstrated by computational results. Moreover, the three methods are compared and illustrated through a case study and analytical analysis, which suggest that the LO method with the classical error model is more appropriate in similar applications.
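The Beta-Binomial machinery underlying such methods is the standard conjugate update (the paper's Berkson/classical error mixtures and its LO method are not reproduced here; all numbers below are illustrative):

```python
def beta_binomial_update(alpha, beta, failures, trials):
    """Conjugate update: Beta(alpha, beta) prior on the failure
    probability with a Binomial likelihood (failures out of trials)."""
    return alpha + failures, beta + trials - failures

def beta_mean(alpha, beta):
    return alpha / (alpha + beta)

a0, b0 = 1.0, 9.0  # prior with mean 0.1 (illustrative)
a1, b1 = beta_binomial_update(a0, b0, failures=2, trials=20)
print((a1, b1), beta_mean(a1, b1))

# A mixed posterior of weighted Betas, as in the paper, has mean equal
# to the weighted average of the component means:
components = [(0.6, (a1, b1)), (0.4, (2.5, 25.0))]  # (weight, (a, b))
mix_mean = sum(wt * beta_mean(a, b) for wt, (a, b) in components)
print(round(mix_mean, 4))
```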

19.
A problem of technical interest is the solution of approximation problems which make a tradeoff between the L2 norm and the L∞ norm error criteria. This problem is investigated in the framework of filter design with respect to two conflicting optimality goals. The particular interest in L2-L∞ norm compromise filters was raised by a paper of Adams (IEEE Trans. on Circuits and Systems, vol. 39, pp. 376–388, 1991), who suggested computing such FIR filters by solving certain constrained L2 approximation problems which require a proper choice of weights. It is shown in this paper that bicriterial filter design problems can be approached by classical methods from multicriteria optimization, and that reference point approximation with the ideal point as reference point is an especially suitable tool for dealing with Adams' problem. Solutions from this latter approach do not depend on the choice of weights and yield the best possible compromise filters with respect to a prescribed measure. The resulting optimization problems can be solved with (semi-infinite) programming methods with proven convergence under standard assumptions. Examples of L2-L∞ norm compromise designs of a linear-phase FIR filter and an IIR filter are presented.

20.
Engineering Optimization (《工程优选》), 2012, 44(1): 37–52
This article addresses proportionate flowshop scheduling problems with position-dependent weights, where the weight is associated not with the job but with the position in which the job is scheduled. Common and slack due date assignment models are discussed under a due date assignment framework. The goal is to determine a feasible schedule and due dates for all jobs so as to minimize a total cost function of the minsum type. Optimal properties of the problems are established, based on which polynomial-time algorithms are provided to solve the two problems optimally.
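In the single-machine special case where the objective is Σ_r w_r C_(r) with positional weights w_r, the job in position j contributes p_(j) · Σ_{r≥j} w_r, so by the rearrangement inequality shortest-processing-time (SPT) order is optimal. The sketch below verifies this on an invented instance (the article's flowshop problems additionally involve due date assignment costs, which are omitted here):

```python
from itertools import permutations

def minsum_cost(seq, p, w):
    """Sum over positions r of w[r] times the completion time of the
    job scheduled in position r (single-machine special case)."""
    t, cost = 0, 0
    for r, j in enumerate(seq):
        t += p[j]
        cost += w[r] * t
    return cost

p = [5, 2, 8, 3]  # processing times (invented)
w = [4, 3, 2, 1]  # position-dependent weights (invented)
spt = tuple(sorted(range(4), key=lambda j: p[j]))  # SPT order
brute = min(minsum_cost(s, p, w) for s in permutations(range(4)))
print(minsum_cost(spt, p, w), brute)  # the two values coincide
```

Since the suffix sums Σ_{r≥j} w_r are nonincreasing in j for any nonnegative weights, SPT is optimal regardless of how the positional weights themselves are ordered.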


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号