Similar Articles
20 similar articles found (search time: 31 ms)
1.
Computer experiments based on mathematical models are powerful tools for understanding physical processes. This article addresses the problem of kriging-based optimization for deterministic computer experiments with tunable accuracy. Our approach is to use multi-fidelity computer experiments with increasing accuracy levels and a nonstationary Gaussian process model. We propose an optimization scheme that sequentially adds new computer runs by following two criteria. The first criterion, called EQI, scores candidate inputs at a given level of accuracy, and the second criterion, called EQIE, scores candidate combinations of inputs and accuracy. Simulation results and a real example using finite element analysis show that our method outperforms the expected improvement (EI) criterion, which works only for single-accuracy experiments. Supplementary materials for this article are available online.
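The EI criterion that this abstract benchmarks against has a closed form under a Gaussian predictive distribution. Below is a minimal sketch of standard expected improvement for minimization (this is the baseline EI, not the authors' EQI/EQIE criteria, which extend it to tunable accuracy):

```python
from math import erf, exp, pi, sqrt

def expected_improvement(mu, sigma, f_min):
    """Standard EI of a Gaussian prediction N(mu, sigma^2) against the
    current best observed value f_min (minimization)."""
    if sigma <= 0.0:
        return max(f_min - mu, 0.0)
    z = (f_min - mu) / sigma
    pdf = exp(-0.5 * z * z) / sqrt(2.0 * pi)    # standard normal density
    cdf = 0.5 * (1.0 + erf(z / sqrt(2.0)))      # standard normal CDF
    return (f_min - mu) * cdf + sigma * pdf

# A candidate predicted at the current best but with high uncertainty
# still has positive EI, which is what drives exploration.
print(expected_improvement(1.0, 0.5, 1.0))
```

Note how EI trades off exploitation (low predicted mean) against exploration (high predictive uncertainty) in a single score.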

2.
This article presents a framework for simulation-based design optimization of computationally expensive problems, where economizing the generation of sample designs is highly desirable. One popular approach for such problems is efficient global optimization (EGO), where an initial set of design samples is used to construct a kriging model, which is then used to generate new 'infill' sample designs at regions of the search space where there is high expectancy of improvement. This article attempts to address one of the limitations of EGO, where generation of infill samples can become a difficult optimization problem in its own right, and also to allow the generation of multiple samples at a time in order to take advantage of parallel computing when evaluating the new samples. The proposed approach is tested on analytical functions, and then applied to the vehicle crashworthiness design of a full Geo Metro model undergoing frontal crash conditions.
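One common heuristic for generating multiple infill samples per iteration, as this abstract describes, is "constant liar" selection: after each pick, pretend the pick returned the current best value and re-score the remaining candidates. The sketch below is illustrative only; the inverse-distance predictor is a toy stand-in for a kriging model, and this is not necessarily the article's own batch strategy:

```python
import math

def predict(x, X, y):
    """Toy stand-in for a kriging predictor: inverse-distance mean, and a
    'spread' equal to the distance to the nearest sampled point."""
    d = [abs(x - xi) for xi in X]
    if min(d) < 1e-12:
        return y[d.index(min(d))], 0.0
    w = [1.0 / di ** 2 for di in d]
    mu = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    return mu, min(d)

def ei(mu, sigma, f_min):
    """Expected improvement for minimization."""
    if sigma <= 0.0:
        return max(f_min - mu, 0.0)
    z = (f_min - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return (f_min - mu) * cdf + sigma * pdf

def batch_infill(X, y, candidates, q):
    """Constant-liar batch selection: after each pick, pretend it returned
    the current best value, then re-score the remaining candidates."""
    X, y = list(X), list(y)
    picks = []
    for _ in range(q):
        f_min = min(y)
        scored = [(ei(*predict(c, X, y), f_min), c)
                  for c in candidates if c not in X]
        picks.append(max(scored)[1])
        X.append(picks[-1]); y.append(f_min)  # the "lie"
    return picks

picks = batch_infill([0.0, 1.0], [3.0, 1.0], [i / 10 for i in range(11)], 3)
print(picks)
```

Because the lie suppresses the EI near already-chosen points, the batch spreads out, which is what makes the picks suitable for parallel evaluation.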

3.
Constrained blackbox optimization is a difficult problem, with most approaches coming from the mathematical programming literature. The statistical literature is sparse, especially in addressing problems with nontrivial constraints. This situation is unfortunate because statistical methods have many attractive properties: global scope, handling noisy objectives, sensitivity analysis, and so forth. To narrow that gap, we propose a combination of response surface modeling, expected improvement, and the augmented Lagrangian numerical optimization framework. This hybrid approach allows the statistical model to think globally and the augmented Lagrangian to act locally. We focus on problems where the constraints are the primary bottleneck, requiring expensive simulation to evaluate and substantial modeling effort to map out. In that context, our hybridization presents a simple yet effective solution that allows existing objective-oriented statistical approaches, like those based on Gaussian process surrogates and expected improvement heuristics, to be applied to the constrained setting with minor modification. This work is motivated by a challenging, real-data benchmark problem from hydrology where, even with a simple linear objective function, learning a nontrivial valid region complicates the search for a global minimum. Supplementary materials for this article are available online.
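One common augmented-Lagrangian merit function for inequality constraints c_j(x) ≤ 0 is L(x) = f(x) + Σ_j λ_j c_j(x) + (1/2ρ) Σ_j max(0, c_j(x))², with λ updated by λ_j ← max(0, λ_j + c_j(x)/ρ) after each inner solve. The sketch below uses a grid search as the inner solver purely as a stand-in for the EI-guided surrogate search; the toy problem and schedule are assumptions, not the article's setup:

```python
def aug_lagrangian(f, cons, lam, rho):
    """Augmented-Lagrangian merit function for constraints c_j(x) <= 0:
    L(x) = f(x) + sum_j lam_j*c_j(x) + (1/(2*rho)) * sum_j max(0, c_j(x))^2."""
    def L(x):
        cs = [c(x) for c in cons]
        return (f(x) + sum(l * c for l, c in zip(lam, cs))
                + sum(max(0.0, c) ** 2 for c in cs) / (2.0 * rho))
    return L

# Hypothetical toy problem: minimize x^2 subject to x >= 1, i.e. c(x) = 1 - x <= 0.
f = lambda x: x * x
cons = [lambda x: 1.0 - x]
lam, rho = [0.0], 1.0
grid = [i / 100 for i in range(-200, 201)]
for _ in range(20):                            # outer AL loop
    L = aug_lagrangian(f, cons, lam, rho)
    x = min(grid, key=L)                       # inner solve (grid search stands
                                               # in for the surrogate-guided search)
    lam = [max(0.0, l + c(x) / rho) for l, c in zip(lam, cons)]
    rho *= 0.5                                 # tighten the penalty
print(x)                                       # converges to the constrained optimum x = 1
```

The multiplier update is what lets the scheme converge to the constrained optimum without driving the penalty weight to extremes first.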

4.
Kai Yang, Yanfei Lan. Engineering Optimization, 2016, 48(4): 629-651
This article investigates an incentive contract design problem for a project manager who operates a project consisting of multiple tasks performed sequentially by different subcontractors, in which all task completion times are uncertain and described by fuzzy variables. On the basis of an expected value criterion and a critical value criterion, two classes of fuzzy bilevel programming models are developed. In the case where the uncertain task completion times are mutually independent, each model can first be decomposed into multiple equivalent sub-models by taking advantage of the structural characteristics, and then a two-step optimization method is employed to derive the optimal incentive contract in each sub-model. In the more general case where the uncertain task completion times are correlated, the approximation approach (AA) technique is adopted first in order to evaluate the objective functions involving fuzzy parameters, which are usually difficult to convert into their crisp equivalents. Then, an AA-based hybrid genetic algorithm integrated with the golden search method and variable neighbourhood search is designed to solve the proposed fuzzy bilevel programming models. Finally, a numerical example of a construction project is presented to demonstrate the modelling idea and the effectiveness of the proposed methods.
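Fuzzy programming models of this kind rely on an expected-value operator for fuzzy variables. For a triangular fuzzy variable (a, b, c), the credibility-theoretic expected value has a simple closed form; the formula below is the standard operator from credibility theory, assumed here rather than taken from the article:

```python
def expected_value_triangular(a, b, c):
    """Credibility-theoretic expected value of a triangular fuzzy variable
    with support [a, c] and mode b: E = (a + 2b + c) / 4."""
    return (a + 2.0 * b + c) / 4.0

# A symmetric triangular completion time (1, 2, 3) has expected value 2;
# skewing the support to (0, 1, 4) pulls the expectation toward the long tail.
print(expected_value_triangular(1.0, 2.0, 3.0))
print(expected_value_triangular(0.0, 1.0, 4.0))
```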

5.
Continuous quality improvement in micro-manufacturing processes relies on optimization strategies that relate an output performance to a set of machining parameters. However, when determining the optimal machining parameters in a micro-manufacturing process, the economics of continuous quality improvement and decision makers’ preference information are typically neglected. This article proposes an economic continuous improvement strategy based on an interval programming model. The proposed strategy differs from previous studies in two ways. First, an interval programming model is proposed to measure the quality level, where decision makers’ preference information is considered in order to determine the weight of location and dispersion effects. Second, the proposed strategy is a more flexible approach since it considers the trade-off between the quality level and the associated costs, and leaves engineers a larger decision space through adjusting the quality level. The proposed strategy is compared with its conventional counterparts using an Nd:YLF laser beam micro-drilling process.

6.
Residual‐based control charts for autocorrelated processes are known to be sensitive to time series modeling errors, which can seriously inflate the false alarm rate. This paper presents a design approach for a residual‐based exponentially weighted moving average (EWMA) chart that mitigates this problem by modifying the control limits based on the level of model uncertainty. Using a Bayesian analysis, we derive the approximate expected variance of the EWMA statistic, where the expectation is with respect to the posterior distribution of the unknown model parameters. The result is a relatively clean expression for the expected variance as a function of the estimated parameters and their covariance matrix. We use control limits proportional to the square root of the expected variance. We compare our approach to two other approaches for designing robust residual‐based EWMA charts and argue that our approach generally results in a more appropriate widening of the control limits. Copyright © 2010 John Wiley & Sons, Ltd.
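For context, the baseline that the paper widens is the known-parameter residual EWMA chart: z_t = λ e_t + (1-λ) z_{t-1} with steady-state limits ±L·σ·√(λ/(2-λ)). A minimal sketch of that baseline (the Bayesian widening of the limits is the paper's contribution and is not reproduced here):

```python
def ewma_chart(residuals, lam=0.2, L=3.0, sigma=1.0):
    """Residual-based EWMA chart with steady-state, known-parameter limits
    +/- L * sigma * sqrt(lam / (2 - lam)). Returns the limit and the
    indices of observations that signal an alarm."""
    limit = L * sigma * (lam / (2.0 - lam)) ** 0.5
    z, alarms = 0.0, []
    for t, e in enumerate(residuals):
        z = lam * e + (1.0 - lam) * z        # EWMA recursion
        if abs(z) > limit:
            alarms.append(t)
    return limit, alarms

# A sustained shift in the residuals accumulates in z until it crosses the limit.
limit, alarms = ewma_chart([0.1, -0.2, 0.1, 2.5, 2.5, 2.5])
print(limit, alarms)
```

The paper's point is that when the time-series parameters are estimated, this `limit` is too tight; its expected-variance expression inflates it according to the posterior parameter uncertainty.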

7.
In order to meet strict customer demands in a highly complex global industrial sector, it is necessary to design manufacturing processes based on a clear understanding of the customer's requirements and usage of a product, by translating this knowledge into the process parameter design. This paper presents an integrative, general and intelligent approach to multi-response process design, based on Taguchi's method, multivariate statistical methods and artificial intelligence techniques. The proposed model considers process design in a general case where analytical relations and interdependency in a process are unknown, thus making it applicable to various types of processes, and incorporates customer demands for several (possibly correlated) characteristics of a product. The implementation of the suggested approach is presented in a study that discusses the design of a thermosonic copper wire bonding process in the semiconductor industry, for assembly of microelectronic devices used in automotive applications. The results confirm the effectiveness of the approach in the presence of different types of correlated product quality characteristics.

8.
This paper addresses the problem of designing Make-to-Order (MTO) driven supply networks as is faced by producers of industrial goods. A major challenge in MTO network design is to estimate the operational performance of candidate networks. In particular, the stochastic and dynamic nature of order arrivals and fulfilment processes as well as the need to design a network that enables a timely delivery of ordered products complicate the decision-making. In this paper, a solution approach is presented where simulation is used for assessing the operational performance of candidate networks. The proposed simulation model captures multiple sources of uncertainties and incorporates fundamental control policies for reflecting the autonomous decision-making processes of operational planners. A Variable Neighbourhood Search (VNS) method is presented to guide the search for good network designs. Experiments are conducted on a set of multi-stage networks, where complex products are manufactured in MTO fashion and delivered to customers within a promised order lead time. The results show that our approach effectively produces supply networks that are able to cope with challenges arising from a strong customer orientation.
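The VNS metaheuristic mentioned above follows a simple pattern: perturb in neighbourhood k, keep the move and reset to the first neighbourhood on improvement, otherwise widen k. A textbook sketch of that loop follows; the integer toy problem and neighbourhood moves are hypothetical, not the network-design-specific method of the article:

```python
import random

def vns(x0, cost, neighbourhoods, n_iter=100, rng=None):
    """Generic variable neighbourhood search: try a move in neighbourhood k;
    on improvement accept it and restart from the first (smallest)
    neighbourhood, otherwise cycle to a wider one."""
    rng = rng or random.Random(1)
    x, k = x0, 0
    for _ in range(n_iter):
        cand = neighbourhoods[k](x, rng)       # shake / local move
        if cost(cand) < cost(x):
            x, k = cand, 0                     # improvement: back to N_1
        else:
            k = (k + 1) % len(neighbourhoods)  # try a wider neighbourhood
    return x

# Hypothetical demo: minimise the sum of squares of an integer vector,
# with neighbourhoods that perturb one coordinate by increasing step sizes.
cost = lambda v: sum(vi * vi for vi in v)
def flip(step):
    def move(v, rng):
        w = list(v)
        w[rng.randrange(len(w))] += rng.choice([-step, step])
        return w
    return move

best = vns([5, -4, 3], cost, [flip(1), flip(2), flip(4)])
print(best, cost(best))
```

Systematically changing the neighbourhood size is what lets VNS escape local optima that a single fixed move type would get stuck in.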

9.
Based on a Kriging surrogate model, a multi-point infill criterion that simultaneously considers the predicted response values and their uncertainty is proposed, and a sequential approximate optimization method is developed on this basis. The multi-point infill criterion adds new sample sets based on the initial sample information and the predicted features of the objective function, adaptively improving the accuracy of the surrogate model during the optimization iterations. In each iteration, the method adds multiple spatially independent new sample points according to the multi-point infill criterion, making it suitable for simultaneous multi-machine or parallel computation and thereby improving computational efficiency. Using two classical mathematical test functions, the proposed optimization method is compared with the expected improvement criterion; the results show that the proposed method effectively improves the global quality of the optimum. The method is also applied to the moulding process optimization of a box-shaped injection-moulded part, and the optimization results again demonstrate its effectiveness.

10.
A unique method for the integration of process planning and scheduling in a batch-manufacturing environment is reported. This integration is essential for the optimum use of production resources and for the generation of realistic process plans that can be readily executed with little modification. The integration problem is modelled at two levels: process planning and scheduling, which are linked by an intelligent facilitator. The process-planning module employs an optimization approach in which the whole plan solution space in terms of available machines, tools, tool accessibility and precedence constraints is first generated, and a search algorithm is then used to find the optimal plan. For a given set of jobs, the scheduling module takes the optimal plans for each job and generates a schedule based on a given criterion, as well as the performance parameters (machine utilization and number of tardy jobs). An unsatisfied performance parameter is fed back to the facilitator, which then identifies a particular job and issues a change to its process plan solution space. The iteration of process planning, scheduling, and solution-space modification continues until a schedule is satisfactory or until no further improvement can be made. The uniqueness of this approach is characterized by the flexibility of the process-planning strategy and the intelligent facilitator, which makes full use of the plan solution space to reach a satisfactory (though not necessarily optimal) schedule. The integrated system was implemented in the manufacturing of prismatic parts. The testing results show that the developed integration method can achieve satisfactory process plans and a schedule in an effective and efficient manner.

11.
A novel Bayesian design support tool is empirically investigated for its potential to support the early design stages. The design support tool provides dynamic guidance with the use of morphological design matrices during the conceptual or preliminary design stages. This paper tests the appropriateness of adopting a stochastic approach for supporting the early design phase. The rationale for the stochastic approach is based on the uncertain nature of the design during this part of the design process. The support tool is based on Bayesian belief networks (BBNs) and uses a simple but effective information content-based metric to learn or induce the model structure. The dynamically interactive tool is assessed with two empirical trials. First, the laboratory-based trial with novice designers illustrates a novel emergent design search methodology. Second, the industrial-based trial with expert designers illustrates the hurdles that are faced when deploying a design support tool in a highly pressurised industrial environment. The conclusion from these trials is that designers need to better understand the stochastic methodology so that they can both interpret and trust the BBN model of the design domain. Further, a lightweight, domain-specific front-end interface is needed to enable a better fit between the generic support tool and the domain-specific design process and associated tools.

12.
A common quality improvement strategy used by manufacturers is to periodically allocate quality improvement targets among their suppliers. We propose a formal modelling and optimization approach for assessing quality improvement targets for suppliers. In this approach it is understood that a manufacturer's quality improvement results from reductions in supplier process variances, which occur only through investments in learning. A constrained nonlinear optimization model is developed for determining an optimal allocation of variance reduction targets that minimizes expected total cost, where the relationship between performance measures and the set of design parameters is generally represented by second-order polynomial functions. An example in the fabrication of a tyre tread compound is used both to demonstrate the implementation of our proposed models and to provide an empirical comparison of optimal learning rates for different functional relationships between the performance measures and the set of design parameters.

13.
The design and optimization of both sheet-metal-formed parts and forming processes are nowadays carried out virtually, using numerical tools based on finite element analysis. Such a virtual try-out approach yields significant savings in money, time and effort in the design, production and process set-up of deep-drawn parts. The analysis of forming operation success or surface defects, in each development phase, is generally performed by means of the material's forming limit diagram (FLD), since it defines a safe region that reduces the probability of (i) necking, (ii) wrinkling and (iii) large deformations. However, the FLD represented in strain space is known to have some disadvantages. To overcome this problem, Ito and Goya proposed a local bifurcation criterion that defines the critical state for a local bifurcation to set in as a function of the ratio of stress level to work-hardening rate, leading to an FLD represented in stress space. The resulting FLD is therefore completely objective, in the sense that it is independent of the strain or stress history paths (Ito et al. 2000). In this work the Ito and Goya model is used to evaluate formability, as well as fracture mode and direction, in the deep drawing of a square cup. Since the analysis is based on the stress state, it is also possible to determine an instability factor that “measures” the degree to which the current stress accelerates the local bifurcation mode towards fracture. The selected example highlights the potential of the criterion which, combined with finite element analysis, can undeniably improve the mechanical design of forming processes.

14.
Taguchi's robust design strategy, whose aim is to make processes and products insensitive to factors which are hard or impossible to control (termed noise factors), is an important paradigm for improving products and processes. We present an overview of the strategy and tactics for robust design and demonstrate its usefulness for reliability improvement. Two important components of robust design are a criterion for assessing the effect of the noise factors and experimentation according to specialized experimental plans. Recent criticism of Taguchi's criterion and his analysis of its estimates has led to an alternative approach of modelling the response directly. We give additional reasons for using this response-model approach in the context of reliability improvement. Using the model for the response, appropriate criteria for assessing the effect of the noise factors can then be evaluated. We consider an actual experiment and reanalyse its data to illustrate these ideas and methods.
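The Taguchi criterion under criticism here is typically a signal-to-noise (SN) ratio computed across the noise-factor replicates of each control-factor setting. A minimal sketch of the "smaller-the-better" SN ratio, with hypothetical response data:

```python
from math import log10

def sn_smaller_the_better(y):
    """Taguchi 'smaller-the-better' signal-to-noise ratio:
    SN = -10 * log10(mean(y_i^2)). Higher SN means less sensitivity
    to the noise factors."""
    return -10.0 * log10(sum(v * v for v in y) / len(y))

# Two hypothetical control-factor settings observed across the same noise levels:
# the first setting varies little with noise, so it scores a higher SN.
print(sn_smaller_the_better([1.0, 1.2, 0.9]))
print(sn_smaller_the_better([0.5, 2.0, 0.4]))
```

The response-model approach advocated in the abstract fits the raw responses directly instead of collapsing them into this single summary, which is exactly the criticism of the SN-ratio analysis.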

15.
Commercial software packages for production management are characterized by a gap between MRP logic, based on a backward scheduling approach, and finite capacity scheduling, usually based on forward scheduling. In order to partially bridge that gap, we need scheduling algorithms able to meet due dates while keeping WIP and inventory costs low. This leads us to consider job shop scheduling problems characterized by non-regular objective functions; such problems are even more difficult than classical job shop scheduling, and suitable heuristics are needed. One possibility is to consider local search strategies based on the decomposition of the overall problem into sequencing and timing sub-problems. For given job sequences, the optimal timing problem can be solved as a node potential problem on a graph. Since solving the timing problem is a relatively time-consuming task, we need to define a suitable neighbourhood structure to explore the space of job sequences; this can be done by generalizing well-known results for the minimum makespan problem. A related issue is whether solving timing problems exactly is really necessary, or whether an approximate solution is sufficient; hence, we also consider solving the timing problem approximately by a fast heuristic. We compare different neighbourhood structures, by embedding them within a pure local improvement strategy. Computational experiments show that the overall approach performs better than release/dispatch rules, although the performance improvement depends on the problem characteristics, and that the fast heuristic is quite competitive with the optimal timing approach. On the one hand, these results pave the way to the development of better local search algorithms (based e.g. on tabu search); on the other hand, it is worth noting that the heuristic timing approach, unlike the optimal one, can be extended to cope with the complicating features typical of practical scheduling problems.

16.
In this paper, we take a design-led perspective on the use of computational tools in the aerospace sector. We briefly review the current state-of-the-art in design search and optimization (DSO) as applied to problems from aerospace engineering, focusing on those problems that make heavy use of computational fluid dynamics (CFD). This ranges over issues of representation, optimization problem formulation and computational modelling. We then follow this with a multi-objective, multi-disciplinary example of DSO applied to civil aircraft wing design, an area where this kind of approach is becoming essential for companies to maintain their competitive edge. Our example considers the structure and weight of a transonic civil transport wing, its aerodynamic performance at cruise speed and its manufacturing costs. The goals are low drag and cost while holding weight and structural performance at acceptable levels. The constraints and performance metrics are modelled by a linked series of analysis codes, the most expensive of which is a CFD analysis of the aerodynamics using an Euler code with coupled boundary layer model. Structural strength and weight are assessed using semi-empirical schemes based on typical airframe company practice. Costing is carried out using a newly developed generative approach based on a hierarchical decomposition of the key structural elements of a typical machined and bolted wing-box assembly. To carry out the DSO process in the face of multiple competing goals, a recently developed multi-objective probability of improvement formulation is invoked along with stochastic process response surface models (Krigs). This approach both mitigates the significant run times involved in CFD computation and also provides an elegant way of balancing competing goals while still allowing the deployment of the whole range of single objective optimizers commonly available to design teams.
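For reference, the single-objective probability of improvement that underlies the multi-objective formulation mentioned above has a simple closed form under a Gaussian kriging prediction. A minimal sketch of the single-objective case (the paper's multi-objective extension is not reproduced here):

```python
from math import erf, sqrt

def probability_of_improvement(mu, sigma, f_min):
    """P[Y < f_min] for a kriging prediction Y ~ N(mu, sigma^2) against
    the best observed value f_min (minimization)."""
    if sigma <= 0.0:
        return 1.0 if mu < f_min else 0.0
    z = (f_min - mu) / sigma
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# A candidate predicted exactly at the current best has a 50% chance of improving.
print(probability_of_improvement(1.0, 0.5, 1.0))
```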

17.
This paper proposes an automated scanning process of a structured light system for objects without overhangs. Scanning such objects requires planning scanning directions that minimise the missing area on the three-dimensional surface during the scanning process, and thus an approach that finds the next scanning direction efficiently in terms of computational cost. This paper develops a scanning simulation approach to meet this requirement. In order to apply the developed approach, the proposed process generates a solution space of candidate scanning directions and represents an intermediate 3D model. The developed approach traverses the solution space in a virtual environment and executes virtual scanning on the intermediate 3D model. The virtual scanning result of each candidate scanning direction is analysed in order to evaluate its contribution to filling the missing area. The proposed process defines key scanning directions in the solution space through iterative execution of the developed approach. The proposed process has been implemented and applied in scanning experiments on dental impressions.

18.
Because of the necessity of considering various creative and engineering design criteria, optimal design of an engineering system results in a highly-constrained multi-objective optimization problem. Major numerical approaches to such optimal design force the problem into a single objective function by introducing unjustifiable additional parameters and solve it using a single-objective optimization method. Because this process differs from human design, the resulting design often becomes completely different from that of a human designer. This paper presents a novel numerical design approach, which resembles the human design process. Similar to the human design process, the approach consists of two steps: (1) search for the solution space of the highly-constrained multi-objective optimization problem and (2) derivation of a final design solution from the solution space. A multi-objective gradient-based method with Lagrangian multipliers (MOGM-LM) and a centre-of-gravity method (CoGM) are further proposed as numerical methods for each step. The proposed approach was first applied to problems with test functions where the exact solutions are known, and results demonstrate that the proposed approach can find robust solutions which cannot be found by conventional numerical design approaches. The approach was then applied to two practical design problems. Successful design in both examples shows that the proposed approach can be used for various design problems that involve both creative and engineering design criteria. Copyright © 2005 John Wiley & Sons, Ltd.

19.
A reconfigurable machine tool (RMT) is a special machine that can deliver different machining functions through reconfiguration processes among its configurations during the machine utilisation stage. In this research, a new approach is developed to identify the optimal configurations and reconfiguration processes for the design of RMTs. In this work, a generic design AND-OR tree is used to model different design solution candidates, their machine configurations and the parameters of these configurations. A specific design solution is created from the generic design AND-OR tree through tree-based search and modelled by different machine configurations. For a reconfiguration process between two machine configurations, a generic process AND-OR graph is used to model reconfiguration operation candidates, sequential constraints among operations and operation parameters. A graph-based search is used to generate all feasible reconfiguration process candidates from the generic process AND-OR graph. The optimal design is identified by multi-level and multi-objective hybrid optimisation. A case study is developed to show how this new approach is used for the optimal design of an RMT.

20.
Some amphiphilic molecules in particular environments may self-assemble and give rise to chemical entities, such as vesicles, which are relevant in technological applications. Experimentation in this field is difficult because of the high dimensionality of the search space and the high cost of each experiment. To tackle the problem of designing a relatively small number of experiments that achieve the relevant information on the problem, we propose an evolutionary design of experiments based on a genetic algorithm. We built a particular algorithm where design and laboratory experimentation interact, leading the search toward the optimality region of the space. To get insight into the process, we then modelled the experimental results with different classes of regression models; from modelling we could identify the special role played by some molecules and the relevance of their relative weight in the composition. With modelling we “virtually” explored the experimental space and predicted compositions likely to generate very high yields. Models thus provide valuable information for the redesign of the experiments and can be considered an essential addition to the evolutionary approach.
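The evolutionary loop described above follows the standard genetic-algorithm pattern: select good compositions, recombine and mutate them, and evaluate the offspring. The sketch below is a generic GA over fixed-length composition vectors with a hypothetical in-silico fitness standing in for the laboratory yield; it is not the authors' specific algorithm:

```python
import random

def evolve(pop, fitness, n_gen=30, mut=0.1, rng=None):
    """Bare-bones generational GA: keep the best half (truncation
    selection), refill the population with uniform crossover plus
    Gaussian mutation clipped to nonnegative fractions."""
    rng = rng or random.Random(0)
    for _ in range(n_gen):
        elite = sorted(pop, key=fitness, reverse=True)[: len(pop) // 2]
        children = []
        while len(elite) + len(children) < len(pop):
            a, b = rng.sample(elite, 2)
            child = [ai if rng.random() < 0.5 else bi for ai, bi in zip(a, b)]
            children.append([max(0.0, g + rng.gauss(0.0, mut)) for g in child])
        pop = elite + children
    return max(pop, key=fitness)

# Hypothetical fitness: yield peaks when the composition fractions hit a target.
target = [0.5, 0.3, 0.2]
fit = lambda x: -sum((xi - ti) ** 2 for xi, ti in zip(x, target))
rng = random.Random(0)
pop = [[rng.random() for _ in range(3)] for _ in range(20)]
best = evolve(pop, fit, rng=rng)
print(best)
```

In the article's setting, `fitness` is a laboratory experiment rather than a function call, which is why each generation is designed to be small and informative.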


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号