Similar Literature
1.
In this paper, we introduce two parameter multi-domain ‘hp’ techniques for the empirical interpolation method (EIM). In both approaches, we construct a partition of the original parameter domain into parameter subdomains: h-refinement. We apply the standard EIM independently within each subdomain to yield local (in parameter) approximation spaces: p-refinement. Further, for a particularly simple case, we introduce a priori convergence theory for the partition procedure. We show through two numerical examples that our approaches provide significant reduction in the EIM approximation space dimension and thus significantly reduce the computational cost associated with EIM approximations. Copyright © 2012 John Wiley & Sons, Ltd.
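A minimal sketch of the greedy empirical interpolation step that could serve as the per-subdomain building block of such an hp approach; the snapshot function, parameter grid, and tolerance below are hypothetical and not taken from the paper.

```python
import numpy as np

def eim_greedy(snapshots, tol=1e-6, max_terms=20):
    """Standard EIM greedy loop: build basis functions and interpolation
    ("magic") points from a snapshot matrix of shape (n_space, n_params)."""
    basis, points = [], []
    residual = snapshots.copy()
    for _ in range(max_terms):
        j = np.argmax(np.max(np.abs(residual), axis=0))   # worst parameter sample
        i = np.argmax(np.abs(residual[:, j]))             # its worst spatial point
        if abs(residual[i, j]) < tol:
            break
        basis.append(residual[:, j] / residual[i, j])     # normalize at magic point
        points.append(i)
        B = np.column_stack(basis)
        # residual of the current EIM interpolant for every snapshot
        coeff = np.linalg.solve(B[points, :], snapshots[points, :])
        residual = snapshots - B @ coeff
    return np.column_stack(basis), points

# hypothetical nonaffine function g(x; mu) sampled on a 1-D grid
x = np.linspace(0, 1, 200)
mus = np.linspace(1.0, 10.0, 50)
snaps = np.array([np.exp(-m * x) for m in mus]).T       # shape (200, 50)
Q, pts = eim_greedy(snaps)
print("EIM space dimension:", Q.shape[1])
```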

2.
Amin Toghi Eshghi 《工程优选》2013,45(12):2011-2029
Reliability-based design optimization (RBDO) requires the evaluation of probabilistic constraints (or reliability), which can be very time consuming. Therefore, a practical solution for efficient reliability analysis is needed. The response surface method (RSM) and dimension reduction (DR) are two well-known approximation methods that construct the probabilistic limit state functions for reliability analysis. This article proposes a new RSM-based approximation approach, named the adaptive improved response surface method (AIRSM), which uses the moving least-squares method in conjunction with a new weight function. AIRSM is tested with two simplified designs of experiments: saturated design and central composite design. Its performance on reliability analysis is compared with DR in terms of efficiency and accuracy in multiple RBDO test problems.
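The core of a response surface built with moving least squares is a locally weighted polynomial fit. The sketch below illustrates a generic moving least-squares prediction with a simple Gaussian weight; the specific AIRSM weight function and the DoE layouts are not reproduced here, and the limit-state data are hypothetical.

```python
import numpy as np

def mls_predict(x_query, X, y, theta=2.0):
    """Moving least-squares prediction with a linear basis and a Gaussian
    weight centred at the query point (generic illustration, not the AIRSM weight)."""
    d = np.linalg.norm(X - x_query, axis=1)
    w = np.exp(-(theta * d) ** 2)                 # nearby samples dominate the fit
    P = np.hstack([np.ones((X.shape[0], 1)), X])  # linear basis [1, x1, ..., xn]
    W = np.diag(w)
    beta = np.linalg.solve(P.T @ W @ P, P.T @ W @ y)   # locally weighted LS fit
    return np.concatenate(([1.0], x_query)) @ beta

# hypothetical limit-state function approximated from a small design of experiments
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(30, 2))
y = X[:, 0] ** 2 + np.sin(3 * X[:, 1])
print(mls_predict(np.array([0.5, 0.5]), X, y))
```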

3.
A new technique for approximating design sensitivities of the critical load is presented in this paper. The technique yields stable and reliable estimates of design sensitivities at prebuckling points. Since taking derivatives of an approximated eigenvalue problem gives unstable sensitivities as the point approaches the critical load, the sensitivities are instead approximated directly from the exact sensitivity expressions. The approximation applies two approaches commonly used in critical load estimation, known as the ‘one‐point’ and ‘two‐point’ approximations. The reliability and applicability of the proposed technique are demonstrated through several numerical examples of truss and beam structures. The two‐point approximation of design sensitivities gives better results than the one‐point approximation. Copyright © 1999 John Wiley & Sons, Ltd.

4.
In this paper, we introduce a novel approach in the nonconvex optimization framework for image restoration via a Markov random field (MRF) model. While image restoration is elegantly expressed in the language of MRFs, the resulting energy minimization problem has been widely viewed as intractable: it exhibits a highly nonsmooth, nonconvex energy function with many local minima and is known to be NP-hard. The main goal of this paper is to develop fast and scalable approximate optimization approaches for a nonsmooth nonconvex MRF model corresponding to an MRF with a truncated quadratic (also known as half-quadratic) prior. To this end, we use difference of convex functions (DC) programming and the DC algorithm (DCA), a fast and robust approach in smooth/nonsmooth nonconvex programming that has been successfully applied in various fields in recent years. We propose two DC formulations and investigate the two corresponding versions of DCA. Numerical simulations show the efficiency, reliability and robustness of our customized DCAs with respect to the standard GNC algorithm and the graph-cut-based method, a more recent and efficient approach to image analysis.
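The DCA iteration for a DC decomposition F = g - h alternates a subgradient step on the concave part with a convex minimization. The toy 1-D denoising problem below, a quadratic data term plus a truncated quadratic penalty on differences, is only a hypothetical illustration of the scheme and not one of the paper's two formulations.

```python
import numpy as np

def dca_truncated_quadratic(f, lam=1.0, T=0.5, iters=50):
    """DCA for a toy 1-D restoration problem
        min_u 0.5*||u - f||^2 + lam * sum_i min((u[i+1] - u[i])^2, T)
    using the DC split  min(t^2, T) = t^2 - max(t^2 - T, 0)."""
    n = len(f)
    D = np.diff(np.eye(n), axis=0)            # forward-difference operator
    A = np.eye(n) + 2 * lam * D.T @ D         # Hessian of the convex part g
    u = f.copy()
    for _ in range(iters):
        d = D @ u
        active = d ** 2 > T                   # differences where the penalty truncates
        y = 2 * lam * D.T @ (d * active)      # subgradient of h(u)
        u = np.linalg.solve(A, f + y)         # convex subproblem: min g(u) - <y, u>
    return u

# hypothetical piecewise-constant signal corrupted by noise
rng = np.random.default_rng(1)
clean = np.r_[np.zeros(50), np.ones(50)]
noisy = clean + 0.2 * rng.standard_normal(100)
print(np.round(dca_truncated_quadratic(noisy)[45:55], 2))
```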

5.
Approximate estimation of system reliability via fault trees
In this article, we show how fault tree analysis, carried out by means of binary decision diagrams (BDD), is able to approximate the reliability of systems made of independent repairable components with good accuracy and efficiency. We consider four algorithms: the Murchland lower bound, the Barlow-Proschan lower bound, the Vesely full approximation and the Vesely asymptotic approximation. For each of these algorithms, we consider an implementation based on the classical minimal cut sets/rare events approach and another relying on BDD technology. We present numerical results obtained with both approaches on various examples.
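As a point of reference for the minimal-cut-sets/rare-events route mentioned above, the sketch below computes the classical rare-events approximation of system unavailability from component unavailabilities; the component data and cut sets are hypothetical, and the BDD-based algorithms themselves are not reproduced.

```python
from itertools import combinations
from math import prod

def unavailability(lmbda, mu):
    """Asymptotic unavailability of a repairable component with constant
    failure rate lmbda and repair rate mu."""
    return lmbda / (lmbda + mu)

def rare_event_unavailability(cut_sets, q):
    """Rare-events approximation of system unavailability from minimal cut
    sets: sum over cut sets of the product of component unavailabilities."""
    return sum(prod(q[c] for c in cs) for cs in cut_sets)

# hypothetical 2-out-of-3 system: any two failed components fail the system
q = {c: unavailability(1e-3, 1e-1) for c in "ABC"}
cut_sets = [set(cs) for cs in combinations("ABC", 2)]
print("rare-events system unavailability:", rare_event_unavailability(cut_sets, q))
```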

6.
In the broadest sense, reliability is a measure of performance of systems. As systems have grown more complex, the consequences of their unreliable behavior have become severe in terms of cost, effort, lives, etc., and interest in assessing system reliability and the need to improve the reliability of products and systems have become very important. Most solution methods for reliability optimization assume that systems have redundant components in series and/or parallel configurations and that alternative designs are available. Reliability optimization problems concentrate on the optimal allocation of redundant components and the optimal selection of alternative designs to meet system requirements. In the past two decades, numerous reliability optimization techniques have been proposed. Generally, these techniques can be classified as linear programming, dynamic programming, integer programming, geometric programming, heuristic methods, the Lagrangian multiplier method and so on. A genetic algorithm (GA), as a soft computing approach, is a powerful tool for solving various reliability optimization problems. In this paper, we briefly survey GA-based approaches for various reliability optimization problems, such as reliability optimization of redundant systems, reliability optimization with alternative designs, reliability optimization with time-dependent reliability, reliability optimization with interval coefficients, bicriteria reliability optimization, and reliability optimization with fuzzy goals. We also introduce hybrid approaches that combine GA with fuzzy logic, neural networks and other conventional search techniques. Finally, we present experiments on examples of various reliability optimization problems using the hybrid GA approach.
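A minimal GA sketch for a series-system redundancy allocation problem, shown only to make the encoding and fitness ideas concrete; the component data, penalty handling, and GA settings are hypothetical and much simpler than the hybrid approaches surveyed.

```python
import random

# hypothetical series system of 3 subsystems: component reliability and cost
r = [0.80, 0.85, 0.90]
cost = [2.0, 3.0, 1.5]
BUDGET, MAX_RED = 20.0, 5

def system_reliability(n):
    """Series system of parallel-redundant subsystems with n[i] components each."""
    rel = 1.0
    for ri, ni in zip(r, n):
        rel *= 1.0 - (1.0 - ri) ** ni
    return rel

def fitness(n):
    total_cost = sum(c * k for c, k in zip(cost, n))
    penalty = max(0.0, total_cost - BUDGET)        # penalize budget violation
    return system_reliability(n) - 0.1 * penalty

def ga(pop_size=40, generations=100):
    pop = [[random.randint(1, MAX_RED) for _ in r] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(r))       # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:               # mutation
                child[random.randrange(len(r))] = random.randint(1, MAX_RED)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = ga()
print(best, round(system_reliability(best), 4))
```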

7.
Robust design, axiomatic design, and reliability‐based design provide effective approaches to dealing with quality problems, and their integration can achieve better quality improvement. An integrated design optimization framework combining robust design, axiomatic design, and reliability‐based design is proposed in this paper. First, the fitted response model of each quality characteristic is obtained by response surface methodology, and the mean square error (MSE) is estimated by a second‐order Taylor series expansion. Then the multiple-quality-characteristics robust design model is developed using the MSE criterion. Finally, the independence-axiom constraints for decoupling and the reliability constraints are integrated into the multiple-quality-characteristics robust design model, and the integrated design optimization framework is formulated, where the weighted Tchebycheff approach is adopted to solve the multiple-objective program. An illustrative example is presented at the end, and the results show that the proposed approach can obtain better trade‐offs among conflicting quality characteristics, variability, coupling degree and reliability requirements. Copyright © 2011 John Wiley & Sons, Ltd.
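To make the MSE criterion concrete, the sketch below propagates input variability through a fitted quadratic response by Taylor expansion and forms MSE = bias² + variance; the response model, targets, and numbers are hypothetical, and the axiomatic and reliability constraints of the proposed framework are omitted.

```python
import numpy as np

def mse_criterion(y, grad, hess, mu, sigma, target):
    """MSE criterion for one quality characteristic: the response mean uses a
    second-order Taylor expansion about the design mean mu, the variance a
    first-order one, assuming independent input deviations of std sigma."""
    mean = y(mu) + 0.5 * np.sum(sigma ** 2 * np.diag(hess(mu)))
    var = np.sum((grad(mu) * sigma) ** 2)
    return (mean - target) ** 2 + var

# hypothetical fitted second-order response y = 10 + 2*x1 - x2 + 0.5*x1*x2 + x1^2
y = lambda x: 10 + 2 * x[0] - x[1] + 0.5 * x[0] * x[1] + x[0] ** 2
grad = lambda x: np.array([2 + 0.5 * x[1] + 2 * x[0], -1 + 0.5 * x[0]])
hess = lambda x: np.array([[2.0, 0.5], [0.5, 0.0]])
print(mse_criterion(y, grad, hess, mu=np.array([0.2, 1.0]),
                    sigma=np.array([0.1, 0.1]), target=10.0))
```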

8.
A typical reliability-based design optimization (RBDO) problem is usually formulated as a stochastic optimization model in which the performance of a system is optimized while the reliability requirements are satisfied. Most existing RBDO methods divide the problem into two sub-problems: one concerns reliability analysis, the other optimization. Traditional approaches nest the two sub-problems, with the reliability analysis as the inner loop and the optimization as the outer loop. Such nested approaches face prohibitive computational expense, which has driven recent research toward decoupling the two loops or even transforming the two-loop structure into a single deterministic optimization problem. While promising, the potential issue with these computationally efficient approaches is lowered accuracy. In this paper, a new decoupled approach, which performs the two loops sequentially, is proposed. First, a deterministic optimization problem is solved to locate the means of the uncertain design variables. After the mean values are determined, the reliability analysis is performed. A new deterministic optimization problem is then restructured, with a penalty added to each limit-state function, to improve the solution iteratively. Most existing research on decoupled approaches linearizes the limit-state functions or introduces the penalty into the limit-state functions, which may suffer from approximation error. In this research, the penalty term instead changes the right-hand-side (RHS) value of the deterministic constraints. Without linearizing or transforming the limit-state function, this penalty-based approach effectively improves the accuracy of RBDO. Comparison experiments illustrate how the proposed method obtains improved solutions with acceptable computational cost when compared to other RBDO approaches collected from the literature.
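A bare-bones sketch of the sequential idea described above: solve a deterministic optimization, estimate constraint reliability at the solution, and shift the right-hand side of a violated constraint by a penalty before re-solving. The problem, the reliability estimator (plain Monte Carlo), and the penalty update rule are hypothetical simplifications, not the paper's formulation.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
SIGMA, TARGET_REL = 0.3, 0.99

def g(x):                        # limit-state function: safe when g(x) >= 0
    return x[0] + x[1] - 5.0

def reliability(mu, n=20_000):   # crude Monte Carlo around the design mean
    samples = mu + SIGMA * rng.standard_normal((n, 2))
    return np.mean(g(samples.T) >= 0)

def deterministic_opt(rhs):
    """Minimize cost subject to the deterministic constraint g(x) >= rhs."""
    res = minimize(lambda x: x[0] ** 2 + x[1] ** 2, x0=[3.0, 3.0],
                   constraints=[{"type": "ineq", "fun": lambda x: g(x) - rhs}])
    return res.x

rhs = 0.0
for cycle in range(10):                     # sequential RBDO cycles
    mu = deterministic_opt(rhs)
    rel = reliability(mu)
    print(f"cycle {cycle}: mean design {np.round(mu, 3)}, reliability {rel:.4f}")
    if rel >= TARGET_REL:
        break
    rhs += 0.5 * SIGMA                      # penalty: tighten the RHS and re-solve
```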

9.
The robust parameter design of industrial processes and products, based on the concept of building quality into a design, has attracted much attention from researchers and practitioners for many years, and several methods have been studied in the research community. Dual response surface methodology is one of the most commonly used approaches for simultaneously optimizing the mean and the variance of a response in quality engineering. Nevertheless, when the relationship between influential input factors and output quality characteristics of a process is very complex (e.g. highly nonlinear and noisy), traditional approaches have their limitations. In this article, we introduce support vector regression, the kriging model, and radial basis functions, which are commonly used in computer experiments, into robust parameter design, and in particular introduce a new strategy that builds the dual response surface using an ensemble of surrogates, which can provide a more robust approximation model. We demonstrate the advantages of kriging, support vector regression, radial basis functions, and the ensemble of surrogates by reinvestigating the dual response approach on the basis of parametric, nonparametric, and semiparametric approaches, and a simulation experiment is studied. The results show that the presented models achieve more desirable results than parametric, nonparametric, and semiparametric approaches in terms of fitting and predictive accuracy, and the optimal operating conditions recommended by the presented models are similar to those recommended in the literature, which validates the presented models. Copyright © 2012 John Wiley & Sons, Ltd.

10.
This paper describes a Monte Carlo (MC) simulation methodology for estimating the reliability of a multi-state network. The problem under consideration involves multi-state two-terminal reliability (M2TR) computation. Previous approaches have relied on enumeration or on the computation of multi-state minimal cut vectors (MMCV) and the application of inclusion/exclusion formulae. This paper discusses issues related to the reliability calculation process based on MMCV. For large systems with even a relatively small number of component states, reliability computation can become prohibitive or inaccurate using current methods. The major focus of this paper is to present and compare a new MC simulation approach that obtains accurate approximations to the actual M2TR. The methodology uses MC to generate system state vectors. Once a vector is obtained, it is compared to the set of MMCV to determine whether the capacity of the vector satisfies the required demand. Examples are used to illustrate and validate the methodology. The estimates of the simulation approach are compared to exact and approximation procedures from solution quality and computational effort perspectives. Results obtained from the simulation approach show that, for relatively large networks, the maximum absolute relative error between the simulation and the actual M2TR is less than 0.9%, whereas with approximation formulae this error can be as large as 18.97%. Finally, the paper shows that the MC approach consistently yields accurate results, while the accuracy of the bounding methodologies can depend on components that have considerable impact on the system design.
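A compact illustration of the simulation loop described above: sample a component-state vector from the state probabilities, then decide whether it meets the demand. The paper compares sampled vectors against MMCV; for simplicity, the sketch checks demand feasibility through the min-cut capacity of the sampled vector (equivalent by max-flow/min-cut) on a small hypothetical network, so the network, probabilities, and demand are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# hypothetical 4-arc two-terminal network: a: s->1, b: s->2, c: 1->t, d: 2->t;
# each arc has capacity states 0..3 with the probabilities below
state_probs = {
    "a": [0.05, 0.10, 0.25, 0.60],
    "b": [0.05, 0.15, 0.30, 0.50],
    "c": [0.10, 0.10, 0.30, 0.50],
    "d": [0.05, 0.10, 0.25, 0.60],
}
minimal_cuts = [{"a", "b"}, {"c", "d"}, {"a", "d"}, {"b", "c"}]  # s-t minimal cuts
DEMAND = 4

def satisfies_demand(state):
    """Max-flow/min-cut check: demand is met iff every cut carries enough capacity."""
    return all(sum(state[arc] for arc in cut) >= DEMAND for cut in minimal_cuts)

def m2tr_estimate(n_samples=20_000):
    hits = 0
    for _ in range(n_samples):
        state = {arc: rng.choice(4, p=p) for arc, p in state_probs.items()}
        hits += satisfies_demand(state)
    return hits / n_samples

print("estimated M2TR:", m2tr_estimate())
```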

11.
In this paper, we describe (1) the characteristics of ice-water two-phase flow created in a supercooling-type ice storage system, (2) numerical analysis models for both the ice storing and ice melting processes within ice storage tanks, and (3) design technologies for ice storage systems based on such predictive models, as well as the operational performance of actual equipment built with these design technologies. As a result of this technological development, it has been confirmed that the predictive approaches for one-dimensional ice accumulation in the given tanks and for ice melting with water cold enough for air-conditioning applications are reasonable and widely usable. Furthermore, the design conditions obtained here make it possible to simplify the construction of ice storage tanks and to realize high reliability in the actual equipment.

12.
Zhangli Hu 《工程优选》2019,51(1):101-119
Sequential optimization and reliability analysis (SORA) is an efficient approach to reliability-based design (RBD). It decouples the double-loop structure of RBD into a serial cycle of deterministic optimization and reliability analysis. The first-order approximation is used in SORA for reliability analysis owing to its good balance between accuracy and efficiency. However, it may result in a large error when a constraint function is highly nonlinear. This study proposes a new numerical method so that second-order approximations can be used in the reliability analysis for higher accuracy. To minimize the increased computational cost due to second-order approximations, this study also develops an efficient algorithm for searching for an equivalent reliability index with the help of the saddlepoint approximation. The efficiency and accuracy of the proposed method are verified through numerical examples.

13.
The approximation of general shell problems by flat plate elements is very popular among engineers. These methods share as common features a nonconforming approximation of the geometry of the considered shell using facet elements and a pseudo-conforming approximation of the displacement components, i.e., an approximation using conforming plate elements over every flat element. In this work, we analyze the compatibility conditions that have to be satisfied by the degrees of freedom at every node of the triangulation. Next, we obtain several interesting results valid for general shells and prove the “pseudo-convergence” of the method for a class of shallow shells. This careful study then allows us to introduce a perturbation of the bending terms on each facet; the corresponding new method is convergent for arbitrary thin shells.

14.
The main goal of this contribution is the improvement of the approximation quality of least-squares mixed finite elements for static and dynamic problems in quasi-incompressible elasticity. Compared with other variational approaches, such as the Galerkin method, the main drawback of least-squares formulations is their unsatisfactory approximation quality in terms of accuracy and robustness. Lower-order elements are especially affected, see e.g. [33]. In order to circumvent these problems, we introduce overconstrained first-order systems with suitable weights. We consider different mixed least-squares formulations depending on stresses and displacements with at most cubic polynomial interpolation. For the continuous approximation of the stresses Raviart–Thomas elements are used, while for the displacements standard conforming elements are employed. Some numerical benchmarks are presented in order to validate the performance and efficiency of the proposed formulations.

15.
Robust parameter designs are widely used to produce products/processes that perform consistently well across various conditions known as noise factors. Recently, the robust parameter design method has been implemented in computer experiments. The structure of the conventional product array design becomes unsuitable because of its large number of runs and its reliance on polynomial modeling. In this article, we propose a new framework, robust parameter design via stochastic approximation (RPD-SA), to efficiently optimize the robust parameter design criteria. It can be applied to general robust parameter design problems, but is particularly powerful in the context of computer experiments. It has the following four advantages: (1) fast convergence to the optimal product setting with fewer function evaluations; (2) incorporation of high-order effects of both design and noise factors; (3) adaptation to constrained irregular regions of operability; (4) no requirement for a statistical analysis phase. In the numerical studies, we compare RPD-SA to Monte Carlo sampling with Newton–Raphson-type optimization. An “Airfoil” example is used to compare the performance of RPD-SA, conventional product array designs, and space-filling designs with the Gaussian process. The studies show that RPD-SA has preferable performance in terms of effectiveness, efficiency and reliability.
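A toy finite-difference stochastic approximation (Kiefer-Wolfowitz) loop for minimizing an estimated quadratic-loss robustness criterion over the control factors, with the noise factors resampled at every function evaluation; the simulator, gain sequences, and criterion are hypothetical and only illustrate the kind of iteration a stochastic-approximation RPD framework performs, not RPD-SA itself.

```python
import numpy as np

rng = np.random.default_rng(3)
TARGET = 5.0

def simulator(x, z):
    """Hypothetical computer model: response depends on control factors x and
    noise factors z, including a control-by-noise interaction."""
    return x[0] + 2 * x[1] + 0.8 * x[0] * z[0] + 0.5 * z[1] + 0.1 * x[1] ** 2

def noisy_loss(x):
    """One-sample estimate of the robustness criterion E_z[(y - TARGET)^2]."""
    z = rng.standard_normal(2)
    return (simulator(x, z) - TARGET) ** 2

def kiefer_wolfowitz(x0, iters=500):
    """Finite-difference stochastic approximation with decaying gains."""
    x = np.array(x0, dtype=float)
    for k in range(1, iters + 1):
        a_k = 0.05 / k                 # gain sequence
        c_k = 0.1 / k ** 0.25          # perturbation size
        grad = np.zeros_like(x)
        for i in range(len(x)):
            e = np.zeros_like(x)
            e[i] = c_k
            grad[i] = (noisy_loss(x + e) - noisy_loss(x - e)) / (2 * c_k)
        x -= a_k * grad
    return x

print("recommended control settings:", np.round(kiefer_wolfowitz([1.0, 1.0]), 3))
```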

16.
We consider the problem of estimating multicomponent stress-strength (MSS) reliability under progressive Type II censoring when the stress and strength variables follow unit Gompertz distributions with a common scale parameter. We estimate MSS reliability under frequentist and Bayesian approaches. Bayes estimates are obtained using the Lindley approximation and the Metropolis-Hastings algorithm. Further, we obtain uniformly minimum variance unbiased estimates of the reliability when the common scale parameter is known. Asymptotic and bootstrap confidence intervals and highest posterior density credible intervals are constructed. We perform Monte Carlo simulations to compare the performance of the proposed estimates and present a discussion. Finally, three real data sets are analyzed for illustrative purposes.
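As a reminder of the Bayesian machinery mentioned above, the sketch below runs a random-walk Metropolis-Hastings chain for one shape parameter of a complete (uncensored) unit Gompertz sample under a vague gamma prior; the data, prior, and single-parameter setting are hypothetical simplifications of the multicomponent stress-strength problem.

```python
import numpy as np

rng = np.random.default_rng(4)

def loglik_unit_gompertz(alpha, lam, x):
    """Log-likelihood of the unit Gompertz density
    f(x) = alpha*lam * x^(-(lam+1)) * exp(-alpha*(x^(-lam) - 1)),  0 < x < 1."""
    return (len(x) * np.log(alpha * lam)
            - (lam + 1) * np.sum(np.log(x))
            - alpha * np.sum(x ** (-lam) - 1))

def log_post(alpha, lam, x, a0=1.0, b0=1.0):
    """Posterior for alpha with lam treated as known and a Gamma(a0, b0) prior."""
    if alpha <= 0:
        return -np.inf
    return loglik_unit_gompertz(alpha, lam, x) + (a0 - 1) * np.log(alpha) - b0 * alpha

def metropolis_hastings(x, lam, n_iter=5000, step=0.3):
    alpha, chain = 1.0, []
    lp = log_post(alpha, lam, x)
    for _ in range(n_iter):
        prop = alpha + step * rng.standard_normal()   # random-walk proposal
        lp_prop = log_post(prop, lam, x)
        if np.log(rng.uniform()) < lp_prop - lp:      # accept/reject step
            alpha, lp = prop, lp_prop
        chain.append(alpha)
    return np.array(chain)

# hypothetical strength sample on (0, 1)
data = rng.uniform(0.2, 0.95, size=40)
chain = metropolis_hastings(data, lam=1.5)
print("posterior mean of alpha:", chain[1000:].mean())
```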

17.
This paper deals with integrated facility location and supplier selection decisions for the design of a supply chain network with reliable and unreliable suppliers. Two problems are addressed: (1) facility location/supplier selection; and (2) facility location/supplier reliability. We first consider the facility location and supplier selection problem where all the suppliers are reliable. The decisions concern the selection of suppliers, the location of distribution centres (DCs), the allocation of suppliers to DCs and the allocation of retailers to DCs. The objective is to minimise fixed DC location costs, inventory and safety stock costs at the DCs, and ordering and transportation costs across the network. The introduction of inventory costs and safety stock costs leads to a non-linear NP-hard optimisation problem. To solve this problem, a Lagrangian relaxation-based approach is developed. For the second problem, a two-period decision model is proposed in which the selected suppliers are reliable in the first period and can fail in the second period. The corresponding facility location/supplier reliability problem is formulated as a non-linear stochastic programming problem. A Monte Carlo optimisation approach combining the sample average approximation scheme and the Lagrangian relaxation-based approach is proposed. Computational results are presented to evaluate the efficiency of the proposed approaches.

18.
Zhen Hu 《工程优选》2016,48(8):1296-1312
Time-dependent reliability-based design ensures the satisfaction of reliability requirements over a given period of time, but at a high computational cost. This work improves the computational efficiency by extending the sequential optimization and reliability analysis (SORA) method to time-dependent problems with both stationary stochastic process loads and random variables. The challenge of the extension is the identification of the most probable point (MPP) associated with time-dependent reliability targets. Since a direct relationship between the MPP and the reliability target does not exist, this work defines the concept of the equivalent MPP, which is identified by extreme value analysis and the inverse saddlepoint approximation. With the equivalent MPP, the time-dependent reliability-based design optimization is decomposed into two decoupled loops, deterministic design optimization and reliability analysis, and both are performed sequentially. Two numerical examples are used to show the efficiency of the proposed method.

19.
Teschke M  Sinzinger S 《Applied optics》2008,47(26):4767-4776
We report novel approaches to the design of halftone masks for analog lithography. The approaches are derived from interferometric phase contrast. In a first step, we show that the interferometric phase-contrast method with detour holograms can be reduced to a single binary mask. In a second step, we introduce the interferometric phase-contrast method based on interference of the object wavefront with the conjugate object wavefront. This method also allows for the design of a halftone mask. In a third step, to use kinoform holograms as halftone phase masks, we show the combination of the zeroth-order phase-contrast technique with the interferometric phase-contrast method.

20.
Combining reliability-based design optimization theory with reliability sensitivity analysis, the robust optimization design of mechanical components is discussed. Reliability sensitivity formulas based on the saddlepoint approximation are derived systematically, the computed reliability sensitivities are incorporated into the reliability-based robust design optimization model, and the reliability-based robust design optimization is formulated as a multi-objective optimization problem that satisfies the reliability requirements. Under the premise that the probability distributions of the basic random parameters are known, the saddlepoint approximation technique is applied to obtain the cumulative distribution function and probability density function of the limit-state function, and this result is applied to the reliability sensitivity analysis of mechanical components, thereby realizing their reliability-based robust design optimization. Comparison with results computed by the Monte Carlo method shows that the saddlepoint approximation technique can quickly and accurately provide reliability-based robust design information for mechanical components.
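To illustrate the saddlepoint step used above, the sketch below applies the Lugannani-Rice formula to approximate the CDF of a limit-state function g = R - S with gamma-distributed strength and stress, and checks the failure probability against crude Monte Carlo; the limit state and its parameters are hypothetical, not taken from the paper.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import norm, gamma

# hypothetical limit state g = R - S with R ~ Gamma(k1, th1), S ~ Gamma(k2, th2)
k1, th1, k2, th2 = 9.0, 1.0, 4.0, 1.0

def K(t):      # cumulant generating function of g = R - S
    return -k1 * np.log(1 - th1 * t) - k2 * np.log(1 + th2 * t)

def K1(t):     # first derivative of K
    return k1 * th1 / (1 - th1 * t) - k2 * th2 / (1 + th2 * t)

def K2(t):     # second derivative of K
    return k1 * th1 ** 2 / (1 - th1 * t) ** 2 + k2 * th2 ** 2 / (1 + th2 * t) ** 2

def saddlepoint_cdf(y):
    """Lugannani-Rice approximation of P(g <= y)."""
    eps = 1e-8
    t = brentq(lambda s: K1(s) - y, -1 / th2 + eps, 1 / th1 - eps)  # saddlepoint
    w = np.sign(t) * np.sqrt(2 * (t * y - K(t)))
    v = t * np.sqrt(K2(t))
    return norm.cdf(w) + norm.pdf(w) * (1 / w - 1 / v)

# failure probability P(g < 0): saddlepoint vs. crude Monte Carlo
pf_spa = saddlepoint_cdf(0.0)
rng = np.random.default_rng(5)
g_samples = (gamma.rvs(k1, scale=th1, size=200_000, random_state=rng)
             - gamma.rvs(k2, scale=th2, size=200_000, random_state=rng))
print(f"saddlepoint Pf = {pf_spa:.5f}, Monte Carlo Pf = {np.mean(g_samples < 0):.5f}")
```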
