This paper presents a new global optimization algorithm, named MGOSIC, for solving unconstrained expensive black-box optimization problems. In MGOSIC, three surrogate models, Kriging, Radial Basis Function (RBF), and Quadratic Response Surfaces (QRS), are constructed dynamically. A multi-point infill criterion is proposed to obtain new points in each cycle, in which a score-based strategy marks cheap points generated by Latin hypercube sampling: according to their predicted values under the three surrogate models, the promising cheap points are assigned different scores. To obtain diverse samples, a Max-Min approach is proposed to select promising sample points from the sets of cheap points with higher scores. Simultaneously, the best solutions predicted by Kriging, RBF, and QRS are also recorded as supplementary samples. Once MGOSIC gets stuck in a local valley, the estimated mean square error of Kriging is maximized to explore the sparsely sampled regions. Moreover, the whole optimization is carried out alternately in the global space and a reduced space. In summary, MGOSIC not only brings a new idea for multi-point sampling but also strikes a reasonable balance between exploitation and exploration. Finally, 19 mathematical benchmark cases and an engineering application of hydrofoil optimization are used to test MGOSIC, and seven existing global optimization algorithms are tested for comparison. The final results show that MGOSIC achieves high efficiency, strong stability, and superior multi-point sampling capability in dealing with expensive black-box optimization problems.
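The Max-Min selection step described above can be illustrated with a short sketch. This is a generic greedy max-min diversity rule, not the authors' implementation: each new point is the candidate farthest from everything already selected, so the chosen batch spreads out rather than clustering.

```python
import numpy as np

def max_min_select(candidates, existing, k):
    """Greedily pick k candidates that maximize the minimum distance
    to the already-selected samples (a generic max-min rule; the exact
    MGOSIC criterion may differ)."""
    selected = list(existing)
    chosen = []
    for _ in range(k):
        # distance from each candidate to its nearest selected point
        d = np.min(
            np.linalg.norm(
                candidates[:, None, :] - np.asarray(selected)[None, :, :],
                axis=2,
            ),
            axis=1,
        )
        best = int(np.argmax(d))
        chosen.append(candidates[best])
        selected.append(candidates[best])  # distance to itself becomes 0
    return np.array(chosen)

# usage: pick 3 diverse points from a set of high-scoring cheap points
rng = np.random.default_rng(0)
cands = rng.random((50, 2))
existing = [np.array([0.5, 0.5])]
picked = max_min_select(cands, existing, 3)
```

Because a selected candidate's distance to itself is zero afterward, the same point is never chosen twice.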
Similar articles:

This paper presents a sequential surrogate model method for reliability-based optimization (SSRBO), which aims to reduce the number of expensive black-box function calls in reliability-based optimization. The proposed method consists of three key steps. First, initial samples are selected to construct radial basis function surrogate models for the objective and constraint functions, respectively. Second, by solving a series of special optimization problems on the surrogate models, local samples are identified and added in the vicinity of the current optimal point to refine the surrogates. Third, the current optimal point is obtained by solving the optimization problem with shifted constraints. At the current optimal point, Monte Carlo simulation based on the surrogate models is carried out to obtain the cumulative distribution functions (CDFs) of the constraints; the CDFs and target reliabilities are then used to update the constraint offsets for the next iteration. The original problem is thus decomposed into a series of cheap surrogate-based deterministic problems and Monte Carlo simulations. Several examples are adopted to verify SSRBO. The results show that, compared with the alternative methods, the number of expensive black-box function calls is reduced exponentially without loss of precision, which illustrates the efficiency and accuracy of the proposed method.
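The shifted-constraint loop can be sketched in a few lines. All names and the update rule here are illustrative stand-ins, not SSRBO itself: a deterministic subproblem min f(x) s.t. g(x) + s <= 0 is solved (by coarse grid search on a cheap 1-D toy), then the shift s is updated from a Monte Carlo reliability estimate at the optimum.

```python
import numpy as np

def f(x):          # cheap stand-in objective (hypothetical)
    return (x - 2.0) ** 2

def g(x):          # cheap stand-in constraint, feasible when g(x) <= 0
    return x - 1.5

def shifted_step(s, sigma=0.1, target_rel=0.99, n_mc=20000, seed=0):
    """One iteration of the shifted-constraint idea on a 1-D toy problem."""
    grid = np.linspace(0.0, 3.0, 3001)
    feasible = grid[g(grid) + s <= 0.0]           # shifted feasible set
    x_opt = feasible[np.argmin(f(feasible))]      # deterministic optimum
    # Monte Carlo reliability estimate under input noise around x_opt
    rng = np.random.default_rng(seed)
    samples = x_opt + sigma * rng.standard_normal(n_mc)
    rel = np.mean(g(samples) <= 0.0)
    # enlarge the shift when reliability misses the target (simple rule)
    s_new = s + sigma * (target_rel - rel)
    return x_opt, rel, s_new

x_opt, rel, s_new = shifted_step(s=0.0)
```

With s = 0 the optimum lands on the constraint boundary (x = 1.5), the estimated reliability is only about 0.5, and the shift grows, pushing the next iterate into the safe region, which is the qualitative behavior the abstract describes.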
In recent years, the importance of computationally efficient surrogate models has been emphasized as the use of high-fidelity simulation models increases. However, high-dimensional models require many samples for surrogate modeling. To reduce this computational burden, we propose an integrated algorithm that combines accurate variable selection and surrogate modeling. A main strength of the proposed method is that it requires fewer samples than conventional surrogate modeling methods, by excluding dispensable variables while maintaining model accuracy. In the proposed method, the importance of selected variables is evaluated using the quality of the model approximated with only those variables. Nonparametric probabilistic regression is adopted as the modeling method to handle the inaccuracy caused by modeling with selected variables only. In particular, Gaussian process regression (GPR) is used because its model performance indices are well suited to the variable selection criterion. Outstanding variables that yield distinctly superior model performance are finally selected as essential variables. The proposed algorithm utilizes a conservative selection criterion and appropriate sequential sampling to prevent incorrect variable selection and sample overuse. Its performance is verified with two test problems with challenging properties such as high dimension, nonlinearity, and the existence of interaction terms. A numerical study shows that the proposed algorithm is more effective as the fraction of dispensable variables increases.
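The idea of rating variables by the quality of the model built from them alone can be sketched as greedy forward selection. This is purely illustrative: the paper uses GPR and its own performance indices, whereas here a cheap least-squares fit and its validation error stand in for the model-quality score.

```python
import numpy as np

def model_error(X, y, cols):
    """Fit y ~ X[:, cols] (plus intercept) by least squares and return
    the mean-squared residual as a stand-in model-quality score."""
    A = np.column_stack([X[:, cols], np.ones(len(X))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return float(np.mean((A @ coef - y) ** 2))

def select_variables(X, y, tol=1e-6):
    """Greedy forward selection: repeatedly add the variable that most
    reduces the model error; stop when improvement falls below tol."""
    remaining = list(range(X.shape[1]))
    chosen = []
    err = float(np.mean((y - y.mean()) ** 2))   # error of constant model
    while remaining:
        trials = [(model_error(X, y, chosen + [j]), j) for j in remaining]
        best_err, best_j = min(trials)
        if err - best_err < tol:                # dispensable variables left
            break
        chosen.append(best_j)
        remaining.remove(best_j)
        err = best_err
    return chosen

# toy check: y depends only on columns 0 and 2; column 1 is dispensable
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 3))
y = 2.0 * X[:, 0] - 1.0 * X[:, 2]
picked = select_variables(X, y)
```

On this toy data the dispensable column never improves the score, so it is excluded, mirroring the sample-saving behavior the abstract claims.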