Efficient multi-criteria optimization on noisy machine learning problems |
| |
Affiliation: | 1. Cologne University of Applied Sciences, Germany; 2. Institut für Spanende Fertigung, TU Dortmund, Germany; 3. Leiden Institute of Advanced Computer Science, Netherlands |
| |
Abstract: |  Recent research revealed that model-assisted parameter tuning can improve the quality of supervised machine learning (ML) models. In particular, the tuned models were found to generalize better and to be more robust than models obtained with other optimization approaches. However, the advantages of tuning often come with high computation times, posing a real burden when employing tuning algorithms. Training with a reduced number of patterns can mitigate this, but it typically decreases model accuracy and increases instability and noise. Hence, we propose a novel approach defined by a two-criteria optimization task, in which both the runtime and the quality of ML models are optimized. Because the budgets for this optimization task are usually very restricted in ML, the surrogate-assisted Efficient Global Optimization (EGO) algorithm is adapted. In order to cope with noisy experiments, we apply two hypervolume-indicator-based EGO algorithms with smoothing and re-interpolation of the surrogate models. These techniques do not need replicates. We find that the EGO techniques can outperform traditional approaches such as Latin hypercube sampling (LHS), as well as EGO variants with replicates. |
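The approach the abstract outlines can be sketched in a few dozen lines of NumPy: a toy noisy bi-objective problem (runtime vs. error as a function of the training-data fraction) stands in for real ML tuning, a Gaussian-kernel Kriging model with a nugget term stands in for the paper's smoothed/re-interpolated surrogates, and the infill criterion greedily maximizes the predicted hypervolume gain. This is a minimal illustration of the hypervolume-based EGO idea, not the authors' exact algorithm; the objective functions, kernel length scale, nugget, and budget are all hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def noisy_objectives(x):
    """Toy stand-in for noisy ML tuning: x in [0, 1] is the fraction of
    training data used; runtime grows with x, error shrinks, both noisy."""
    runtime = x + 0.05 * rng.normal()
    error = (1.0 - x) ** 2 + 0.05 * rng.normal()
    return np.array([runtime, error])

def kriging_fit(X, y, length=0.2, nugget=1e-2):
    """Gaussian-kernel Kriging; the nugget term smooths observation noise
    (a crude substitute for the paper's smoothing/re-interpolation)."""
    K = np.exp(-((X[:, None] - X[None, :]) ** 2) / (2 * length**2))
    L = np.linalg.cholesky(K + nugget * np.eye(len(X)))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    def predict(xs):
        k = np.exp(-((xs[:, None] - X[None, :]) ** 2) / (2 * length**2))
        return k @ alpha
    return predict

def hypervolume_2d(front, ref):
    """Hypervolume of a 2-D minimization front w.r.t. a reference point."""
    pts = sorted(p for p in front if p[0] < ref[0] and p[1] < ref[1])
    nd, best_f2 = [], float("inf")
    for p in pts:                      # keep the nondominated subset
        if p[1] < best_f2:
            nd.append(p)
            best_f2 = p[1]
    hv = 0.0
    for i, (f1, f2) in enumerate(nd):
        next_f1 = nd[i + 1][0] if i + 1 < len(nd) else ref[0]
        hv += (next_f1 - f1) * (ref[1] - f2)
    return hv

# Initial design (evenly spaced here; an LHS would be the usual choice).
X = np.linspace(0.05, 0.95, 5)
Y = np.array([noisy_objectives(x) for x in X])
ref = (2.0, 2.0)

for _ in range(10):                    # sequential EGO-style infill
    pred = [kriging_fit(X, Y[:, j]) for j in (0, 1)]
    cand = rng.uniform(0.0, 1.0, 200)
    mu = np.column_stack([p(cand) for p in pred])
    base = hypervolume_2d(Y.tolist(), ref)
    gain = [hypervolume_2d(Y.tolist() + [m.tolist()], ref) - base for m in mu]
    x_new = cand[int(np.argmax(gain))]  # best predicted hypervolume gain
    X = np.append(X, x_new)
    Y = np.vstack([Y, noisy_objectives(x_new)])

print(len(X), hypervolume_2d(Y.tolist(), ref))
```

Note the simplification: the infill step ranks candidates by the hypervolume gain of the *posterior mean*, whereas the paper's EGO variants use a proper hypervolume-indicator criterion that also accounts for model uncertainty. No replicates are evaluated; the nugget alone absorbs the observation noise, mirroring the abstract's claim that the techniques do not need replicates.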
| |
Keywords: | Machine learning; Multi-criteria optimization; Efficient Global Optimization; Kriging; Hypervolume indicator |
This article is indexed in ScienceDirect and other databases.
|