     


Incorporation of a Regularization Term to Control Negative Correlation in Mixture of Experts
Authors: Saeed Masoudnia, Reza Ebrahimpour, Seyed Ali Asghar Abbaszadeh Arani
Affiliation:1. School of Mathematics, Statistics and Computer Science, University of Tehran, Tehran, Iran
2. Brain & Intelligent Systems Research Laboratory, Department of Electrical and Computer Engineering, Shahid Rajaee Teacher Training University, 16785-163, Tehran, Iran
3. School of Cognitive Sciences (SCS), Institute for Research in Fundamental Sciences (IPM), 19395-5746, Tehran, Iran
Abstract: Combining accurate neural networks (NNs) whose errors are negatively correlated in an ensemble greatly improves generalization ability. Mixture of experts (ME) is a popular combining method that employs a special error function to train NN experts simultaneously so that their errors become negatively correlated. Although ME can produce negatively correlated experts, unlike the negative correlation learning (NCL) method it offers no control parameter to adjust this correlation explicitly. In this study, an approach is proposed to introduce this advantage of NCL into the training algorithm of ME, called the mixture of negatively correlated experts (MNCE). In the proposed method, the control parameter of NCL is incorporated into the error function of ME, which enables its training algorithm to strike a better balance in the bias-variance-covariance trade-off and thus improves generalization. The proposed hybrid ensemble method, MNCE, is compared with its constituent methods, ME and NCL, on several benchmark problems. The experimental results show that the proposed ensemble method significantly outperforms the original ensemble methods.
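
The λ-controlled negative-correlation penalty that NCL contributes, and that MNCE incorporates into ME's error function, can be sketched in a few lines. The NumPy snippet below is a minimal illustration of such a penalty added to per-expert squared error; the function name and a simple-average combiner are assumptions for illustration only, not the paper's exact MNCE formulation, which weights experts through ME's gating network and trains experts and gate jointly.

import numpy as np

def ncl_penalized_errors(outputs, target, lam=0.5):
    """Per-expert squared error plus an NCL-style negative-correlation penalty.

    outputs : (M,) array of the M experts' outputs for one sample
    target  : scalar target for that sample
    lam     : control parameter; lam = 0 recovers independent training,
              larger values push the experts' errors toward negative correlation
    """
    ensemble = outputs.mean()              # simple-average combiner (illustrative)
    deviations = outputs - ensemble
    # NCL penalty p_i = (F_i - Fbar) * sum_{j != i} (F_j - Fbar);
    # with a simple mean the deviations sum to zero, so p_i = -(F_i - Fbar)^2
    penalties = deviations * (deviations.sum() - deviations)
    return 0.5 * (outputs - target) ** 2 + lam * penalties

Increasing lam penalizes experts whose errors deviate in the same direction as the ensemble output, which is what encourages negative error correlation; the abstract describes MNCE as exposing an analogous explicit parameter inside ME's own error function.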
This article is indexed in SpringerLink and other databases.