Strategies for improving neural net generalisation

Authors: Derek Partridge, Niall Griffith

Affiliation: Department of Computer Science, University of Exeter, Exeter EX4 4PT, UK

Abstract: We address the problem of training multilayer perceptrons to instantiate a target function. In particular, we explore the accuracy of the trained network on a test set of previously unseen patterns: the generalisation ability of the trained network. We systematically evaluate alternative strategies designed to improve generalisation performance. The basic idea is to generate a diverse set of networks, each of which is designed to be an implementation of the target function. We then have a set of trained alternative versions, a version set. The goal is to achieve useful diversity within this set, and thus generate the potential for improved generalisation performance of the set as a whole when compared to the performance of any individual version. We define this notion of useful diversity, define a metric for it, explore a number of ways of generating it, and present the results of an empirical study of strategies for exploiting it to achieve maximum generalisation performance. The strategies encompass statistical measures as well as a selector-net approach, which proves particularly promising. The selector net is a form of metanet that operates in conjunction with a version set.

Keywords: Multilayer perceptrons, Backpropagation, Generalisation, Generalisation diversity, Majority vote, Selector-net, Metanet, Version set
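
As an illustration of combining a version set by majority vote, as described in the abstract, the sketch below tallies the binary predictions of several independently built classifiers. It is a minimal, hypothetical Python example: the toy "versions" and the names `make_version` and `majority_vote` are illustrative stand-ins, not code or results from the paper, where each version would be a separately trained multilayer perceptron.

```python
# Hedged sketch: majority-vote combination of a "version set".
# The versions here are toy thresholded linear models standing in for
# separately trained MLPs; all names are illustrative, not from the paper.
import numpy as np

rng = np.random.default_rng(0)

def make_version(seed):
    """Return a toy 'trained network': a random linear binary classifier."""
    w = np.random.default_rng(seed).normal(size=2)
    return lambda x: (x @ w > 0).astype(int)

# A version set: several alternative implementations of the same target function.
version_set = [make_version(s) for s in range(5)]

def majority_vote(versions, x):
    """Combine the binary predictions of all versions by simple majority."""
    votes = np.stack([v(x) for v in versions])      # shape: (n_versions, n_patterns)
    return (votes.mean(axis=0) >= 0.5).astype(int)  # 1 if at least half vote 1

# Usage: evaluate the version set as a whole on a batch of unseen test patterns.
X_test = rng.normal(size=(10, 2))
print(majority_vote(version_set, X_test))
```

The point of the sketch is only the combination step: the set's prediction can be correct on patterns where some individual versions fail, provided the versions' errors are sufficiently diverse.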