     


The geometry of multi-layer perceptron solutions
Affiliation:

Gesellschaft für Mathematik und Datenverarbeitung (GMD), Postfach 1240, 5205, Sankt Augustin 1, FRG

Abstract:We geometrically classify multi-layer perceptron (MLP) solutions in two ways: the hyperplane partitioning interpretation and the hidden-unit representation of the pattern set. We show these classifications to be invariant under orthogonal transformations and translations in the space of the hidden units. These solutions can be enumerated for any given Boolean mapping problem. Using a geometrical argument, we derive the total number of solutions available to a minimal network for the parity problem. A lower bound is computed for the scaling of the number of solutions with input vector dimension when a fixed fraction of patterns is removed from the full training set. The generalization probability is shown to decrease exponentially with the problem size for the parity problem. We suggest that this, together with hidden-layer scaling problems, is a serious drawback to scaling up MLPs to larger tasks.
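As background for the minimal-network parity solutions discussed above, the following sketch shows one well-known construction (not necessarily the one enumerated in the paper): n hidden threshold units whose hyperplanes are parallel, the k-th unit firing when at least k inputs are on, combined with alternating output weights. All function and variable names here are illustrative.

```python
import itertools

def step(z):
    """Heaviside threshold unit."""
    return 1 if z > 0 else 0

def mlp_parity(x):
    """n-bit parity via a minimal MLP with n hidden threshold units.

    Hidden unit k (k = 1..n) implements the parallel hyperplane
    sum(x) - k + 0.5 = 0, i.e. it fires iff at least k inputs are on.
    Alternating output weights +1, -1, +1, ... then pick out odd counts.
    """
    n = len(x)
    s = sum(x)
    h = [step(s - k + 0.5) for k in range(1, n + 1)]
    out = sum(((-1) ** k) * h[k] for k in range(n))
    return step(out - 0.5)

# Verify against the Boolean parity mapping for n = 4
for x in itertools.product([0, 1], repeat=4):
    assert mlp_parity(x) == sum(x) % 2
```

If m inputs are on, exactly the first m hidden units fire, so the alternating sum is 1 for odd m and 0 for even m; this is one point in the solution space whose size the abstract's geometric argument counts.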
Keywords:Multi-layer perceptron; scaling; hyperplanes; complexity; generalization probability; minimal solutions
This article is indexed in ScienceDirect and other databases.
