Part 2: Multilayer perceptron and natural gradient learning
Authors: Hyeyoung Park
Affiliation:(1) Computer Science Dept., Kyungpook National University, Sankyuk-dong, Buk-gu, 702-701 Daegu, Korea
Abstract: Since the perceptron was developed for learning to classify input patterns, there have been many studies of simple perceptrons and multilayer perceptrons. Despite extensive study in both theory and applications, multilayer perceptrons still have unresolved problems, such as slow learning speed and overfitting. To solve these problems thoroughly, it is necessary to consolidate previous studies and to find new directions that strengthen the practical power of multilayer perceptrons. As a first step toward this new stage of research on multilayer perceptrons, we give short reviews of two important approaches: the stochastic approach and the geometric approach. We also explain an efficient learning algorithm developed from these statistical and geometrical studies, which is now well known as the natural gradient learning method.
Hyeyoung Park, Ph.D.: She is Assistant Professor of Computer Sciences at the School of Electrical Engineering and Computer Science of Kyungpook National University in Korea. She received her B.S., M.A., and Ph.D. from Yonsei University in Korea in 1994, 1996, and 2000. She also worked as a research scientist at the Brain Science Institute in RIKEN from 2000 to 2004. Her research interests are in learning theories and pattern recognition as well as statistical data analysis.
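The natural gradient method mentioned in the abstract replaces the ordinary gradient with the gradient premultiplied by the inverse Fisher information matrix, so that parameter updates follow the steepest descent direction in the Riemannian geometry of the model. A minimal sketch of the idea, not taken from the paper: fitting a univariate Gaussian by natural gradient descent on the negative log-likelihood, where the Fisher matrix is known in closed form. The model, step size, and iteration count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.5, size=1000)

def grad_nll(mu, sigma, x):
    # Gradient of the average negative log-likelihood of N(mu, sigma^2)
    # with respect to (mu, sigma).
    d_mu = -(x - mu).mean() / sigma**2
    d_sigma = 1.0 / sigma - ((x - mu) ** 2).mean() / sigma**3
    return np.array([d_mu, d_sigma])

def fisher(sigma):
    # Closed-form Fisher information matrix of N(mu, sigma^2)
    # in the (mu, sigma) parameterization.
    return np.array([[1.0 / sigma**2, 0.0],
                     [0.0, 2.0 / sigma**2]])

theta = np.array([0.0, 1.0])  # initial (mu, sigma)
lr = 0.1
for _ in range(200):
    g = grad_nll(theta[0], theta[1], data)
    # Natural gradient: solve G * v = g instead of using g directly.
    nat_g = np.linalg.solve(fisher(theta[1]), g)
    theta = theta - lr * nat_g

# theta now approximates the sample mean and (biased) sample std of the data.
```

Because the Fisher matrix rescales each coordinate by the local curvature of the likelihood, the update for mu reduces to a fixed fraction of the remaining error, which is the source of the speedup over plain gradient descent that the paper discusses.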
Keywords: Multilayer Perceptrons  Gradient Descent Learning  Backpropagation Learning  Natural Gradient  Singularity  Stochastic Neural Networks  Neuromanifold
This article is indexed by SpringerLink and other databases.