Representing Probabilistic Rules with Networks of Gaussian Basis Functions
Authors: Volker Tresp, Jürgen Hollatz, Subutai Ahmad
Affiliation: (1) Siemens AG, Central Research, 81730 München, Germany; (2) Interval Research Corporation, 1801-C Page Mill Rd., Palo Alto, CA 94304
Abstract: There is great interest in understanding the intrinsic knowledge neural networks acquire during training. Most work in this direction has focused on the multi-layer perceptron architecture. This paper addresses networks of Gaussian basis functions, which are used extensively as learning systems in neural computation. We show that networks of Gaussian basis functions can be generated from simple probabilistic rules and that, if appropriate learning rules are used, probabilistic rules can be extracted from trained networks. We present methods for reducing network complexity with the goal of obtaining concise and meaningful rules. We show how prior knowledge can be refined or supplemented with data by employing a Bayesian approach, by forming a weighted combination of knowledge bases, or by generating artificial training data that represents the prior knowledge. We validate our approach using a standard statistical data set.
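To make the core correspondence concrete, the following minimal sketch (not the authors' code) shows how a network of Gaussian basis functions is mathematically a mixture density p(x, y), how each fitted component can be read off as a probabilistic rule, and how the conditional expectation E[y | x] recovers a normalized Gaussian-basis-function regressor. It assumes scikit-learn's GaussianMixture; the data set and all variable names are hypothetical.

```python
# Sketch: Gaussian basis function network <-> mixture density <-> rules.
# Assumes scikit-learn; data and names are hypothetical illustrations.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Hypothetical 1-D regression data with two operating regimes.
x = np.concatenate([rng.normal(-2.0, 0.5, 200), rng.normal(2.0, 0.5, 200)])
y = np.where(x < 0, -1.0, 1.0) + rng.normal(0.0, 0.1, 400)
xy = np.column_stack([x, y])

# Fit a joint Gaussian mixture p(x, y); each component is one "rule".
gmm = GaussianMixture(n_components=2, covariance_type="diag", random_state=0)
gmm.fit(xy)

# Read each component as a probabilistic rule:
# "IF x is near mu_x THEN y is near mu_y (with prior weight pi_k)".
for k in range(gmm.n_components):
    mu_x, mu_y = gmm.means_[k]
    print(f"Rule {k}: IF x ~ {mu_x:.2f} THEN y ~ {mu_y:.2f} "
          f"(weight {gmm.weights_[k]:.2f})")

def predict(x_query):
    """E[y | x] of the mixture: a normalized Gaussian-basis-function network.

    The component responsibilities in x act as normalized basis activations,
    and the component means in y act as the output weights.
    """
    mu = gmm.means_                  # shape (K, 2): (mu_x, mu_y) per component
    var_x = gmm.covariances_[:, 0]   # diagonal covariance: x-variance per component
    act = (gmm.weights_
           * np.exp(-0.5 * (x_query - mu[:, 0]) ** 2 / var_x)
           / np.sqrt(var_x))         # unnormalized Gaussian activations in x
    return np.sum(act * mu[:, 1]) / np.sum(act)

print("E[y | x = 1.5] ~", round(predict(1.5), 2))
```

In the same spirit, the paper's third route for incorporating prior knowledge could be sketched by sampling artificial (x, y) pairs from rule-derived Gaussians and appending them to the training set before fitting.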
Keywords: neural networks, theory refinement, knowledge-based neural networks, probability density estimation, knowledge extraction, mixture densities, combining knowledge bases, Bayesian learning
This paper is indexed in SpringerLink and other databases.