
Sparse Connection Weights of Higher-Order Neural Networks and Their Pruning Algorithm
Cite this article: Li Shouli, Li Jinyan, Li Wangchao. Sparse Connection Weights of Higher-Order Neural Networks and Their Pruning Algorithm[J]. Journal of Electronics & Information Technology, 1999, 21(2): 182-185.
Authors: Li Shouli, Li Jinyan, Li Wangchao
Affiliations: Department of Computer Science, Hebei University of Technology, Tianjin 300130; The University of Melbourne, Australia; Department of Computer Science, Hebei University of Technology, Tianjin 300130
Abstract: This paper first studies the approximation capability of fully connected higher-order neural networks and proves that any Boolean function defined on {0,1}^N can be realized by a fully connected higher-order neural network. A pruning algorithm that removes redundant connection weights is then proposed to simplify the network structure, and it is applied to the sparse implementation of higher-order neural classifiers. Simulation results confirm the effectiveness of the algorithm.
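To make the fully connected higher-order neuron concrete, the sketch below (a minimal Python illustration, not taken from the paper) evaluates a neuron whose net input is a weighted sum over products of every non-empty subset of the inputs, followed by a hard-limit activation; the XOR weights shown are an illustrative assumption of how one Boolean function on {0,1}^2 can be realized in this form.

```python
# A minimal sketch (assumed, not taken from the paper) of a fully connected
# higher-order neuron on {0,1}^N: its net input is a weighted sum over products
# of every non-empty subset of the inputs, followed by a hard-limit activation.
from itertools import chain, combinations


def all_subsets(n):
    """Every non-empty index subset of {0, ..., n-1} (the 'fully connected' terms)."""
    return chain.from_iterable(combinations(range(n), k) for k in range(1, n + 1))


def higher_order_neuron(x, weights, bias=0.0):
    """Evaluate the neuron: x is a 0/1 tuple, weights maps index subsets to weights."""
    net = bias
    for subset in all_subsets(len(x)):
        w = weights.get(subset, 0.0)
        prod = 1
        for i in subset:
            prod *= x[i]
        net += w * prod
    return 1 if net > 0 else 0  # hard threshold


# Illustrative weights (an assumption) realizing XOR: x1 + x2 - 2*x1*x2 - 0.5 > 0
xor_weights = {(0,): 1.0, (1,): 1.0, (0, 1): -2.0}
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, "->", higher_order_neuron(x, xor_weights, bias=-0.5))
```

Since every Boolean function on {0,1}^N has a multilinear polynomial expansion in such subset products, a fully connected neuron of this form has one weight per monomial, which is the intuition behind the realization result stated in the abstract.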

Keywords: higher-order neural networks; redundant connection weights; sparse connection; pruning algorithm
Received: 1997-08-25
Revised: 1998-07-26

SPARSE CONNECTION WEIGHTS OF HIGHER-ORDER NEURAL NETWORKS AND THEIR PRUNING ALGORITHM
Li Shouli, Li Jinyan, Li Wangchao. Sparse Connection Weights of Higher-Order Neural Networks and Their Pruning Algorithm[J]. Journal of Electronics & Information Technology, 1999, 21(2): 182-185.
Authors: Li Shouli, Li Jinyan, Li Wangchao
Abstract: In this paper, the fully connected higher-order neuron and the sparse higher-order neuron are introduced, the mapping capability of fully connected higher-order neural networks is investigated, and it is proved that any Boolean function defined on {0,1}^N can be realized by a fully connected higher-order neural network. On this basis, a pruning algorithm that eliminates redundant connection weights is proposed to simplify the network architecture, and it can be applied to the implementation of a sparse higher-order neural classifier. Simulation results show the effectiveness of the algorithm.
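The paper's specific criterion for deciding that a connection weight is redundant is not reproduced here; as a hedged stand-in, the following sketch simply drops connections whose magnitude falls below a threshold, which illustrates how pruning turns a dense higher-order weight set into a sparse one. The function name prune_weights, the threshold value, and the sample weights are assumptions for illustration only.

```python
# A hedged sketch of weight pruning: the paper's own redundancy criterion is not
# reproduced here; dropping connections with |w| below a threshold is only an
# assumed, common stand-in that shows how a dense weight set becomes sparse.
def prune_weights(weights, threshold=1e-3):
    """Return a sparse copy of `weights`, keeping only connections with |w| >= threshold."""
    return {subset: w for subset, w in weights.items() if abs(w) >= threshold}


# Hypothetical dense weights over input subsets; two of them are near zero.
dense = {(0,): 0.9, (1,): -0.0004, (2,): 1.2, (0, 1): 0.00007, (0, 2): -1.1}
sparse = prune_weights(dense)
print(f"kept {len(sparse)} of {len(dense)} connections:", sparse)
```

The surviving dictionary keeps only the subsets whose weights carry significant magnitude, which is the sense in which the pruned higher-order classifier becomes sparsely connected.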
Keywords: higher-order neural networks; redundant connection weights; sparse connection; pruning algorithm