
Removing Hidden Neurons from Neural Networks via Least Squares Fitting
Citation: LIANG Xun. Removing Hidden Neurons from Neural Networks via Least Squares Fitting [J]. Microcomputer Development, 2005, 15(1): 4-7.
Author: LIANG Xun
Affiliation: Institute of Computer Science, Peking University, Beijing 100871, China
Abstract: A method for removing hidden neurons from a neural network based on least squares fitting of the hidden-node output row vectors is proposed. It proceeds in two steps. First, the relations among the orthogonal projections of the hidden-node output row vectors are analyzed, and least squares fitting is used to find the hidden-node output row vector that can be most accurately expressed by the other hidden-node output row vectors. Then, using the least squares fitting coefficients, the effect of that hidden node is distributed to the other hidden nodes by crosswise propagation. Finally, the hidden node is removed and the network is retrained. Experimental results show that the retraining time is very short, so the method is highly practical.

Keywords: least squares fitting  orthogonal projection  crosswise propagation  hidden neuron  removal
Article ID: 1005-3751(2005)01-0004-04
Revised: August 24, 2004

Removal of Hidden Neurons Based on Least Square Approximation
LIANG Xun. Removal of Hidden Neurons Based on Least Square Approximation [J]. Microcomputer Development, 2005, 15(1): 4-7.
Authors:LIANG Xun
Abstract: A method of removing hidden neurons based on least squares approximation among the hidden-layer output vectors is presented. It consists of two stages. First, by analyzing the relations among the orthogonal projections of the hidden-layer output vectors, the hidden neuron whose output vector can be best expressed by the other hidden output vectors is found. Then, based on the coefficients of the least squares approximation, the influence of the removed hidden neuron is distributed to the other hidden neurons. Finally, the new network is retrained, using the current weights and thresholds as initial values. Experimental results show that the retraining time is very short, and hence the method is very practical.
Keywords:least square approximation  orthogonal projection  crosswise propagation  hidden neuron  removal
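The pruning step described in the abstract can be sketched in NumPy. This is a minimal illustration, not the paper's implementation: the function name, interface, and matrix shapes are assumptions, and the retraining stage is omitted. For each hidden neuron, its output vector (one value per training sample) is least-squares fitted by the output vectors of the other neurons; the neuron with the smallest fitting error is removed, and its outgoing weights are redistributed to the remaining neurons using the fitting coefficients, so that the network output is preserved as closely as possible.

```python
import numpy as np

def prune_hidden_neuron(H, W_out):
    """Remove one hidden neuron via least-squares approximation.

    H     : (n_samples, n_hidden) hidden-layer outputs over the training set
    W_out : (n_hidden, n_out) hidden-to-output weight matrix
    Returns (index of removed neuron, adjusted weight matrix).
    """
    n_hidden = H.shape[1]
    best_j, best_err, best_c = None, np.inf, None
    for j in range(n_hidden):
        others = np.delete(H, j, axis=1)            # H without column j
        # least-squares fit: H[:, j] ~= others @ c
        c, *_ = np.linalg.lstsq(others, H[:, j], rcond=None)
        err = np.linalg.norm(others @ c - H[:, j])  # fitting error
        if err < best_err:
            best_j, best_err, best_c = j, err, c
    # Distribute the removed neuron's effect onto the remaining ones:
    # H[:, j] @ w_j ~= (others @ c) @ w_j, i.e. add c_i * w_j to each w_i.
    W_new = np.delete(W_out, best_j, axis=0)
    W_new += np.outer(best_c, W_out[best_j])
    return best_j, W_new
```

When one hidden output vector is an exact linear combination of the others, the redistribution preserves the network output exactly; otherwise a short retraining pass, as the paper describes, compensates for the small residual error.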
This article is indexed in CNKI, VIP, and other databases.