A Method for Constructing Decision Trees Based on Rough Set Theory
Cite this article: Yu Haiping, Zhu Yuquan, Chen Geng, Ou Jishun. A Method for Constructing Decision Trees Based on Rough Set Theory [J]. Computer Applications and Software, 2011, 28(2)
Authors: Yu Haiping  Zhu Yuquan  Chen Geng  Ou Jishun
Affiliation: 1. School of Computer Science and Telecommunications Engineering, Jiangsu University, Zhenjiang 212013, Jiangsu, China
2. Jiangsu Key Laboratory of Audit Information Engineering, Nanjing Audit University, Nanjing 210029, Jiangsu, China
Funding: Jiangsu Provincial "Qing Lan Project"; Six Talent Peaks Project of Jiangsu Province (07-E-025); Major Natural Science Research Project of Jiangsu Higher Education Institutions (08KJA520001)
Abstract: Attribute significance from rough set theory is used as the criterion for selecting test attributes when constructing a decision tree, yielding a new decision tree classification algorithm, S_D_Tree, whose time complexity for selecting a test attribute is O(|C||n|). Experimental results show that the algorithm builds a more compact decision tree and achieves better prediction accuracy than the C4.5 algorithm.

Keywords: Decision tree  Rough set  Attribute significance  Time complexity

A METHOD FOR CONSTRUCTING DECISION TREE BASED ON ROUGH SET
Yu Haiping, Zhu Yuquan, Chen Geng, Ou Jishun. A METHOD FOR CONSTRUCTING DECISION TREE BASED ON ROUGH SET [J]. Computer Applications and Software, 2011, 28(2)
Authors: Yu Haiping  Zhu Yuquan  Chen Geng  Ou Jishun
Affiliation: Yu Haiping (1), Zhu Yuquan (1), Chen Geng (2), Ou Jishun (1); 1. School of Computer Science and Telecommunications Engineering, Jiangsu University, Zhenjiang 212013, Jiangsu, China; 2. Jiangsu Key Laboratory of Audit Information Engineering, Nanjing Audit University, Nanjing 210029, China
Abstract: In this paper we use the significance of the attribute in rough set theory as the index to select splitting attributes for constructing the decision tree, and put forward a new decision tree classification algorithm S_D_Tree, of which the time complexity for selecting a splitting attribute is O(|C||n|). Experimental results on three data sets demonstrate that the proposed algorithm can construct a less complex decision tree, and can also obtain comparable classification accuracy compared with C4.5.
Keywords: Decision tree  Rough set  Significance of attribute  Time complexity
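To make the selection criterion described in the abstract concrete, the following is a minimal Python sketch of ranking candidate test attributes by their rough-set attribute significance, measured as the drop in the size of the positive region when the attribute is removed from the condition attribute set. The function names (positive_region_size, significance, pick_splitting_attribute), the record layout, and the toy data are illustrative assumptions, not the paper's actual S_D_Tree implementation.

from collections import defaultdict

def positive_region_size(records, cond_attrs, decision_attr):
    # |POS_B(D)|: number of records whose B-equivalence class is consistent,
    # i.e. every record in the class carries the same decision value.
    decisions = defaultdict(set)   # equivalence class -> set of decision values
    sizes = defaultdict(int)       # equivalence class -> number of records
    for rec in records:
        key = tuple(rec[a] for a in cond_attrs)
        decisions[key].add(rec[decision_attr])
        sizes[key] += 1
    return sum(sizes[k] for k, d in decisions.items() if len(d) == 1)

def significance(records, cond_attrs, decision_attr, attr):
    # SGF(attr) = (|POS_C(D)| - |POS_{C-{attr}}(D)|) / |U|: how much the
    # dependency of the decision on the condition attributes drops
    # when attr is left out of the condition attribute set.
    full = positive_region_size(records, cond_attrs, decision_attr)
    reduced = positive_region_size(
        records, [b for b in cond_attrs if b != attr], decision_attr)
    return (full - reduced) / len(records)

def pick_splitting_attribute(records, cond_attrs, decision_attr):
    # Select the condition attribute with the largest significance.
    return max(cond_attrs,
               key=lambda a: significance(records, cond_attrs, decision_attr, a))

# Toy usage: 'outlook' fully determines 'play' here, so it is selected.
data = [
    {"outlook": "sunny", "windy": "false", "play": "no"},
    {"outlook": "sunny", "windy": "true",  "play": "no"},
    {"outlook": "rainy", "windy": "false", "play": "yes"},
    {"outlook": "rainy", "windy": "true",  "play": "yes"},
]
print(pick_splitting_attribute(data, ["outlook", "windy"], "play"))  # outlook

A full S_D_Tree-style construction would split on the chosen attribute, partition the records by its values, and recurse on each partition. In this unoptimized sketch the per-attribute evaluation is a single linear scan over the records, which is broadly in line with the O(|C||n|) selection cost quoted in the abstract.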
This article has been indexed by CNKI, Wanfang Data, and other databases.