A Decision-Tree Rule Extraction Method Based on Decision Entropy
Cite this article: SUN Lin, XU Jiu-cheng, MA Yuan-yuan. A decision-tree rule extraction method based on decision entropy[J]. Microcomputer Development, 2007, 17(6): 97-100.
Authors: SUN Lin, XU Jiu-cheng, MA Yuan-yuan
Affiliation: College of Computer and Information Technology, Henan Normal University, Xinxiang 453007, Henan, China
Funding: Natural Science Foundation of Henan Province (0511011500); Program for New Century Excellent Talents in Universities of Henan Province (2006HANCET-19)
Abstract: In a decision table, the confidence and object coverage of decision rules are important measures of decision ability. Starting from the rough entropy of knowledge, the concept of decision entropy is proposed and a corresponding attribute significance is defined. The decision entropy of a condition-attribute subset is then used to measure the subset's significance for the decision classification, and a decision tree is constructed recursively in a top-down manner. Finally, the decision tree is traversed and the extracted decision rules are simplified. The advantage of the method is that no attribute reduction is required before constructing the decision tree and extracting rules, the computation is intuitive, and the time complexity is low. The results of an example analysis show that the method yields simpler and more effective decision rules.
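The core step described in the abstract is to pick, at each tree node, the condition attribute whose induced partition is most decisive for the classification. The sketch below is not the paper's exact formulation (the rough-entropy-based decision entropy is not reproduced in this record); it uses a standard conditional-entropy measure as a stand-in, and the table layout, attribute names a and b, and decision column d are illustrative assumptions.

```python
# Minimal sketch, assuming a conditional-entropy-style surrogate for the
# paper's decision entropy: the most significant attribute is the one whose
# partition leaves the least uncertainty about the decision.
from collections import Counter, defaultdict
from math import log2

def decision_entropy(table, attr, decision="d"):
    """Weighted entropy of the decision within each block of the partition induced by attr."""
    blocks = defaultdict(list)
    for row in table:
        blocks[row[attr]].append(row[decision])
    n = len(table)
    h = 0.0
    for values in blocks.values():
        p_block = len(values) / n
        counts = Counter(values)
        h_block = -sum((c / len(values)) * log2(c / len(values)) for c in counts.values())
        h += p_block * h_block
    return h

def best_attribute(table, attrs, decision="d"):
    """Most significant attribute = smallest decision entropy."""
    return min(attrs, key=lambda a: decision_entropy(table, a, decision))

# Toy decision table: condition attributes a, b and decision d (illustrative only).
table = [
    {"a": 1, "b": 0, "d": "yes"},
    {"a": 1, "b": 1, "d": "yes"},
    {"a": 0, "b": 1, "d": "no"},
    {"a": 0, "b": 0, "d": "no"},
]
print(best_attribute(table, ["a", "b"]))  # -> "a"
```

In the top-down construction sketched here, the attribute returned by best_attribute would split the current subtable, and the process repeats on each block until the decision within a block is pure.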

Keywords: rough set  decision entropy  decision tree  decision rules
Article ID: 1673-629X(2007)06-0097-04
Revised: 20 September 2006

Algorithm for Rules Extraction of Decision Tree Based on Decision Information Entropy
SUN Lin, XU Jiu-cheng, MA Yuan-yuan. Algorithm for Rules Extraction of Decision Tree Based on Decision Information Entropy[J]. Microcomputer Development, 2007, 17(6): 97-100.
Authors: SUN Lin  XU Jiu-cheng  MA Yuan-yuan
Abstract: In a decision table, the confidence and object coverage of decision rules are the most important metrics for estimating decision ability. Based on the rough entropy of knowledge, a new decision information entropy is proposed, and an attribute significance based on this entropy is defined. In the process of constructing the decision tree, the decision entropy of a condition-attribute subset is used to estimate its significance for the decision classes, and the tree is built recursively from the top down. The finished tree is then traversed and the extracted decision rules are reduced, which yields more precise rules. The benefit of the method is that it needs no attribute reduction before extracting decision rules, and its computation is simple and intuitive. The experiment and comparison show that the algorithm provides more precise and concise decision rules.
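The final step above, traversing the finished tree and reducing the extracted rules, can be pictured with the following sketch. The nested-dict tree representation (attr/children/decision keys), the table layout, and the redundancy test are illustrative assumptions, not the paper's data structures or reduction procedure.

```python
# Minimal sketch, assuming a simple nested-dict decision tree: collect one
# IF-THEN rule per leaf, then drop any condition whose removal does not
# change the rule's consequent on the decision table.
def extract_rules(node, path=()):
    """Walk the tree; each leaf yields (conditions, decision)."""
    if "decision" in node:                      # leaf node
        return [(dict(path), node["decision"])]
    rules = []
    for value, child in node["children"].items():
        rules += extract_rules(child, path + ((node["attr"], value),))
    return rules

def simplify(rule, table, decision="d"):
    """Drop conditions that are dispensable with respect to the decision table."""
    conds, target = rule
    conds = dict(conds)
    for a in list(conds):
        reduced = {k: v for k, v in conds.items() if k != a}
        matched = [r for r in table if all(r[k] == v for k, v in reduced.items())]
        if matched and all(r[decision] == target for r in matched):
            conds = reduced                     # condition a is redundant
    return conds, target

# Illustrative tree and table only.
tree = {"attr": "a",
        "children": {1: {"decision": "yes"}, 0: {"decision": "no"}}}
table = [{"a": 1, "b": 0, "d": "yes"}, {"a": 0, "b": 1, "d": "no"}]
print([simplify(r, table) for r in extract_rules(tree)])
```

A condition is dropped only when every object still matching the remaining conditions keeps the same decision value, so each simplified rule stays consistent with the decision table.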
Keywords: rough set  decision information entropy  decision tree  decision rules
This article has been indexed in CNKI and other databases.