Similar Articles
20 similar articles found
1.
Application of Decision Tree Algorithms in Weather Evaluation   (cited by 1: 0 self, 1 other)
巩固  张虹 《微计算机信息》2007,23(34):245-247
Classification is one of the most important techniques in data mining. This paper analyzes the ID3 and C4.5 decision-tree classification algorithms and uses them to build a decision-tree model for weather evaluation. It studies the application of this model to weather evaluation and analyzes the process and characteristics of applying decision-tree algorithms to data classification and knowledge discovery. The paper also points out the shortcomings of classification algorithms and the problems that remain to be solved.
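The ID3 and C4.5 attribute-selection criteria discussed in entry 1 (information gain versus gain ratio) can be sketched in a few lines. This is a minimal illustration on a made-up weather table, not the paper's code; the attribute names and data are hypothetical.

```python
import math
from collections import Counter

def entropy(labels):
    # Shannon entropy of a label list
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(rows, labels, attr):
    # ID3 criterion: entropy reduction from splitting on attr
    base, n = entropy(labels), len(labels)
    split = {}
    for r, y in zip(rows, labels):
        split.setdefault(r[attr], []).append(y)
    return base - sum(len(s) / n * entropy(s) for s in split.values())

def gain_ratio(rows, labels, attr):
    # C4.5 criterion: information gain normalized by split information,
    # penalizing attributes with many distinct values
    n = len(labels)
    sizes = Counter(r[attr] for r in rows).values()
    split_info = -sum((c / n) * math.log2(c / n) for c in sizes)
    return info_gain(rows, labels, attr) / split_info if split_info else 0.0

# tiny hypothetical weather table
rows = [
    {"outlook": "sunny", "wind": "weak"},
    {"outlook": "sunny", "wind": "strong"},
    {"outlook": "rain", "wind": "weak"},
    {"outlook": "overcast", "wind": "weak"},
]
labels = ["no", "no", "yes", "yes"]
print(round(info_gain(rows, labels, "outlook"), 3))  # → 1.0
```

Here "outlook" separates the classes perfectly, so its information gain equals the full label entropy; the gain ratio divides by the split information, which is how C4.5 avoids favoring many-valued attributes.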

2.
In the random forest algorithm, the individual decision trees produced and used in the voting process differ in classification accuracy, so a minority of poor trees can degrade the forest's overall classification performance. In addition, imbalanced data in the data set can reduce the classification accuracy of the trees. To address these weaknesses, constraints are added to the Bootstrap sampling procedure to reduce the influence of imbalanced data on the generated trees, and the out-of-bag (OOB) data together with an imbalance coefficient are used to evaluate and weight each generated tree. Experimental results show that the proposed algorithm improves the classification accuracy of random forests on imbalanced data.
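The OOB-based tree weighting described in entry 2 can be illustrated with stub classifiers. The weighting formula below (OOB accuracy scaled by an imbalance coefficient) is a hypothetical simplification; the abstract does not give the exact form.

```python
def oob_weight(tree, oob_X, oob_y, imbalance_coef=1.0):
    # weight a tree by its accuracy on out-of-bag samples, scaled by an
    # imbalance coefficient (assumed form, not the paper's exact formula)
    correct = sum(tree(x) == y for x, y in zip(oob_X, oob_y))
    return imbalance_coef * correct / len(oob_y)

def weighted_vote(trees, weights, x):
    # each tree's vote counts proportionally to its weight
    scores = {}
    for t, w in zip(trees, weights):
        scores[t(x)] = scores.get(t(x), 0.0) + w
    return max(scores, key=scores.get)

# two stub "trees": one always predicts class 1, one always class 0
t1 = lambda x: 1
t0 = lambda x: 0
oob_X, oob_y = [0, 1, 2, 3], [1, 1, 1, 0]
w1 = oob_weight(t1, oob_X, oob_y)  # → 0.75
w0 = oob_weight(t0, oob_X, oob_y)  # → 0.25
print(weighted_vote([t1, t0], [w1, w0], x=5))  # → 1
```

The better-performing stub dominates the vote, which is the intended effect of down-weighting low-accuracy trees in the forest.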

3.
This paper shows that decision-tree classifiers remain important among data-mining techniques for classification, describes the ID3 and C4.5 decision-tree induction algorithms, and explains their rules for selecting the splitting attribute. Worked examples illustrate how ID3 and C4.5 operate; the results demonstrate the superiority of C4.5 over ID3, in particular its more reasonable and correct handling of attributes with many values.

4.
A Decision Tree Optimization Model Based on Genetic Algorithms   (cited by 1: 0 self, 1 other)
Building on an analysis of the principles of the C4.5 algorithm, this paper discusses C4.5's shortcomings in controlling tree size, selecting attributes, filtering noise, and removing irrelevant attributes, and argues for the necessity of attribute reduction on the training data in decision-tree mining. From a practical standpoint it proposes a decision-tree construction model based on attribute reduction that uses a genetic algorithm to search for an optimum, and designs a fitness function for this model. The model is adaptive: by adjusting the parameters of the fitness function, the search direction of the genetic algorithm can be constrained to optimize the decision tree. Experiments show that after optimization the number of training-set attributes decreases while classification accuracy improves somewhat and the rule set shrinks, so the model has practical value.
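A fitness function for GA-driven attribute reduction, as described in entry 4, might trade classification accuracy against the size of the attribute subset roughly as follows. The parameter `alpha` and the exact functional form are assumptions for illustration, not the paper's formula.

```python
def fitness(accuracy, n_selected, n_total, alpha=0.8):
    # hypothetical GA fitness: reward accuracy, penalize the fraction of
    # attributes retained; alpha steers the search toward accuracy or
    # toward smaller attribute subsets
    return alpha * accuracy + (1 - alpha) * (1 - n_selected / n_total)

# a reduced subset with slightly higher accuracy beats the full set
full = fitness(accuracy=0.90, n_selected=10, n_total=10)   # → 0.72
reduced = fitness(accuracy=0.92, n_selected=4, n_total=10)  # → 0.856
print(reduced > full)  # → True
```

Adjusting `alpha` is the mechanism the abstract alludes to: it constrains which direction the genetic search favors when accuracy and tree compactness conflict.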

5.
Decision trees are an important method in inductive learning and data mining, used mainly for classification and prediction. This paper introduces the concept of a generalized decision tree, unifying classification rule sets with the decision-tree structure, and proposes a novel method for constructing decision trees with a DNA-encoded genetic algorithm. C4.5 is first used to classify the data set and obtain an initial rule set; the algorithm in this paper then optimizes the rule set and builds the decision tree from it. Experiments show that the method effectively avoids the drawbacks of traditional decision-tree construction and parallelizes well.

6.
Neural network ensembles generalize better than single neural networks but are hard to understand because they are black boxes; decision-tree algorithms are easy to understand because their classification results are displayed as a tree structure, but their generalization ability falls short of neural network ensembles. This paper combines the two and proposes a decision-tree construction algorithm: a neural network ensemble preprocesses the training examples, and the C4.5 algorithm then processes the preprocessed examples and generates a decision tree. The paper compares the neural network ensemble method, the C4.5 decision-tree algorithm, and the proposed algorithm on UCI data; experiments show that the proposed algorithm inherits the strong generalization ability of the ensemble, clearly outperforming C4.5, while its final result is displayed as a decision tree and is therefore easy to understand.

7.
Application of the C4.5 Algorithm to Train Track Fault Detection   (cited by 1: 0 self, 1 other)
Train track fault detection requires analyzing large amounts of data to reach a verdict, and decision trees are a common tool for data mining and classification analysis. This paper discusses how to construct a decision tree for train track fault detection with the C4.5 algorithm and how to use the generated tree to decide whether a track fault is present.

8.
Traditional decision-tree classification methods such as ID3 and C4.5 are effective on relatively small data sets, but their effectiveness falls short when applied to the very large, real-world data encountered in intrusion detection. This paper adopts a random-model-based decision-tree algorithm that reduces system resource usage while preserving classification accuracy; comparative experiments show that the algorithm performs excellently in classifying computer intrusion data.

9.
Data mining is the technique of extracting potentially valuable information from massive data, and the decision-tree method is a typical classification algorithm: the data are first processed, an induction algorithm generates readable rules and a decision-tree model, and the model is then used to analyze new data. Using the development of a decision-support system for guiding undergraduates' choice of major as a case study, this paper discusses the concept, research content, and essence of data mining and its main methods, describes how a decision-tree subsystem was developed and implemented in MATLAB 7.0, and analyzes the resulting decision-tree model.

10.
This paper studies the application of fuzzy cluster analysis in medical image data mining. A decision-tree algorithm is used to classify breast cancer image data; a decision-tree-based medical image classifier was implemented and classification experiments were conducted. The system achieves high classification accuracy, demonstrating the broad application prospects of data mining in computer-aided medical diagnosis.

11.
Decision trees have been widely used in data mining and machine learning as a comprehensible knowledge representation. While ant colony optimization (ACO) algorithms have been successfully applied to extract classification rules, decision tree induction with ACO algorithms remains an almost unexplored research area. In this paper we propose a novel ACO algorithm to induce decision trees, combining commonly used strategies from both traditional decision tree induction algorithms and ACO. The proposed algorithm is compared against three decision tree induction algorithms, namely C4.5, CART and cACDT, in 22 publicly available data sets. The results show that the predictive accuracy of the proposed algorithm is statistically significantly higher than the accuracy of both C4.5 and CART, which are well-known conventional algorithms for decision tree induction, and the accuracy of the ACO-based cACDT decision tree algorithm.

12.
Decision trees are among the most widely used models in classification applications of data mining, but when traditional algorithms such as ID3, C4.5, and CART are applied to mining very large databases, their effectiveness drops sharply and memory overflow can even occur. This paper therefore proposes an attribute-weighted random decision-tree algorithm. Experiments show that the algorithm reduces system resource usage while achieving high classification accuracy on high-dimensional large data sets, making it well suited to intrusion-detection classification.

13.
A Latent Attribute Space Tree Classifier   (cited by 2: 0 self, 2 other)
何萍  徐晓华  陈崚 《软件学报》2009,20(7):1735-1745
This paper proposes a latent attribute space tree classifier (LAST) framework: by transforming the original attribute space into a latent attribute space in which the data are easier to separate or better suited to the characteristics of decision-tree classification, it breaks through the decision-surface limitations of traditional decision-tree algorithms and improves the generalization performance of tree classifiers. Within the LAST framework, two singular value decomposition oblique decision tree (SODT) algorithms are proposed: singular value decomposition is applied to the global or local data to construct an orthogonal latent attribute space, and a conventional univariate decision tree or tree node is then built in that space, indirectly yielding a near-optimal oblique decision tree in the original space. The SODT algorithms can handle data sets whose global and local distributions agree or differ, can fully exploit the structural information of both labeled and unlabeled data, produce classification results unaffected by random reordering of the samples, and have the same time complexity as univariate decision-tree algorithms. Experiments on complex data sets show that, compared with traditional univariate decision-tree algorithms and other oblique decision-tree algorithms, SODT achieves higher classification accuracy, builds decision trees of more stable size, and is more robust in overall classification performance, while its tree-construction time is close to that of C4.5 and far below that of other oblique decision-tree algorithms.
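The core SODT step described in entry 13, building an orthogonal latent attribute space via SVD so that univariate (axis-parallel) splits in the latent space correspond to oblique splits in the original space, can be sketched with NumPy. The data below are invented for illustration; this is not the authors' implementation.

```python
import numpy as np

# illustrative 2-attribute data set (4 samples)
X = np.array([[1.0, 2.0], [2.0, 4.1], [3.0, 5.9], [4.0, 8.2]])

# center the data, then decompose; the rows of Vt are orthonormal
# directions that define the latent attribute space
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# coordinates of every sample in the orthogonal latent space;
# a threshold on one latent coordinate is an oblique cut in X-space
latent = Xc @ Vt.T
print(latent.shape)  # → (4, 2)
```

Because `Vt` is orthogonal, the transform is invertible (`latent @ Vt` recovers `Xc`) and the latent attributes are mutually orthogonal, which is the property SODT exploits to build a univariate tree there.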

14.
An Attribute-Weighted Decision Tree Algorithm   (cited by 1: 0 self, 1 other)
ID3 and C4.5 are simple and effective decision-tree classification algorithms, but their accuracy suffers when applied to complex decision problems. This paper proposes a new attribute-weighted decision-tree algorithm: based on rough set theory, attributes are weighted according to how strongly they influence the decision, and the decision tree is built from the weighted attributes, improving the accuracy of the decision results. The weights mark the importance of each attribute and can be learned from the training data. Experimental results show that the algorithm markedly improves the accuracy of the decision results.
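The attribute weighting of entry 14 can be illustrated by scaling each attribute's information gain with a learned importance weight before choosing the split. The gains and weights below are invented for illustration; the paper derives its weights from rough set theory.

```python
def pick_attribute(gains, weights):
    # hypothetical weighted selection: scale each attribute's information
    # gain by its learned importance weight, then pick the best score
    scored = {a: g * weights.get(a, 1.0) for a, g in gains.items()}
    return max(scored, key=scored.get)

# "age" has the higher raw gain, but its low learned weight
# (e.g. low decision influence under rough-set analysis) demotes it
gains = {"age": 0.30, "income": 0.25}
weights = {"age": 0.5, "income": 1.0}
print(pick_attribute(gains, weights))  # → income
```

The point of the weighting is visible here: an attribute with a high raw gain but low decision influence no longer dominates the split choice.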

15.
Lim  Tjen-Sien  Loh  Wei-Yin  Shih  Yu-Shan 《Machine Learning》2000,40(3):203-228
Twenty-two decision tree, nine statistical, and two neural network algorithms are compared on thirty-two datasets in terms of classification accuracy, training time, and (in the case of trees) number of leaves. Classification accuracy is measured by mean error rate and mean rank of error rate. Both criteria place a statistical, spline-based algorithm called POLYCLASS at the top, although it is not statistically significantly different from twenty other algorithms. Another statistical algorithm, logistic regression, is second with respect to the two accuracy criteria. The most accurate decision tree algorithm is QUEST with linear splits, which ranks fourth and fifth, respectively. Although spline-based statistical algorithms tend to have good accuracy, they also require relatively long training times. POLYCLASS, for example, is third last in terms of median training time. It often requires hours of training compared to seconds for other algorithms. The QUEST and logistic regression algorithms are substantially faster. Among decision tree algorithms with univariate splits, C4.5, IND-CART, and QUEST have the best combinations of error rate and speed. But C4.5 tends to produce trees with twice as many leaves as those from IND-CART and QUEST.

16.
NeC4.5: neural ensemble based C4.5   (cited by 5: 0 self, 5 other)
Decision trees offer good comprehensibility, while neural network ensembles offer strong generalization ability. These merits are integrated into a novel decision tree algorithm, NeC4.5. This algorithm first trains a neural network ensemble. The trained ensemble is then employed to generate a new training set by replacing the desired class labels of the original training examples with the outputs of the trained ensemble. Some extra training examples are also generated from the trained ensemble and added to the new training set. Finally, a C4.5 decision tree is grown from the new training set. Since its learning results are decision trees, the comprehensibility of NeC4.5 is better than that of a neural network ensemble. Moreover, experiments show that the generalization ability of NeC4.5 decision trees can be better than that of C4.5 decision trees.
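The NeC4.5 relabeling step (replace the original labels with the ensemble's outputs, and label extra generated examples the same way) can be sketched with stub models. In the real algorithm the ensemble members are trained neural networks and a C4.5 tree is then grown on the new set; the lambdas here merely stand in for them.

```python
def ensemble_predict(models, x):
    # majority vote of the ensemble members
    votes = [m(x) for m in models]
    return max(set(votes), key=votes.count)

def nec45_training_set(models, X, extra_X):
    # NeC4.5 idea: the ensemble's outputs become the labels of both the
    # original examples and the extra generated examples
    new_X = list(X) + list(extra_X)
    new_y = [ensemble_predict(models, x) for x in new_X]
    return new_X, new_y

# three stub "networks" voting on a 1-D threshold problem
models = [lambda x: x > 0, lambda x: x > 1, lambda x: x > -1]
X, extra = [-2, 2], [0.5]
nx, ny = nec45_training_set(models, X, extra)
print(ny)  # → [False, True, True]
```

A final step (not shown) would fit a C4.5-style tree to `(nx, ny)`, so the tree inherits the ensemble's decision boundary while remaining readable.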

17.
Optic nerve disease is an important disease that commonly occurs in the general population. In this paper, we propose a hybrid diagnostic system based on a discretization (quantization) method and classification algorithms, including the C4.5 decision tree classifier, artificial neural network (ANN), and least squares support vector machine (LSSVM), to diagnose optic nerve disease from Visual Evoked Potential (VEP) signals with discrete values. The aim of this paper is to investigate the effect of the discretization method on the classification of optic nerve disease. Since the VEP signals are not linearly separable, classifier algorithms alone achieve only low classification accuracy. To overcome this problem, we have used the discretization method as data pre-processing. The proposed method consists of two phases: (i) quantization of VEP signals using the discretization method, and (ii) diagnosis of discretized VEP signals using classification algorithms including the C4.5 decision tree classifier, ANN, and LSSVM. The classification accuracies obtained by these hybrid methods (C4.5 decision tree classifier with quantization, ANN with quantization, and LSSVM with quantization), with and without the quantization strategy, are 84.6-96.92%, 94.20-96.76%, and 73.44-100%, respectively. As these results show, the best model for classifying optic nerve disease from VEP signals is the combination of the LSSVM classifier and the quantization strategy. The obtained results show that the proposed method can make an effective interpretation and point to the feasibility of designing a new intelligent diagnosis-assistance system.
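The discretization pre-processing in entry 17 can be as simple as equal-width binning of a continuous signal channel. The paper's actual discretization method may differ, so treat this as a generic sketch of the quantization phase.

```python
def discretize(values, n_bins):
    # equal-width quantization: map each continuous value to a bin index
    # in [0, n_bins - 1]; a constant column collapses to bin 0
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins or 1.0
    return [min(int((v - lo) / width), n_bins - 1) for v in values]

# quantize a toy VEP-like amplitude series into 4 levels
print(discretize([0.0, 2.5, 5.0, 7.5, 10.0], 4))  # → [0, 1, 2, 3, 3]
```

The discretized values would then be fed to the classifiers (C4.5, ANN, LSSVM) in phase (ii); the `min(..., n_bins - 1)` clamp keeps the maximum value inside the last bin.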

18.
From Entropy-Mean Decisions to Sample-Distribution Decisions   (cited by 15: 0 self, 15 other)
To study the decision accuracy of inductive learning, this paper analyzes the shortcomings of the C4.5 algorithm and the root cause of the controversy and compromise between the standard algorithm and its sub-algorithms, and, starting from estimating the probability distribution of the training samples, presents a simple and novel decision-tree algorithm. Experimental results on UCI data show that, compared with C4.5, the method achieves not only better decision accuracy but also faster computation.

19.
Building Customer Classification Decision Trees with the C4.5 Algorithm   (cited by 23: 0 self, 23 other)
Customer ranking is a very important aspect of CRM, and decision trees are a common tool for classification analysis. This paper discusses how to construct a customer-ranking decision tree with the C4.5 algorithm and its application in CRM.

20.
Most decision‐tree induction algorithms use a local greedy strategy, where a leaf is always split on the best attribute according to a given attribute‐selection criterion. A more accurate model could possibly be found by looking ahead for alternative subtrees. However, some researchers argue that look‐ahead should not be used due to a negative effect (called "decision‐tree pathology") on decision‐tree accuracy. This paper presents a new look‐ahead heuristic for decision‐tree induction. The proposed method is called look‐ahead J48 (LA‐J48) as it is based on J48, the Weka implementation of the popular C4.5 algorithm. At each tree node, the LA‐J48 algorithm applies the look‐ahead procedure of bounded depth only to attributes that are not statistically distinguishable from the best attribute chosen by the greedy approach of C4.5. A bootstrap process is used for estimating the standard deviation of splitting criteria with unknown probability distribution. Based on a separate validation set, the attribute producing the most accurate subtree is chosen for the next step of the algorithm. In experiments on 20 benchmark data sets, the proposed look‐ahead method outperforms the greedy J48 algorithm with the gain ratio and the gini index splitting criteria, thus avoiding the look‐ahead pathology of decision‐tree induction.

