Learning decision tree for ranking
Authors: Liangxiao Jiang, Chaoqun Li, Zhihua Cai

Affiliation: (1) Faculty of Computer Science, China University of Geosciences, 430074 Wuhan, China; (2) Faculty of Mathematics, China University of Geosciences, 430074 Wuhan, China
Abstract: The decision tree is one of the most effective and widely used methods for classification. However, many real-world applications require instances to be ranked by the probability of class membership. The area under the receiver operating characteristic curve (AUC) has recently been used as a measure of the ranking performance of learning algorithms. In this paper, we present two novel class probability estimation algorithms to improve the ranking performance of decision trees. Instead of estimating the probability of class membership by simple voting at the leaf into which the test instance falls, our algorithms use similarity-weighted voting and naive Bayes. We design empirical experiments to verify that our new algorithms significantly outperform the recent decision tree ranking algorithm C4.4 in terms of AUC.
Keywords: Ranking; Class probability estimation; Decision trees; Voting; Similarity-weighted voting; Naive Bayes
This article is indexed in SpringerLink and other databases.
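To illustrate the similarity-weighted voting idea described in the abstract, here is a minimal Python sketch of class probability estimation at a decision-tree leaf. Each training instance stored at the leaf votes with a weight equal to its similarity to the test instance, rather than a uniform vote. The overlap-based similarity measure, the function names, and the toy data are illustrative assumptions, not the authors' actual implementation.

```python
from collections import defaultdict

def similarity(x, y):
    # Fraction of attribute positions on which the two instances agree
    # (a simple overlap measure; the paper's exact metric may differ).
    return sum(a == b for a, b in zip(x, y)) / len(x)

def leaf_probabilities(leaf_instances, leaf_labels, test_instance, classes):
    # Estimate P(class | leaf) by similarity-weighted voting: sum each
    # class's similarity-weighted votes, then normalize to probabilities.
    weights = defaultdict(float)
    for x, y in zip(leaf_instances, leaf_labels):
        weights[y] += similarity(x, test_instance)
    total = sum(weights.values()) or 1.0  # guard against an empty leaf
    return {c: weights[c] / total for c in classes}

# Toy example: a leaf holding three training instances with two classes.
X_leaf = [("sunny", "hot"), ("sunny", "mild"), ("rain", "mild")]
y_leaf = ["no", "no", "yes"]
probs = leaf_probabilities(X_leaf, y_leaf, ("sunny", "mild"), ["yes", "no"])
```

Under simple voting this leaf would return P(no) = 2/3 regardless of the test instance; the weighted version instead shifts probability toward the classes of the leaf instances most similar to it, which is what makes the resulting scores more useful for ranking by AUC.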
|