The construction of efficient and effective decision trees remains a key topic in machine learning because of their simplicity and flexibility. Many heuristic algorithms have been proposed to construct near-optimal decision trees. ID3, C4.5, and CART are classical decision tree algorithms, and the split criteria they use are Shannon entropy, Gain Ratio, and the Gini index, respectively. Although these split criteria appear to be independent, they can in fact be unified in a Tsallis entropy framework. Tsallis entropy is a generalization of Shannon entropy and provides a new approach to enhancing decision tree performance via an adjustable parameter $q$. In this paper, a Tsallis Entropy Criterion (TEC) algorithm is proposed that unifies Shannon entropy, Gain Ratio, and the Gini index, thereby generalizing the split criteria of decision trees. More importantly, we reveal the relations between Tsallis entropy with different values of $q$ and the other split criteria. Experimental results on UCI data sets indicate that the TEC algorithm achieves statistically significant improvements over the classical algorithms.
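For concreteness, the Tsallis entropy of a distribution $p$ is $S_q(p) = \frac{1}{q-1}\left(1 - \sum_i p_i^q\right)$. The following minimal sketch (helper names are illustrative, not from the paper) shows how the unification works numerically: $S_q$ recovers Shannon entropy (in nats) in the limit $q \to 1$ and equals the Gini index exactly at $q = 2$:

```python
import math

def tsallis_entropy(probs, q):
    """Tsallis entropy S_q(p) = (1 - sum(p_i^q)) / (q - 1), for q != 1."""
    return (1.0 - sum(p ** q for p in probs)) / (q - 1.0)

def shannon_entropy(probs):
    """Shannon entropy in nats: -sum(p_i * ln p_i)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def gini_index(probs):
    """Gini index: 1 - sum(p_i^2)."""
    return 1.0 - sum(p * p for p in probs)

p = [0.5, 0.3, 0.2]

# As q -> 1, Tsallis entropy converges to Shannon entropy (in nats).
assert abs(tsallis_entropy(p, 1.0001) - shannon_entropy(p)) < 1e-3

# At q = 2, Tsallis entropy coincides with the Gini index exactly.
assert abs(tsallis_entropy(p, 2.0) - gini_index(p)) < 1e-12
```

The $q = 2$ case follows algebraically, since $\frac{1}{2-1}\left(1 - \sum_i p_i^2\right)$ is precisely the Gini index; the $q \to 1$ case follows from a first-order expansion of $p_i^q = p_i e^{(q-1)\ln p_i}$.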