This paper introduces Weighted Optimal Classification Forests (WOCFs), a new family of classifiers that takes advantage of an optimal ensemble of decision trees to derive accurate and interpretable classifiers. We propose a novel mathematical optimization-based methodology which simultaneously constructs a given number of trees, each of them providing a predicted class for the observations in the feature space. The classification rule is derived by assigning to each observation its most frequently predicted class among the trees. We provide a mixed integer linear programming (MIP) formulation for the problem, together with several novel MIP strengthening and scaling techniques. We report the results of our computational experiments, from which we conclude that our method performs on par with or better than state-of-the-art tree-based classification methods on small to medium-sized instances. We also present three real-world case studies showing that our methodology has notable implications for interpretability. Overall, WOCFs complement existing methods such as CART, Optimal Classification Trees, Random Forests and XGBoost. Beyond a Pareto improvement in accuracy and interpretability, we also observe a distinctive emergent property: different trees focus on different feature variables. This yields a nontrivial improvement in the interpretability and usability of the trained model for counterfactual explanation. Thus, despite the computational challenge of WOCFs, which limits the size of the problems that can be efficiently solved with current MIP solvers, this is an important research direction that can lead to qualitatively different insights for researchers and complement the toolbox of practitioners for high-stakes problems.
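For concreteness, the (weighted) majority-vote aggregation described above can be sketched as follows; the notation here ($T$ trees, class set $\mathcal{K}$, per-tree predictions $f_t$, and nonnegative tree weights $w_t$) is illustrative and not the paper's own formulation:
\[
\hat{y}(x) \in \operatorname*{arg\,max}_{k \in \mathcal{K}} \; \sum_{t=1}^{T} w_t \, \mathbb{1}\{ f_t(x) = k \},
\]
where $f_t(x)$ denotes the class predicted for observation $x$ by tree $t$. With $w_t = 1$ for all $t$, this reduces to the unweighted "most frequently predicted class" rule stated above.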