Decision trees and their ensembles are endowed with a rich set of diagnostic tools for ranking and screening input variables in a predictive model. One of the most commonly used in practice is the Mean Decrease in Impurity (MDI), which scores a variable by summing the weighted impurity reductions over all non-terminal nodes that split on that variable. Despite the widespread use of tree-based variable importance measures such as MDI, their theoretical properties have proven difficult to pin down and therefore remain largely unexplored. To address this gap between theory and practice, we derive rigorous finite-sample performance guarantees for variable ranking and selection in nonparametric models with MDI for a single-level CART decision tree (a decision stump). We find that the marginal signal strength of each variable can be considerably weaker, and the ambient dimensionality considerably higher, than what state-of-the-art nonparametric variable selection methods can tolerate. Furthermore, unlike previous marginal screening methods that attempt to estimate each marginal projection directly via a truncated basis expansion, the fitted model used here is a simple, parsimonious decision stump, thereby eliminating the need to tune the number of basis terms. Thus, somewhat surprisingly, even though decision stumps are highly inaccurate for estimation purposes, they can still be used to perform consistent model selection.
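To make the screening rule concrete, the following minimal sketch (our illustration, not the paper's code; the helper name stump_mdi_scores and the simulation setup are assumptions) scores each variable by the largest weighted decrease in squared-error impurity achievable by a single CART stump split on that variable, then ranks variables by these scores.

```python
import numpy as np

def stump_mdi_scores(X, y):
    """Per-variable score: the largest weighted decrease in squared-error
    impurity achievable by a single CART stump split on that variable."""
    n, p = X.shape
    total = np.var(y)                        # impurity at the root node
    scores = np.zeros(p)
    for j in range(p):
        ys = y[np.argsort(X[:, j])]          # responses sorted by variable j
        csum, csum2 = np.cumsum(ys), np.cumsum(ys ** 2)
        nl = np.arange(1, n)                 # left-child sizes for splits 1..n-1
        nr = n - nl
        sl, sl2 = csum[:-1], csum2[:-1]
        sr, sr2 = csum[-1] - sl, csum2[-1] - sl2
        var_l = sl2 / nl - (sl / nl) ** 2    # impurity of each left child
        var_r = sr2 / nr - (sr / nr) ** 2    # impurity of each right child
        weighted = (nl * var_l + nr * var_r) / n
        scores[j] = total - weighted.min()   # best impurity decrease on variable j
    return scores

# Toy example: only variable 3 carries signal, so its score should dominate.
rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 10))
y = np.sin(2 * np.pi * X[:, 3]) + 0.1 * rng.standard_normal(500)
print(np.argmax(stump_mdi_scores(X, y)))     # expected output: 3
```

Note the parsimony highlighted in the abstract: ranking requires only the best single split per coordinate, with no basis expansion and hence no tuning of the number of basis terms.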