Deep neural networks (DNNs) and decision trees (DTs) are both state-of-the-art classifiers. DNNs perform well due to their representation-learning capabilities, while DTs are computationally efficient because they perform inference along a single root-to-leaf path determined by the input data. In this paper, we present DecisioNet (DN), a binary-tree-structured neural network. We propose a systematic way to convert an existing DNN into a DN, yielding a lightweight version of the original model. DecisioNet combines the best of both worlds: it uses neural modules to perform representation learning and exploits its tree structure to execute only a portion of the computations. We evaluate various DN architectures, along with their corresponding baseline models, on the FashionMNIST, CIFAR10, and CIFAR100 datasets. We show that the DN variants achieve accuracy similar to their baselines while significantly reducing the computational cost of the original network.
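To make the routing idea concrete, the following is a minimal, hypothetical sketch (not the authors' implementation) of a binary-tree-structured network in PyTorch: each internal node holds a small neural module and a routing head, so inference follows a single root-to-leaf path. The module sizes, the scalar routing rule, and the class names (TreeNode, router) are all illustrative assumptions.

import torch
import torch.nn as nn


class TreeNode(nn.Module):
    """One node of a binary-tree network: a conv block plus either a
    classifier head (leaf) or a routing head with two children (internal)."""

    def __init__(self, in_ch, out_ch, num_classes, depth):
        super().__init__()
        self.block = nn.Sequential(                # neural module at this node
            nn.Conv2d(in_ch, out_ch, 3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )
        if depth == 0:                             # leaf: classification head
            self.left = self.right = None
            self.head = nn.Sequential(
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                nn.Linear(out_ch, num_classes),
            )
        else:                                      # internal node: router + children
            self.router = nn.Sequential(
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                nn.Linear(out_ch, 1),              # scalar routing logit
            )
            self.left = TreeNode(out_ch, out_ch, num_classes, depth - 1)
            self.right = TreeNode(out_ch, out_ch, num_classes, depth - 1)

    def forward(self, x):
        x = self.block(x)
        if self.left is None:
            return self.head(x)
        # Hard routing at inference: only one child branch is evaluated,
        # which is where the computational savings come from.
        go_right = torch.sigmoid(self.router(x)).item() > 0.5
        return (self.right if go_right else self.left)(x)


# Usage (batch size 1 so the whole input follows one root-to-leaf path):
model = TreeNode(3, 32, num_classes=10, depth=2).eval()
with torch.no_grad():
    logits = model(torch.randn(1, 3, 32, 32))
print(logits.shape)  # torch.Size([1, 10])

In this sketch the hard argmax-style routing is only suitable for inference; training such a tree end-to-end typically requires a differentiable (soft) routing decision, which is a separate design choice not shown here.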