In deep neural networks, better results can often be obtained by increasing the complexity of previously developed basic models. However, it is unclear whether there is a way to boost performance by decreasing the complexity of such models. Intuitively, given a problem, a simpler data structure comes with a simpler algorithm. Here, we investigate the feasibility of improving graph classification performance while simplifying the learning process. Inspired by structural entropy on graphs, we transform data samples from graphs to coding trees, a simpler yet essential structure for graph data. Furthermore, we propose a novel message passing scheme, termed hierarchical reporting, in which features are transferred from leaf nodes to root nodes by following the hierarchical structure of coding trees. We then present a tree kernel and a convolutional network that implement our scheme for graph classification. With the designed message passing scheme, the tree kernel and the convolutional network achieve a runtime complexity of $O(n)$, lower than that of the Weisfeiler-Lehman subtree kernel and other graph neural networks, which require at least $O(hm)$. We empirically validate our methods on several graph classification benchmarks and demonstrate that they achieve better performance and lower computational cost than competing approaches.
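To make the scheme concrete, the following is a minimal sketch of how hierarchical reporting could aggregate features bottom-up over a coding tree. The tree encoding (a children map plus leaf feature vectors), the sum aggregator, and the function name `hierarchical_report` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def hierarchical_report(children, leaf_features, root):
    """Sketch of hierarchical reporting: propagate features from leaves to the root.

    children:      dict mapping each internal node to a list of its children
    leaf_features: dict mapping each leaf node to a 1-D numpy feature vector
    root:          identifier of the tree's root node

    Returns the root representation, obtained by recursively aggregating
    (here: summing, an assumed aggregator) the representations of children.
    """
    def represent(node):
        # A node with no children entry is a leaf of the coding tree.
        if node not in children or not children[node]:
            return leaf_features[node]
        # Internal node: report the aggregated features of its children upward.
        child_reps = [represent(c) for c in children[node]]
        return np.sum(child_reps, axis=0)

    return represent(root)

# Toy usage: a two-level coding tree over four leaves (the original graph nodes).
children = {"root": ["a", "b"], "a": ["v1", "v2"], "b": ["v3", "v4"]}
leaves = {f"v{i}": np.eye(3)[i % 3] for i in range(1, 5)}
print(hierarchical_report(children, leaves, "root"))  # graph-level representation
```

Each tree edge is visited exactly once during the upward pass, so the cost of this sketch grows linearly with the number of tree nodes, consistent with the $O(n)$ complexity stated above.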