Kernel methods on graphs have offered limited options for node-level problems. To address this, we present a novel, generalized kernel for graphs with node feature data for semi-supervised learning. The kernel is derived from a regularization framework by treating the graph and feature data as two Hilbert spaces. We also show how many existing kernel-based models on graphs are instances of our design. A kernel defined this way has transductive properties, which leads to an improved ability to learn from fewer training points and to better handling of highly non-Euclidean data. We demonstrate these advantages on synthetic data where the distribution of the whole graph can inform the pattern of the labels. Finally, by utilizing a flexible polynomial of the graph Laplacian within the kernel, the model also performs effectively in semi-supervised classification on graphs with varying levels of homophily.
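To make the construction concrete, the sketch below shows one plausible way such a kernel can be assembled: a graph kernel taken as the pseudoinverse of a polynomial of the normalized Laplacian, an RBF kernel on node features, and their elementwise product used for transductive kernel ridge regression on the labeled nodes. The specific polynomial coefficients, the RBF feature kernel, and the elementwise combination are illustrative assumptions, not the paper's exact definitions.

```python
import numpy as np

def normalized_laplacian(A):
    """Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    d = A.sum(axis=1)
    d_inv_sqrt = np.zeros_like(d)
    nz = d > 0
    d_inv_sqrt[nz] = 1.0 / np.sqrt(d[nz])
    return np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

def graph_kernel(L, coeffs):
    """Regularization-style graph kernel: pseudoinverse of a polynomial of L.

    coeffs[k] multiplies L^k; the polynomial should be chosen so that the
    resulting matrix is positive semidefinite (illustrative choice here).
    """
    P = sum(c * np.linalg.matrix_power(L, k) for k, c in enumerate(coeffs))
    return np.linalg.pinv(P)

def feature_kernel(X, gamma=1.0):
    """RBF kernel on node features (one common choice of feature-space kernel)."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def combined_kernel(A, X, coeffs=(1.0, 1.0), gamma=1.0):
    """Elementwise (Hadamard) product of the graph and feature kernels,
    which is positive semidefinite by the Schur product theorem."""
    return graph_kernel(normalized_laplacian(A), coeffs) * feature_kernel(X, gamma)

if __name__ == "__main__":
    # Tiny synthetic example: random undirected graph, random features/labels.
    rng = np.random.default_rng(0)
    n, d = 30, 5
    A = (rng.random((n, n)) < 0.1).astype(float)
    A = np.triu(A, 1)
    A = A + A.T                                   # undirected, no self-loops
    X = rng.normal(size=(n, d))
    y = rng.integers(0, 2, size=n).astype(float)
    train = np.arange(10)                         # indices of labeled nodes

    # Transductive kernel ridge regression: fit on labeled nodes,
    # score every node using the full-graph kernel.
    K = combined_kernel(A, X)
    alpha = np.linalg.solve(
        K[np.ix_(train, train)] + 1e-2 * np.eye(len(train)), y[train]
    )
    scores = K[:, train] @ alpha                  # predictions for all nodes
```

Because the kernel is built from the Laplacian of the entire graph, predictions for unlabeled nodes draw on the structure of the whole graph, which is the transductive behavior described above.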