Diffusion, a fundamental internal mechanism emerging in many physical processes, describes the interaction among different objects. In many learning tasks with limited training samples, diffusion connects the labeled and unlabeled data points and is a critical component for achieving high classification accuracy. Many existing deep learning approaches directly impose the diffusion loss when training neural networks. In this work, inspired by convection-diffusion ordinary differential equations (ODEs), we propose a novel diffusion residual network (Diff-ResNet), which internally introduces diffusion into the architecture of neural networks. Under the structured data assumption, we prove that the proposed diffusion block can increase the distance-diameter ratio, which improves the separability of inter-class points and reduces the distance among local intra-class points. Moreover, this property can be easily adopted by residual networks for constructing separating hyperplanes. Extensive experiments on synthetic binary classification, semi-supervised graph node classification, and few-shot image classification across various datasets validate the effectiveness of the proposed method.
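A minimal sketch of the diffusion idea described above: each sample moves toward a weighted average of its neighbors, discretizing the diffusion term of the ODE, which shrinks intra-class spread while a residual (convection) term transforms the features. The function names, the Gaussian affinity, and the step size below are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def diffusion_step(X, W, step=0.1):
    """One explicit-Euler diffusion step on features X.

    X: (n, d) feature matrix; W: (n, n) non-negative affinity weights.
    Each point moves toward the weighted mean of its neighbors,
    reducing the distance among nearby same-class points.
    """
    # Row-normalize the affinities to obtain diffusion coefficients.
    row_sums = W.sum(axis=1, keepdims=True)
    P = W / np.maximum(row_sums, 1e-12)
    # x_i <- x_i + step * sum_j P_ij * (x_j - x_i)
    return X + step * (P @ X - X)

def residual_block_with_diffusion(X, W, weight, step=0.1):
    # Hypothetical combination sketching the Diff-ResNet idea:
    # a standard residual (convection) update followed by diffusion.
    X = X + np.tanh(X @ weight)
    return diffusion_step(X, W, step)
```

Because each updated point is a convex combination of the current points, a diffusion step never increases the diameter of a cluster; with affinities concentrated within classes, this raises the inter-class distance to intra-class diameter ratio that the analysis relies on.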