Brain networks characterize the complex connectivity among brain regions as graph structures, providing a powerful means to study brain connectomes. In recent years, graph neural networks have emerged as a prevalent paradigm for learning with structured data. However, most brain network datasets are limited in sample size due to the relatively high cost of data acquisition, which prevents deep learning models from being sufficiently trained. Inspired by meta-learning, which learns new concepts quickly from limited training examples, this paper studies data-efficient training strategies for analyzing brain connectomes in a cross-dataset setting. Specifically, we propose to meta-train the model on datasets with large sample sizes and transfer the knowledge to small datasets. In addition, we explore two brain-network-oriented designs: atlas transformation and adaptive task reweighing. Compared with other pre-training strategies, our meta-learning-based approach achieves higher and more stable performance, demonstrating the effectiveness of the proposed solutions. The framework can also derive new insights into the similarities among datasets and diseases in a data-driven fashion.
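To make the cross-dataset meta-training idea concrete, below is a minimal sketch in PyTorch. It uses a Reptile-style first-order meta-update, which is one possible instantiation rather than the paper's exact algorithm; the `BrainGNN` model, the synthetic data, the per-task weights, and all hyperparameters are toy placeholders introduced here for illustration. Adaptive task reweighing is approximated by a fixed weight per source dataset scaling its contribution to the outer update.

```python
# Sketch of meta-training on large source datasets (Reptile-style),
# with a per-task weight standing in for adaptive task reweighing.
# All names and numbers here are illustrative assumptions.
import torch
import torch.nn as nn

class BrainGNN(nn.Module):
    """Toy stand-in for a graph neural network over brain networks:
    maps a flattened connectivity matrix to a diagnosis logit."""
    def __init__(self, n_regions=8):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(n_regions * n_regions, 32), nn.ReLU(), nn.Linear(32, 1)
        )

    def forward(self, conn):
        return self.mlp(conn.flatten(1))

def meta_train_step(model, source_tasks, task_weights,
                    inner_steps=5, inner_lr=1e-2, meta_lr=1e-1):
    """One meta-iteration: adapt a copy of the model on each large
    source dataset, then move the shared initialization toward the
    adapted weights, scaled by that task's weight."""
    meta_state = {k: v.detach().clone() for k, v in model.state_dict().items()}
    for (x, y), w in zip(source_tasks, task_weights):
        model.load_state_dict(meta_state)          # reset to shared init
        opt = torch.optim.SGD(model.parameters(), lr=inner_lr)
        for _ in range(inner_steps):                # inner-loop adaptation
            loss = nn.functional.binary_cross_entropy_with_logits(model(x), y)
            opt.zero_grad()
            loss.backward()
            opt.step()
        with torch.no_grad():                       # weighted outer update
            for k, v in model.state_dict().items():
                meta_state[k] += meta_lr * w * (v - meta_state[k])
    model.load_state_dict(meta_state)

# Toy usage: two synthetic "source datasets" with uneven task weights.
# The resulting initialization would then be fine-tuned on a small
# target dataset (fine-tuning step omitted here).
torch.manual_seed(0)
tasks = [(torch.randn(16, 8, 8), torch.randint(0, 2, (16, 1)).float())
         for _ in range(2)]
net = BrainGNN()
for _ in range(10):
    meta_train_step(net, tasks, task_weights=[0.7, 0.3])
```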