Modern neuroimaging techniques, such as diffusion tensor imaging (DTI) and functional magnetic resonance imaging (fMRI), enable us to model the human brain as a brain network or connectome. Capturing the structural information and hierarchical patterns of brain networks is essential for understanding brain functions and disease states. Recently, the promising network representation learning capability of graph neural networks (GNNs) has prompted the proposal of many GNN-based methods for brain network analysis. Specifically, these methods apply feature aggregation and global pooling to convert brain network instances into meaningful low-dimensional representations for downstream brain network analysis tasks. However, existing GNN-based methods often neglect that brain networks of different subjects may require different numbers of aggregation iterations, and instead use a GNN with a fixed number of layers to learn all brain networks. Therefore, fully realizing the potential of GNNs for brain network analysis remains non-trivial. To solve this problem, we propose a novel brain network representation framework, namely BN-GNN, which searches for the optimal GNN architecture for each brain network. Concretely, BN-GNN employs deep reinforcement learning (DRL) to train a meta-policy that automatically determines the optimal number of feature aggregations (reflected in the number of GNN layers) required for a given brain network. Extensive experiments on eight real-world brain network datasets demonstrate that our proposed BN-GNN improves the performance of traditional GNNs on different brain network analysis tasks.
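To make the core mechanism concrete, below is a minimal sketch, assuming a plain PyTorch environment. The names SimpleGNN, MetaPolicy, and the REINFORCE-style reward placeholder are illustrative assumptions, not the paper's implementation; it only shows the idea of a policy network choosing, per brain-network instance, how many aggregation layers the GNN applies before global pooling.

```python
# Minimal sketch (not the authors' code): a GNN whose number of
# feature-aggregation layers is chosen per brain-network instance
# by a small policy network, mirroring BN-GNN's DRL meta-policy.
import torch
import torch.nn as nn

class SimpleGNN(nn.Module):
    """GNN with up to max_depth aggregation layers, applied variably."""
    def __init__(self, in_dim, hid_dim, max_depth=4):
        super().__init__()
        self.input_proj = nn.Linear(in_dim, hid_dim)
        self.agg_layers = nn.ModuleList(
            [nn.Linear(hid_dim, hid_dim) for _ in range(max_depth)]
        )

    def forward(self, x, adj, depth):
        # x: (n_nodes, in_dim); adj: row-normalized (n_nodes, n_nodes)
        h = torch.relu(self.input_proj(x))
        for layer in self.agg_layers[:depth]:
            h = torch.relu(layer(adj @ h))   # one feature-aggregation iteration
        return h.mean(dim=0)                 # global mean pooling -> graph embedding

class MetaPolicy(nn.Module):
    """Scores candidate depths (1..max_depth) from a cheap graph summary."""
    def __init__(self, in_dim, max_depth=4):
        super().__init__()
        self.scorer = nn.Sequential(
            nn.Linear(in_dim, 32), nn.ReLU(), nn.Linear(32, max_depth)
        )

    def forward(self, x):
        logits = self.scorer(x.mean(dim=0))  # summarize nodes, score each depth
        return torch.distributions.Categorical(logits=logits)

# Per-instance forward pass: sample a depth, run the GNN that deep.
# In BN-GNN the policy is trained with a DRL signal (e.g., rewarding
# depths that improve downstream task performance); the reward here
# is a placeholder constant.
n, d = 90, 16                                # e.g., 90 ROIs, 16-dim node features
x = torch.randn(n, d)
adj = torch.softmax(torch.randn(n, n), dim=1)  # stand-in normalized adjacency
gnn, policy = SimpleGNN(d, 64), MetaPolicy(d)
dist = policy(x)
depth = dist.sample()                        # aggregations for this subject
embedding = gnn(x, adj, int(depth) + 1)      # depth in {1, ..., max_depth}
policy_loss = -dist.log_prob(depth) * 1.0    # REINFORCE with placeholder reward
```

The design choice worth noting is that the policy conditions on a per-instance summary of the brain network, so subjects whose networks need more (or fewer) message-passing rounds receive correspondingly deeper (or shallower) GNNs, rather than a single fixed depth shared across all subjects.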