Recent years have witnessed the popularity and success of graph neural networks (GNNs) in various scenarios. To obtain data-specific GNN architectures, researchers turn to neural architecture search (NAS), which has achieved impressive success in discovering effective architectures for convolutional neural networks. However, applying NAS approaches to GNNs is non-trivial due to challenges in search space design and the expensive search cost of existing NAS methods. In this work, to obtain data-specific GNN architectures and address the computational challenges faced by NAS approaches, we propose a framework, Search to Aggregate NEighborhood (SANE), which automatically designs data-specific GNN architectures. On top of a novel and expressive search space, we propose a differentiable search algorithm that is more efficient than previous reinforcement-learning-based methods. Experimental results on four tasks and seven real-world datasets demonstrate the superiority of SANE over existing GNN models and NAS approaches in terms of both effectiveness and efficiency. (Code is available at: https://github.com/AutoML-4Paradigm/SANE)