In this study, we explore the impact of network topology on the approximation capabilities of artificial neural networks (ANNs), with a particular focus on complex topologies. We propose a novel methodology for constructing complex ANNs from various graph topologies, including Barab\'asi-Albert, Erd\H{o}s-R\'enyi, and Watts-Strogatz graphs, with standard multilayer perceptrons (MLPs) serving as a baseline. The constructed networks are evaluated on synthetic datasets produced by manifold-learning data generators at varying levels of task difficulty and noise. Our findings reveal that complex topologies outperform traditional MLPs in high-difficulty regimes, an advantage we attribute to the ability of complex networks to exploit the compositionality of the underlying target function. However, this benefit comes at the cost of increased forward-pass computation time and reduced robustness to graph damage. We also investigate the relationship between various topological attributes and model performance. Our analysis shows that no single attribute accounts for the observed performance differences, suggesting that the influence of network topology on approximation capability is more intricate than a simple correlation with any individual topological attribute. Our study sheds light on the potential of complex topologies for enhancing ANN performance and provides a foundation for future research on the interplay between multiple topological attributes and their joint impact on model performance.
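The construction step described above can be illustrated with a minimal sketch. The snippet below is not the paper's implementation; it only shows one plausible way to turn the named random-graph families into acyclic wiring diagrams for a neural network, using `networkx`. All parameter values (`n`, `k`, `p`, `m`) are illustrative assumptions, and the edge-orientation rule (low index to high index) is one simple choice for obtaining a DAG.

```python
import networkx as nx


def random_dag(kind: str, n: int = 32, seed: int = 0) -> nx.DiGraph:
    """Sample a complex-topology graph and orient it into a DAG.

    Orienting each edge from the lower- to the higher-indexed node
    guarantees acyclicity, so the result can serve as the wiring
    diagram of a feedforward network (one neuron per node).
    Parameters here are illustrative, not the paper's settings.
    """
    if kind == "ws":
        g = nx.watts_strogatz_graph(n, k=4, p=0.3, seed=seed)
    elif kind == "ba":
        g = nx.barabasi_albert_graph(n, m=2, seed=seed)
    elif kind == "er":
        g = nx.erdos_renyi_graph(n, p=0.15, seed=seed)
    else:
        raise ValueError(f"unknown topology kind: {kind!r}")
    dag = nx.DiGraph((min(u, v), max(u, v)) for u, v in g.edges())
    assert nx.is_directed_acyclic_graph(dag)
    return dag


dag = random_dag("ws")
# The longest dependency chain gives the effective "depth" of the
# resulting network, a quantity that varies across topology families.
depth = nx.dag_longest_path_length(dag)
```

Under this construction, topological attributes of the sampled graph (clustering, degree distribution, path lengths) carry over directly to the network's connectivity, which is what allows the study to relate such attributes to approximation performance.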