In this paper, we interpret Deep Neural Networks with Complex Network Theory. Complex Network Theory (CNT) represents Deep Neural Networks (DNNs) as directed weighted graphs to study them as dynamical systems. We efficiently adapt CNT measures to examine the evolution of the learning process of DNNs with different initializations and architectures: we introduce metrics for nodes/neurons and layers, namely Nodes Strength and Layers Fluctuation. Our framework distills trends in the learning dynamics and separates low- from high-accuracy networks. We characterize populations of neural networks (ensemble analysis) and single instances (individual analysis). We tackle standard problems of image recognition, for which we show that specific learning dynamics are indistinguishable when analysed through Link-Weights alone. Further, Nodes Strength and Layers Fluctuation make previously unobserved behaviours emerge: accurate networks, when compared to under-trained models, show substantially divergent distributions with more extreme deviations. On top of this study, we provide an efficient implementation of the CNT metrics for both Convolutional and Fully Connected Networks, to accelerate research in this direction.
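As a minimal illustration of the kind of CNT measure the abstract refers to, the sketch below computes node strength for a single fully connected layer viewed as a weighted bipartite graph: the strength of a neuron is the sum of the (absolute) weights of the links incident on it. The weight matrix here is random and purely illustrative; the exact definition used in the paper (e.g. signed vs. absolute weights, in- vs. out-strength) may differ.

```python
import numpy as np

# Hypothetical weight matrix of a fully connected layer:
# rows index input neurons, columns index output neurons.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))

# Node strength (a standard CNT measure): the sum of the
# absolute weights of the links incident on a node.
in_strength = np.abs(W).sum(axis=0)   # strength of each output neuron
out_strength = np.abs(W).sum(axis=1)  # strength of each input neuron

print(in_strength.shape, out_strength.shape)  # (3,) (4,)
```

Tracking how these per-neuron strength distributions evolve across training epochs is the kind of analysis the framework performs at scale for full networks.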