Deep Neural Networks are, from a physical perspective, graphs whose `links` and `vertices` iteratively process data and solve tasks sub-optimally. We use Complex Network Theory (CNT) to represent Deep Neural Networks (DNNs) as directed weighted graphs: within this framework, we introduce metrics to study DNNs as dynamical systems, with a granularity that spans from weights to layers, including neurons. CNT discriminates networks that differ in the number of parameters and neurons, the type of hidden layers and activations, and the objective task. We further show that our metrics discriminate low- vs. high-performing networks. CNT is a comprehensive method to reason about DNNs and a complementary approach to explain a model's behavior that is physically grounded in network theory and goes beyond the well-studied input-output relation.
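The graph representation described above can be sketched concretely. The following is a minimal, illustrative example (not the paper's implementation) that maps a toy two-layer MLP onto a directed weighted graph with `networkx`, treating neurons as vertices and weights as links, and computes a simple per-neuron quantity (node strength) of the kind CNT metrics are built from; the weight matrices and metric choice here are assumptions for illustration.

```python
import numpy as np
import networkx as nx

# Toy 2-layer MLP: 3 inputs -> 4 hidden -> 2 outputs.
# The weight matrices are random placeholders; in the framework above,
# the weights of a trained DNN would be used instead.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 4))  # input -> hidden
W2 = rng.normal(size=(4, 2))  # hidden -> output

G = nx.DiGraph()
# Neurons become vertices; each weight becomes a directed weighted link.
for i in range(3):
    for j in range(4):
        G.add_edge(f"in{i}", f"h{j}", weight=W1[i, j])
for j in range(4):
    for k in range(2):
        G.add_edge(f"h{j}", f"out{k}", weight=W2[j, k])

# A simple neuron-level quantity: node strength, the sum of absolute
# weights on all incident links (the paper's metrics span weight,
# neuron, and layer granularity; this is just one representative example).
strength = {
    n: sum(abs(w) for *_, w in G.in_edges(n, data="weight"))
       + sum(abs(w) for *_, w in G.out_edges(n, data="weight"))
    for n in G.nodes
}
print(G.number_of_nodes(), G.number_of_edges())  # 9 vertices, 20 links
```

Once the DNN is encoded this way, any standard complex-network measure (degree distributions, path lengths, centralities) can be computed on `G` at the desired granularity.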