Thom and Palm have argued that sparsely connected neural networks (SCNs) outperform fully-connected networks (FCNs). Super-regular networks (SRNs) are neural networks composed of stacked sparse layers whose adjacent layers form (ε, δ)-super-regular pairs, with node order randomly permuted between layers. Using the Blow-up Lemma, we prove that the super-regularity of each pair of layers guarantees a number of properties that make SRNs suitable replacements for FCNs in many tasks. These guarantees include edge uniformity across all sufficiently large subsets, a minimum in- and out-degree for every node, input-output sensitivity, and the ability to embed pre-trained constructs. Indeed, SRNs have the capacity to act like FCNs and eliminate the need for costly regularization schemes such as Dropout. Through readily reproducible experiments, we show that SRNs perform similarly to X-Nets while offering far stronger guarantees and control over network structure.
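For reference, the properties named above (edge uniformity over large subsets, minimum degree) correspond directly to the two conditions of super-regularity. A minimal statement of the standard definition from the Blow-up Lemma literature (Komlós, Sárközy, and Szemerédi), assuming the paper follows this convention:

```latex
% Standard (\epsilon, \delta)-super-regularity, as used in the Blow-up Lemma.
% G = (A, B; E) is a bipartite graph; e(X, Y) counts edges between
% X \subseteq A and Y \subseteq B, and \deg(v) is the degree of v in G.
A bipartite graph $G = (A, B; E)$ is \emph{$(\epsilon,\delta)$-super-regular} if
\begin{itemize}
  \item $e(X, Y) > \delta\,|X|\,|Y|$ for all $X \subseteq A$, $Y \subseteq B$
        with $|X| > \epsilon|A|$ and $|Y| > \epsilon|B|$ \hfill (edge uniformity)
  \item $\deg(a) > \delta\,|B|$ for all $a \in A$, and
        $\deg(b) > \delta\,|A|$ for all $b \in B$ \hfill (minimum degree)
\end{itemize}
```

Read against the abstract, the first condition yields the "edge uniformity across all sufficiently large subsets" guarantee, and the second yields the minimum in- and out-degree bound when the bipartite pair is a pair of adjacent network layers.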