We introduce organism networks, which function like a single neural network but are composed of several neural particle networks; each particle network fulfils the role of a single weight application within the organism network, while also being trained to self-replicate its own weights. As organism networks feature vastly more parameters than simpler architectures, we perform our initial experiments on an arithmetic task as well as on simplified MNIST-dataset classification solved as a collective. We observe that individual particle networks tend to specialise in one of the tasks, and that those fully specialised in the secondary task can be dropped from the network without hindering the computational accuracy of the primary task. This leads to the discovery of a novel pruning strategy for sparse neural networks.