Recurrent neural networks (RNNs) are well suited to sequence tasks in resource-constrained systems because of their expressivity and low computational requirements. However, a gap remains between the efficiency and performance RNNs can deliver and what real-world applications demand. Propagating the activations of all neurons to every connected neuron at every time step, together with the sequential dependence of those activations, makes both training and inference with RNNs memory- and compute-intensive. We propose a solution inspired by biological neuron dynamics that makes the communication between RNN units sparse and discrete, which in turn makes the backward pass with backpropagation through time (BPTT) computationally sparse and efficient as well. We base our model on the gated recurrent unit (GRU), extending it with units that communicate through discrete events emitted when an internal state crosses a threshold, so that no information is communicated to other units in the absence of events. We show theoretically that the communication between units, and hence the computation required for both the forward and backward passes, scales with the number of events in the network. Our model achieves this efficiency without compromising task performance, demonstrating competitive performance compared to state-of-the-art recurrent network models on real-world tasks, including language modeling. The dynamic activity sparsity mechanism also makes our model well suited for emerging energy-efficient neuromorphic hardware. Code is available at https://github.com/KhaleelKhan/EvNN/.
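The event mechanism described above can be sketched in a few lines of PyTorch. The following is a minimal illustration under stated assumptions, not the authors' implementation (see the repository linked above for that): the names `EventGRUCell` and `SurrogateSpike`, the triangular surrogate gradient, and the subtractive reset after an event are choices made for the sketch. The key property it demonstrates is that recurrent connections read only the sparse event output `y`, never the dense internal state `c`, so no information flows between units in the absence of events.

```python
import torch


class SurrogateSpike(torch.autograd.Function):
    """Heaviside step in the forward pass; a smooth surrogate gradient in the
    backward pass so BPTT can flow through the discrete event mechanism."""

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).to(v.dtype)

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        # Triangular surrogate (an assumption): nonzero only near threshold.
        return grad_out * torch.clamp(1.0 - v.abs(), min=0.0)


class EventGRUCell(torch.nn.Module):
    """Minimal sketch of a GRU cell whose units communicate only through
    threshold-triggered events (illustrative, not the EvNN implementation)."""

    def __init__(self, input_size, hidden_size, threshold=1.0):
        super().__init__()
        self.lin_x = torch.nn.Linear(input_size, 3 * hidden_size)
        self.lin_y = torch.nn.Linear(hidden_size, 3 * hidden_size)
        self.threshold = threshold

    def forward(self, x, y, c):
        # Gates see only the sparse event output y, not the dense internal
        # state c, so recurrent communication scales with the event count.
        gx = self.lin_x(x).chunk(3, dim=-1)
        gy = self.lin_y(y).chunk(3, dim=-1)
        r = torch.sigmoid(gx[0] + gy[0])        # reset gate
        u = torch.sigmoid(gx[1] + gy[1])        # update gate
        cand = torch.tanh(gx[2] + r * gy[2])    # candidate state
        c = (1 - u) * c + u * cand              # internal GRU dynamics
        event = SurrogateSpike.apply(c - self.threshold)
        y = c * event                           # emit state only at events
        c = c - self.threshold * event          # subtractive reset (assumed)
        return y, c


# Usage: units stay silent (y == 0) until their state crosses the threshold.
cell = EventGRUCell(input_size=8, hidden_size=16)
x = torch.randn(4, 8)                           # batch of 4 inputs
y = torch.zeros(4, 16)                          # event output (sparse)
c = torch.zeros(4, 16)                          # internal state (dense)
for _ in range(5):
    y, c = cell(x, y, c)
print(f"fraction of active units: {(y != 0).float().mean():.2f}")
```

In this sketch the forward cost of the recurrent term `self.lin_y(y)` and the backward cost through `SurrogateSpike` both shrink with the number of events, which is the scaling property the abstract claims.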