Spiking Neural Networks (SNNs) have recently emerged as a new generation of low-power deep neural networks, which are well suited for deployment on low-power mobile/edge devices. Because such devices have limited memory storage, neural pruning on SNNs has been widely explored in recent years. Most existing SNN pruning works focus on shallow SNNs (2–6 layers); however, state-of-the-art SNN works propose deeper SNNs (>16 layers), which are difficult to handle with current SNN pruning methods. To scale a pruning technique toward deep SNNs, we investigate the Lottery Ticket Hypothesis (LTH), which states that dense networks contain smaller subnetworks (i.e., winning tickets) that achieve performance comparable to the dense networks. Our studies on LTH reveal that winning tickets consistently exist in deep SNNs across various datasets and architectures, providing up to 97% sparsity without significant performance degradation. However, the iterative search process of LTH incurs a huge training cost when combined with the multiple timesteps of SNNs. To alleviate this heavy search cost, we propose the Early-Time (ET) ticket, which finds the important weight connectivity using a smaller number of timesteps. The proposed ET ticket can be seamlessly combined with common pruning techniques for finding winning tickets, such as Iterative Magnitude Pruning (IMP) and Early-Bird (EB) tickets. Our experimental results show that the proposed ET ticket reduces search time by up to 38% compared to IMP or EB methods. Code is available on GitHub.
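For readers unfamiliar with the IMP procedure referenced above, the following is a minimal sketch of iterative magnitude pruning with weight rewinding: train, prune the smallest-magnitude surviving weights, rewind survivors to their initial values, and repeat. The function name `imp_masks` and the toy identity `train_fn` are illustrative assumptions, not the paper's implementation; a real run would train an SNN between pruning rounds.

```python
import numpy as np

def imp_masks(init_weights, train_fn, prune_frac=0.2, rounds=5):
    """Sketch of Iterative Magnitude Pruning (IMP): each round trains the
    masked network, prunes the lowest-magnitude fraction of surviving
    weights, and rewinds survivors to their initial values (the ticket)."""
    mask = np.ones_like(init_weights)
    weights = init_weights.copy()
    for _ in range(rounds):
        # train with the current mask applied (stand-in for SNN training)
        weights = train_fn(weights * mask) * mask
        # threshold at the prune_frac quantile of surviving magnitudes
        alive = np.abs(weights[mask == 1])
        threshold = np.quantile(alive, prune_frac)
        mask[np.abs(weights) < threshold] = 0
        # rewind: surviving weights reset to initialization
        weights = init_weights * mask
    return mask

# toy example: identity "training" just returns the weights unchanged
rng = np.random.default_rng(0)
w0 = rng.normal(size=(8, 8))
mask = imp_masks(w0, train_fn=lambda w: w, prune_frac=0.2, rounds=5)
sparsity = 1 - mask.mean()  # roughly 1 - 0.8**5 of weights pruned
```

The ET ticket idea in the abstract slots into this loop by running the training step with fewer SNN timesteps during the search, then using the resulting mask at full timesteps.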