Spiking Neural Networks (SNNs) have recently emerged as a new generation of low-power deep neural networks in which binary spikes convey information across multiple timesteps. Pruning is highly important for SNNs as they are deployed on resource-constrained mobile/edge devices. Previous SNN pruning works focus on shallow SNNs (2~6 layers); however, state-of-the-art SNN works propose deeper SNNs (>16 layers), which are difficult to reconcile with existing pruning methods. To scale pruning techniques toward deep SNNs, we investigate the Lottery Ticket Hypothesis (LTH), which states that dense networks contain smaller subnetworks (i.e., winning tickets) that achieve performance comparable to the dense networks. Our studies on LTH reveal that winning tickets consistently exist in deep SNNs across various datasets and architectures, providing up to 97% sparsity without severe performance degradation. However, the iterative search process of LTH incurs a huge training cost when combined with the multiple timesteps of SNNs. To alleviate this heavy search cost, we propose the Early-Time (ET) ticket, where we find the important weight connectivity from a smaller number of timesteps. The proposed ET ticket can be seamlessly combined with common techniques for finding winning tickets, such as Iterative Magnitude Pruning (IMP) and Early-Bird (EB) tickets. Our experimental results show that the proposed ET ticket reduces search time by up to 38% compared to IMP or EB methods.
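Since the abstract compresses the method, a minimal sketch may help illustrate the ET-ticket idea: a winning-ticket mask is searched with a reduced number of timesteps and then applied before training at the full timestep budget. Everything below (the `TinySNN` model, the `magnitude_mask` helper, the surrogate gradient, and all hyperparameters) is a hypothetical illustration, not the paper's code; a real IMP loop would also re-apply the mask during retraining, which is omitted here for brevity.

```python
import torch
import torch.nn as nn

# Hypothetical surrogate-gradient spiking activation (illustrative only).
class SpikeFn(torch.autograd.Function):
    @staticmethod
    def forward(ctx, mem):
        ctx.save_for_backward(mem)
        return (mem > 1.0).float()          # fire a binary spike when the membrane crosses threshold

    @staticmethod
    def backward(ctx, grad_out):
        (mem,) = ctx.saved_tensors
        # rectangular surrogate gradient around the threshold
        return grad_out * ((mem - 1.0).abs() < 0.5).float()

class TinySNN(nn.Module):
    def __init__(self, timesteps):
        super().__init__()
        self.timesteps = timesteps
        self.fc1 = nn.Linear(784, 256)
        self.fc2 = nn.Linear(256, 10)

    def forward(self, x):
        mem = torch.zeros(x.size(0), 256)
        out = 0.0
        for _ in range(self.timesteps):     # spikes convey information over multiple timesteps
            mem = mem + self.fc1(x)
            spk = SpikeFn.apply(mem)
            mem = mem * (1.0 - spk)         # hard reset after a spike
            out = out + self.fc2(spk)
        return out / self.timesteps

def magnitude_mask(model, sparsity):
    """Global magnitude pruning: keep the largest-|w| fraction of weights."""
    flat = torch.cat([p.detach().abs().flatten()
                      for n, p in model.named_parameters() if "weight" in n])
    threshold = flat.kthvalue(int(sparsity * flat.numel())).values
    return {n: (p.detach().abs() > threshold).float()
            for n, p in model.named_parameters() if "weight" in n}

# ET-ticket sketch: search the mask with FEW timesteps, train with the FULL budget.
SEARCH_T, FULL_T, SPARSITY = 2, 5, 0.9      # illustrative numbers, not the paper's

search_net = TinySNN(timesteps=SEARCH_T)
opt = torch.optim.SGD(search_net.parameters(), lr=0.1)
for _ in range(100):                        # random data stands in for real training
    x, y = torch.rand(32, 784), torch.randint(0, 10, (32,))
    loss = nn.functional.cross_entropy(search_net(x), y)
    opt.zero_grad(); loss.backward(); opt.step()

mask = magnitude_mask(search_net, SPARSITY)

# Apply the ticket, then retrain at full temporal resolution.
final_net = TinySNN(timesteps=FULL_T)
for n, p in final_net.named_parameters():
    if n in mask:
        p.data.mul_(mask[n])                # zero out pruned connections
```

The point of the sketch is the cost asymmetry: the expensive iterative search unrolls only `SEARCH_T` timesteps per forward pass, while only the final (single) training run pays for `FULL_T` timesteps.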