Long Short-Term Memory (LSTM) recurrent networks are frequently used for tasks involving time-sequential data such as speech recognition. Unlike previous LSTM accelerators that exploit either spatial weight sparsity or temporal activation sparsity, this paper proposes a new accelerator called "Spartus" that exploits spatio-temporal sparsity to achieve ultra-low-latency inference. Spatial sparsity is induced using a new Column-Balanced Targeted Dropout (CBTD) structured pruning method, which produces structured sparse weight matrices for a balanced workload. The pruned networks running on Spartus hardware achieve weight sparsity of up to 96% and 94% with negligible accuracy loss on the TIMIT and Librispeech datasets, respectively. To induce temporal sparsity in LSTMs, we extend the previous DeltaGRU method to the DeltaLSTM method. Combining spatio-temporal sparsity through CBTD and DeltaLSTM saves weight memory accesses and the associated arithmetic operations. The Spartus architecture is scalable and supports real-time online speech recognition when implemented on both small and large FPGAs. The Spartus per-sample latency for a single 1024-neuron DeltaLSTM layer averages 1 µs. Exploiting spatio-temporal sparsity on our test LSTM network with the TIMIT dataset leads to a 46X speedup of Spartus over its theoretical hardware performance, achieving 9.4 TOp/s effective batch-1 throughput and 1.1 TOp/s/W power efficiency.
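The two sparsity mechanisms named above can be illustrated in a short sketch: column-balanced pruning keeps the same number of nonzero weights in every weight-matrix column (so each hardware column processor gets an equal workload), and delta-style temporal sparsity skips updating activation components whose change since the last step is below a threshold. This is a minimal, hypothetical NumPy illustration of the two ideas, not the paper's actual CBTD training procedure or DeltaLSTM hardware datapath; the function names and the magnitude-based selection criterion are assumptions for illustration.

```python
import numpy as np

def column_balanced_prune(W, sparsity):
    """Zero the smallest-magnitude weights independently within each column,
    so every column retains the same number of nonzeros (balanced workload).
    Illustrative sketch inspired by CBTD, not the paper's exact algorithm."""
    W = W.copy()
    rows = W.shape[0]
    keep = max(1, int(round(rows * (1.0 - sparsity))))
    for c in range(W.shape[1]):
        # indices of the (rows - keep) smallest-magnitude weights in column c
        drop = np.argsort(np.abs(W[:, c]))[: rows - keep]
        W[drop, c] = 0.0
    return W

def delta_gate(x_t, x_prev, threshold):
    """Delta-network-style temporal sparsity: only propagate input components
    whose change since the last update exceeds the threshold; the rest keep
    their previous value, so their weight-column MACs can be skipped."""
    mask = np.abs(x_t - x_prev) >= threshold
    x_new = np.where(mask, x_t, x_prev)
    return x_new, mask
```

With `sparsity=0.75` on a 1024-row matrix, for example, every column would keep exactly 256 nonzeros, which is what allows a balanced assignment of nonzero weights to parallel processing elements.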