Spiking Neural Networks (SNNs) have the potential to achieve low energy consumption due to their biologically sparse computation. Several studies have shown that off-chip memory (DRAM) accesses are the most energy-consuming operations in SNN processing. However, state-of-the-art SNN systems do not optimize the DRAM energy-per-access, thereby hindering high energy efficiency. To substantially reduce the DRAM energy-per-access, a key knob is lowering the DRAM supply voltage, but this may introduce DRAM errors (i.e., the so-called approximate DRAM). Toward this, we propose SparkXD, a novel framework that provides a comprehensive conjoint solution for resilient and energy-efficient SNN inference using low-power DRAMs subjected to voltage-induced errors. The key mechanisms of SparkXD are: (1) improving the SNN error tolerance through fault-aware training that considers bit errors from approximate DRAM; (2) analyzing the error tolerance of the improved SNN model to find the maximum tolerable bit error rate (BER) that meets the targeted accuracy constraint; and (3) energy-efficient DRAM data mapping for the resilient SNN model, which places the weights in appropriate DRAM locations to minimize the DRAM access energy. Through these mechanisms, SparkXD mitigates the negative impact of DRAM (approximation) errors and provides the required accuracy. The experimental results show that, for a target accuracy within 1% of the baseline design (i.e., an SNN without DRAM errors), SparkXD reduces the DRAM energy by ca. 40% on average across different network sizes.
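The fault-aware training in mechanism (1) relies on injecting voltage-induced bit errors into the stored weights at a chosen bit error rate (BER). A minimal sketch of such an error-injection step is shown below; the function name `inject_bit_errors` and the int8 weight representation are illustrative assumptions, not the paper's actual implementation, and the model used here is simple independent per-bit flips.

```python
import numpy as np

def inject_bit_errors(weights_q, ber, rng=None):
    """Flip each bit of int8-quantized weights independently with
    probability `ber`, emulating errors from a reduced-voltage
    (approximate) DRAM. Illustrative sketch, not the paper's code."""
    rng = np.random.default_rng(rng)
    w = weights_q.view(np.uint8)            # reinterpret weight bytes
    # One Bernoulli(ber) draw per bit, packed into a per-byte flip mask.
    flips = rng.random((w.size, 8)) < ber
    mask = np.packbits(flips, axis=1).reshape(w.shape)
    return (w ^ mask).view(np.int8)

# Fault-aware training loop (sketch): corrupt a copy of the weights,
# run the forward/backward pass on the corrupted copy, and apply the
# resulting gradients to the clean master weights, so the model learns
# to tolerate DRAM bit errors up to the targeted BER.
```

During the error-tolerance analysis of mechanism (2), the same injection routine can be swept over increasing BER values until test accuracy drops below the target, yielding the maximum tolerable BER.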