Long short-term memory (LSTM) is one of the most robust recurrent neural network architectures for learning sequential data. However, it requires considerable computational power to train and deploy, in both its software and hardware aspects. This paper proposes LiteLSTM, a novel architecture that reduces the LSTM's computation components through weight sharing, lowering the overall computational cost while maintaining performance. The proposed LiteLSTM can be significant for processing large volumes of data where processing time is critical and hardware resources are limited, such as in the security of IoT devices and in medical data processing. The proposed model was evaluated and tested empirically on three different datasets from the computer vision, cybersecurity, and speech emotion recognition domains. LiteLSTM achieves accuracy comparable to other state-of-the-art recurrent architectures while using a smaller computation budget.
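To make the weight-sharing idea concrete, the sketch below shows an LSTM-style cell in which the three sigmoid gates reuse a single shared weight matrix instead of each carrying their own. This is an illustrative assumption about how gate weights might be shared, not the exact LiteLSTM formulation from the paper; all names (`lite_lstm_step`, `W_shared`, `W_cand`) are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lite_lstm_step(x, h_prev, c_prev, W_shared, W_cand, b_gate, b_cand):
    """One step of a weight-shared LSTM cell (illustrative sketch only).

    A standard LSTM carries four weight matrices: one each for the
    input, forget, and output gates plus one for the candidate state.
    Here the three gates reuse the single matrix W_shared, so only the
    per-gate biases differ -- cutting the gate parameters to roughly
    one third of the standard cell's.
    """
    z = np.concatenate([x, h_prev])          # joint input [x_t; h_{t-1}]
    pre = W_shared @ z                       # shared pre-activation for all gates
    i = sigmoid(pre + b_gate[0])             # input gate
    f = sigmoid(pre + b_gate[1])             # forget gate
    o = sigmoid(pre + b_gate[2])             # output gate
    g = np.tanh(W_cand @ z + b_cand)         # candidate cell state
    c = f * c_prev + i * g                   # new cell state
    h = o * np.tanh(c)                       # new hidden state
    return h, c

# Parameter comparison for input size d_x and hidden size d_h:
#   standard LSTM : 4 * (d_h * (d_x + d_h) + d_h) weights + biases
#   shared sketch : 2 * (d_h * (d_x + d_h)) weights + 4 bias vectors
```

Under this sharing scheme, the per-step matrix multiplications drop from four to two, which is the kind of saving that matters on constrained hardware; the biases keep the gates from collapsing into identical behavior.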