Long short-term memory (LSTM) is a robust recurrent neural network architecture for learning spatiotemporal sequential data. However, it requires significant computational power, in both software and hardware, to train and deploy. This paper proposes LiteLSTM, a novel architecture that reduces the computational components of the LSTM through weight sharing, lowering the overall cost of the architecture while maintaining its performance. The proposed LiteLSTM is well suited to learning from big data in settings where time consumption is critical, such as IoT device security and medical data, and it also helps reduce the CO2 footprint of training. The proposed model was evaluated and tested empirically on two datasets from the computer vision and cybersecurity domains.
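To make the weight-sharing idea concrete, the sketch below shows one hypothetical way the gate computations of an LSTM cell could reuse a single projection. This is only an illustrative assumption about which components are shared (here the input, forget, and output gates reuse one weight matrix while keeping separate biases), not the exact formulation used in LiteLSTM.

```python
import torch
import torch.nn as nn


class SharedGateLSTMCell(nn.Module):
    """Illustrative LSTM cell with weight sharing across gates (assumption,
    not the authors' exact LiteLSTM formulation)."""

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        # One shared projection reused by the three gates (assumed sharing scheme).
        self.shared = nn.Linear(input_size + hidden_size, hidden_size)
        # Separate per-gate biases so the gates can still behave differently.
        self.b_i = nn.Parameter(torch.zeros(hidden_size))
        self.b_f = nn.Parameter(torch.zeros(hidden_size))
        self.b_o = nn.Parameter(torch.zeros(hidden_size))
        # The candidate cell state keeps its own weights.
        self.candidate = nn.Linear(input_size + hidden_size, hidden_size)

    def forward(self, x, state):
        h, c = state
        z = torch.cat([x, h], dim=-1)
        shared = self.shared(z)                # computed once, reused by all gates
        i = torch.sigmoid(shared + self.b_i)   # input gate
        f = torch.sigmoid(shared + self.b_f)   # forget gate
        o = torch.sigmoid(shared + self.b_o)   # output gate
        g = torch.tanh(self.candidate(z))      # candidate cell state
        c_new = f * c + i * g
        h_new = o * torch.tanh(c_new)
        return h_new, (h_new, c_new)
```

Compared with a standard LSTM cell, which holds four separate input-to-hidden and hidden-to-hidden weight matrices, this kind of sharing cuts the number of gate parameters and matrix multiplications roughly in half, which is the source of the cost reduction claimed in the abstract.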