We present LTC-SE, an improved version of the Liquid Time-Constant (LTC) neural network algorithm originally proposed by Hasani et al. in 2021. The algorithm unifies the Leaky-Integrate-and-Fire (LIF) spiking neural network model with Continuous-Time Recurrent Neural Networks (CTRNNs), Neural Ordinary Differential Equations (NODEs), and bespoke Gated Recurrent Units (GRUs). The enhancements in LTC-SE focus on augmenting flexibility, compatibility, and code organization, targeting the unique constraints of embedded systems with limited computational resources and strict performance requirements. The updated code serves as a consolidated class library compatible with TensorFlow 2.x, offering comprehensive configuration options for the LTCCell, CTRNN, NODE, and CTGRU classes. We evaluate LTC-SE against its predecessors, showcasing the advantages of our optimizations in user experience, Keras function compatibility, and code clarity. These refinements expand the applicability of liquid neural networks to diverse machine learning tasks, such as robotics, causality analysis, and time-series prediction, and build on the foundational work of Hasani et al.
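To make the underlying mechanism concrete, the following is a minimal NumPy sketch of the fused semi-implicit Euler step that Hasani et al. (2021) use to integrate the LTC state equation; the weight shapes, the sigmoid presynaptic nonlinearity, and the parameter values below are illustrative assumptions, not the LTC-SE library's actual API.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def ltc_fused_step(x, I, W, b, tau, A, dt=0.1):
    """One fused semi-implicit Euler step of the LTC ODE (illustrative sketch).

    Implements x_{t+dt} = (x_t + dt * f * A) / (1 + dt * (1/tau + f)),
    where f = sigmoid(I @ W + b) is the presynaptic activation that
    modulates the effective time constant (hence "liquid").
    """
    f = sigmoid(I @ W + b)  # input-dependent gating; shapes are assumed
    return (x + dt * f * A) / (1.0 + dt * (1.0 / tau + f))

# Tiny deterministic example: 3 inputs driving 4 hidden units.
x = np.zeros(4)              # hidden state
I = np.ones(3)               # input vector
W = np.full((3, 4), 0.5)     # illustrative input weights
b = np.zeros(4)
x_next = ltc_fused_step(x, I, W, b, tau=1.0, A=1.0)
```

Because the denominator always exceeds 1 for positive `tau` and nonnegative `f`, each step keeps the state bounded by `A`, which is one of the stability properties that makes LTC networks attractive for resource-constrained embedded targets.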