Complex numbers have long been favoured for digital signal processing, yet complex representations rarely appear in deep learning architectures. RNNs, widely used to process time series and sequence information, could greatly benefit from complex representations. We present a novel complex gated recurrent cell. When used together with norm-preserving state transition matrices, our complex gated RNN exhibits excellent stability and convergence properties. We demonstrate competitive performance of our complex gated RNN on the synthetic memory and adding tasks, as well as on the real-world task of human motion prediction.
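To make the two central ingredients concrete, the sketch below shows, under stated assumptions, a complex gated recurrent cell with a norm-preserving (unitary) state transition matrix. The gate wiring, the `modrelu` activation, and all parameter shapes are illustrative assumptions for this sketch, not the paper's exact equations; the key property demonstrated is that a unitary transition `W = exp(A)` with `A` skew-Hermitian preserves the norm of the hidden state.

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
n_in, n_h = 4, 8  # illustrative sizes (assumption)

# Norm-preserving state transition: for skew-Hermitian A (A = -A^H),
# W = exp(A) is unitary, so ||W h|| = ||h|| for every complex state h.
A = rng.standard_normal((n_h, n_h)) + 1j * rng.standard_normal((n_h, n_h))
A = A - A.conj().T  # make A skew-Hermitian
W = expm(A)         # unitary by construction

# Input and gate parameters (shapes and scaling are assumptions).
V = 0.1 * (rng.standard_normal((n_h, n_in)) + 1j * rng.standard_normal((n_h, n_in)))
Wg = 0.1 * rng.standard_normal((n_h, n_h))
Vg = 0.1 * rng.standard_normal((n_h, n_in))
b = 0.01 * rng.standard_normal(n_h)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def modrelu(z, b, eps=1e-8):
    # Phase-preserving ReLU: rectifies the modulus, keeps the phase of z.
    m = np.abs(z)
    return np.maximum(m + b, 0.0) * z / (m + eps)

def cell(h, x):
    # Candidate state through the unitary transition, then a real-valued
    # gate (computed from the state modulus) blends it with the old state.
    z = modrelu(W @ h + V @ x, b)
    g = sigmoid(Wg @ np.abs(h) + Vg @ x)
    return g * z + (1.0 - g) * h

h = np.zeros(n_h, dtype=complex)
x = rng.standard_normal(n_in)
h = cell(h, x)
```

Because `W` is unitary, repeated application of the linear part neither explodes nor vanishes the hidden state, which is the stability property the abstract refers to; the gate supplies the forgetting mechanism on top of it.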