This paper introduces a novel method for fine-tuning handwriting recognition systems based on Recurrent Neural Networks (RNNs). Long Short-Term Memory (LSTM) networks are good at modeling long sequences, but they tend to overfit during training. To improve the system's ability to model sequences, we propose to drop information at random positions in the sequence. We call our approach Temporal Dropout (TD). We apply TD at the image level as well as to internal network representations. We show that TD improves results on two different datasets, and that our method outperforms the previous state of the art on the Rodrigo dataset.
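To make the core operation concrete, a minimal NumPy sketch is given below. This is our illustration rather than code from the paper: the function name `temporal_dropout`, the (T, F) sequence layout, and the drop probability `p` are assumptions. Dropping a position zeroes that time step across all features, which is one natural reading of "drop information at random positions in the sequence."

```python
import numpy as np

def temporal_dropout(x, p=0.1, rng=None):
    """Drop entire time steps of a (T, F) feature sequence with probability p.

    Each of the T positions is kept with probability 1 - p; dropped
    positions are zeroed across all F features, so the network must
    recover the transcription from an incomplete sequence.
    """
    rng = rng or np.random.default_rng()
    keep = rng.random(x.shape[0]) >= p        # one keep/drop decision per time step
    return x * keep[:, None].astype(x.dtype)  # broadcast the mask over all features

# Applied at the image level, the same idea zeroes random pixel columns
# of a text-line image (width = time axis) before it enters the network.
line_image = np.random.rand(64, 800).astype(np.float32)   # (height, width)
dropped = temporal_dropout(line_image.T, p=0.2).T         # drop random columns
```

As with standard dropout, TD would be applied stochastically at training time and disabled at inference; whether kept activations are rescaled by 1/(1 - p) is not stated in the abstract, so the sketch omits it.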