Recurrent neural networks are a powerful tool for diverse applications. We show that, combined with so-called conceptors, they also allow fast learning, in contrast to other deep learning methods. Moreover, a relatively small number of training examples suffices to train neural networks with high accuracy. We demonstrate this with two applications: speech recognition and the detection of car driving maneuvers. We improve on the state of the art through application-specific preparation techniques: for speech recognition, we use mel-frequency cepstral coefficients, which yield a compact representation of the frequency spectra, and, as our evaluation suggests, car driving maneuvers can be detected without the commonly used polynomial interpolation.
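The conceptor mentioned above can be sketched as follows. This is a minimal illustration, not the authors' exact pipeline: following Jaeger's formulation, a conceptor is computed from a collection of reservoir states X (one column per time step) as C = R(R + α⁻²I)⁻¹, where R is the state correlation matrix and the aperture α controls how strongly C constrains the reservoir dynamics. The reservoir size, aperture value, and random states below are illustrative assumptions.

```python
import numpy as np

def conceptor(X: np.ndarray, alpha: float) -> np.ndarray:
    """Compute a conceptor C = R (R + alpha^-2 I)^-1 from states X (N x L).

    R = X X^T / L is the empirical state correlation matrix; alpha is the
    aperture. All singular values of C lie in [0, 1), so C acts as a soft
    projection onto the subspace excited by the stored pattern.
    """
    N, L = X.shape
    R = X @ X.T / L                              # state correlation matrix
    return R @ np.linalg.inv(R + alpha ** -2 * np.eye(N))

# Toy reservoir states (sizes chosen only for illustration).
rng = np.random.default_rng(0)
X = rng.standard_normal((10, 200))
C = conceptor(X, alpha=10.0)

s = np.linalg.svd(C, compute_uv=False)
print(C.shape, float(s.max()) < 1.0)
```

Classification with conceptors then amounts to training one conceptor per class and, at test time, choosing the class whose conceptor best matches the reservoir response to the input.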