A recurrent neural network (RNN) is a class of artificial neural networks in which the connections between nodes form a directed graph along a temporal sequence, allowing the network to exhibit temporal dynamic behavior. Derived from feedforward neural networks, RNNs can use their internal state (memory) to process input sequences of variable length, which makes them well suited to tasks such as unsegmented, connected handwriting recognition and speech recognition.
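To make the recurrence concrete, the following is a minimal sketch of an Elman-style RNN cell that carries a hidden state (its "memory") across an input sequence of arbitrary length. The weights are random and purely illustrative; they stand in for parameters that would normally be learned.

```python
import numpy as np

rng = np.random.default_rng(0)
input_dim, hidden_dim = 3, 5
W_xh = rng.normal(size=(input_dim, hidden_dim))   # input-to-hidden weights
W_hh = rng.normal(size=(hidden_dim, hidden_dim))  # hidden-to-hidden weights
b_h = np.zeros(hidden_dim)

def rnn_forward(sequence):
    """Process a sequence of any length, returning the final hidden state."""
    h = np.zeros(hidden_dim)                      # initial memory
    for x_t in sequence:                          # the same weights are reused at every step
        h = np.tanh(x_t @ W_xh + h @ W_hh + b_h)
    return h

# Two inputs of different lengths are handled by the same network.
print(rnn_forward(rng.normal(size=(4, input_dim))))
print(rnn_forward(rng.normal(size=(9, input_dim))))
```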

VIP Content

We consider the relationship between popular neural sequence-processing models, such as RNNs and Transformers, and formal models such as automata and their variants. In particular, we discuss several methods for extracting automata from RNNs, as well as the differences between various RNN architectures as understood through automata variants. We then turn to the more modern Transformer; in particular, we show how it does (not!) relate to existing formal classes, and propose an alternative abstraction in the form of a programming language.

https://icgi2020.lis-lab.fr/speakers/#Guillaume
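As a toy illustration of one family of extraction ideas mentioned above, the sketch below clusters an RNN's continuous hidden states into a finite set of abstract states and reads off a transition table. The random-weight RNN is only a stand-in for a trained network, and the clustering step is a deliberately simplified version of the partitioning used in real extraction methods.

```python
import itertools
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
H = 8                                   # hidden size of the toy RNN
Wx = rng.normal(size=(2, H))            # input weights for symbols {0, 1}
Wh = rng.normal(size=(H, H)) * 0.5      # recurrent weights

def run(word):
    """Return the sequence of hidden states visited while reading `word`."""
    h, states = np.zeros(H), []
    for sym in word:
        h = np.tanh(Wx[sym] + Wh @ h)
        states.append(h.copy())
    return states

# Collect hidden states over all binary strings up to length 6.
words = [w for n in range(1, 7) for w in itertools.product([0, 1], repeat=n)]
all_states = np.array([h for w in words for h in run(w)])

# Abstract states = k-means clusters of the continuous hidden states.
k = 4
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(all_states)

# Estimate a transition function delta(state, symbol) by majority vote.
counts = np.zeros((k, 2, k), dtype=int)
for w in words:
    labels = km.predict(np.array(run(w)))
    prev = km.predict(np.zeros((1, H)))[0]      # cluster of the initial state
    for sym, cur in zip(w, labels):
        counts[prev, sym, cur] += 1
        prev = cur
delta = counts.argmax(axis=2)
print("extracted transition table:\n", delta)   # delta[state, symbol] -> state
```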


Latest Papers

In this work, we propose a new model called triple-path attentive recurrent network (TPARN) for multichannel speech enhancement in the time domain. TPARN extends a single-channel dual-path network to a multichannel network by adding a third path along the spatial dimension. First, TPARN processes speech signals from all channels independently using a dual-path attentive recurrent network (ARN), which is a recurrent neural network (RNN) augmented with self-attention. Next, an ARN is introduced along the spatial dimension for spatial context aggregation. TPARN is designed as a multiple-input and multiple-output architecture to enhance all input channels simultaneously. Experimental results demonstrate the superiority of TPARN over existing state-of-the-art approaches.
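The sketch below is a minimal structural reading of the triple-path idea described in the abstract, not the authors' implementation: an attention-augmented recurrent block (ARN) is applied in turn along the intra-chunk, inter-chunk, and microphone-channel axes of a multichannel feature tensor. All layer sizes and the chunked input shape are illustrative assumptions.

```python
import torch
import torch.nn as nn

class ARNBlock(nn.Module):
    """Recurrent block augmented with self-attention, applied along one axis."""
    def __init__(self, dim: int, hidden: int = 64, heads: int = 4):
        super().__init__()
        self.rnn = nn.LSTM(dim, hidden, batch_first=True, bidirectional=True)
        self.proj = nn.Linear(2 * hidden, dim)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, x):            # x: (batch*, length, dim)
        r, _ = self.rnn(x)
        r = self.proj(r)
        a, _ = self.attn(r, r, r)
        return self.norm(x + a)      # residual connection

class TriplePathBlock(nn.Module):
    """Applies an ARN along intra-chunk, inter-chunk, and channel (spatial) axes."""
    def __init__(self, dim: int):
        super().__init__()
        self.intra = ARNBlock(dim)    # within each chunk (local time)
        self.inter = ARNBlock(dim)    # across chunks (global time)
        self.spatial = ARNBlock(dim)  # across microphone channels

    def forward(self, x):
        # x: (batch, mics, chunks, chunk_len, dim)
        b, m, c, l, d = x.shape
        x = self.intra(x.reshape(b * m * c, l, d)).reshape(b, m, c, l, d)
        x = x.permute(0, 1, 3, 2, 4)                        # sequence over chunks
        x = self.inter(x.reshape(b * m * l, c, d)).reshape(b, m, l, c, d)
        x = x.permute(0, 3, 2, 1, 4)                        # sequence over mics
        x = self.spatial(x.reshape(b * c * l, m, d)).reshape(b, c, l, m, d)
        return x.permute(0, 3, 1, 2, 4)                     # back to (b, m, c, l, d)

# Example: 2 utterances, 4 microphones, 10 chunks of 16 frames, 32 features.
x = torch.randn(2, 4, 10, 16, 32)
print(TriplePathBlock(32)(x).shape)   # torch.Size([2, 4, 10, 16, 32])
```

Because every input channel is kept as a separate axis throughout, the same block naturally supports a multiple-input, multiple-output setup in which all channels are enhanced simultaneously.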
