Recurrent neural nets (RNN) and convolutional neural nets (CNN) are widely used in NLP tasks to capture long-term and local dependencies, respectively. Attention mechanisms have recently attracted enormous interest due to their highly parallelizable computation, significantly reduced training time, and flexibility in modeling dependencies. We propose a novel attention mechanism in which the attention between elements of the input sequence(s) is both directional and multi-dimensional (i.e., feature-wise). A lightweight neural net, the "Directional Self-Attention Network (DiSAN)", is then proposed to learn sentence embeddings based solely on the proposed attention, without any RNN/CNN structure. DiSAN consists only of a directional self-attention block that encodes temporal order, followed by a multi-dimensional attention that compresses the sequence into a single vector representation. Despite its simple form, DiSAN outperforms complicated RNN models in both prediction quality and time efficiency. It achieves the best test accuracy among all sentence-encoding methods on the Stanford Natural Language Inference (SNLI) dataset, improving the most recent best result by 1.02%, and shows state-of-the-art test accuracy on the Stanford Sentiment Treebank (SST), Multi-Genre Natural Language Inference (MultiNLI), Sentences Involving Compositional Knowledge (SICK), Customer Review, MPQA, TREC question-type classification, and Subjectivity (SUBJ) datasets.
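To make the two components named above concrete, here is a minimal NumPy sketch of (a) a directional, multi-dimensional self-attention and (b) a multi-dimensional "source2token" attention that compresses the sequence into a vector. This is an illustration under simplifying assumptions, not the paper's exact formulation: the additive feature-wise score, the `-1e30` masking constant, the function names `directional_self_attention` / `multi_dim_compress`, and the parameter shapes are all chosen for clarity, and the full model's fusion gate and diagonal (self) masking are omitted.

```python
import numpy as np


def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)


def directional_self_attention(H, W1, W2, b, direction="forward"):
    """Multi-dimensional self-attention over H (n, d) with a directional mask.

    Each alignment score is a d-vector (feature-wise), and the mask
    restricts which positions each token may attend to, which is how
    temporal order is encoded without recurrence. Simplified sketch:
    self-attention to the token itself is allowed here, and the fusion
    gate of the full model is omitted.
    """
    n, d = H.shape
    # Feature-wise (multi-dimensional) additive score for every pair (i, j):
    # scores[i, j] is a d-vector built from W1 h_i + W2 h_j + b.
    scores = (H @ W1.T)[:, None, :] + (H @ W2.T)[None, :, :] + b  # (n, n, d)
    # Directional mask: a large negative value blocks the disallowed direction
    # ("forward" lets token i attend only to itself and earlier tokens).
    i, j = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    blocked = (j > i) if direction == "forward" else (j < i)
    scores = np.where(blocked[:, :, None], -1e30, scores)
    # Softmax over source positions j, independently for every feature.
    P = softmax(scores, axis=1)                   # (n, n, d)
    # Each output is a feature-wise weighted sum over the allowed tokens.
    return np.einsum("ijd,jd->id", P, H)          # (n, d)


def multi_dim_compress(U, Ws, bs):
    """Multi-dimensional attention compressing a sequence (n, d) to a vector (d,)."""
    scores = U @ Ws.T + bs                        # (n, d) feature-wise scores
    P = softmax(scores, axis=0)                   # attention over positions
    return (P * U).sum(axis=0)                    # sentence embedding (d,)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, d = 5, 8                                   # toy sequence of 5 tokens
    H = rng.normal(size=(n, d))
    W1 = rng.normal(size=(d, d)) * 0.1
    W2 = rng.normal(size=(d, d)) * 0.1
    b = np.zeros(d)
    fw = directional_self_attention(H, W1, W2, b, "forward")
    bw = directional_self_attention(H, W1, W2, b, "backward")
    U = np.concatenate([fw, bw], axis=-1)         # (n, 2d)
    Wc = rng.normal(size=(2 * d, 2 * d)) * 0.1
    bc = np.zeros(2 * d)
    s = multi_dim_compress(U, Wc, bc)             # sentence embedding, (2d,)
    print(s.shape)                                # (16,)
```

The key departure from standard scalar attention is that every alignment score is a d-dimensional vector, so the softmax weighs tokens independently per feature; the forward/backward masks are what supply temporal-order information in place of an RNN.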