Despite the empirical success of recurrent neural networks (RNNs) in natural language processing (NLP), theoretical understanding of RNNs remains limited due to the intrinsically complex computations they perform. We present a systematic analysis of RNN behavior on a ubiquitous NLP task, sentiment analysis of movie reviews, via the mapping between a class of RNNs called recurrent arithmetic circuits (RACs) and matrix product states (MPSs). Using the von Neumann entanglement entropy (EE) as a proxy for information propagation, we show that single-layer RACs possess a maximum information propagation capacity, reflected in the saturation of the EE. Enlarging the bond dimension of the MPS beyond this saturation threshold does not increase prediction accuracy, so a minimal model that best estimates the data statistics can be constructed. Although the saturated EE is smaller than the maximum EE allowed by the area law of an MPS, our model achieves ~99% training accuracy on realistic sentiment analysis data sets. Thus, low EE alone does not rule out the adoption of single-layer RACs for NLP. Contrary to the common belief that long-range information propagation is the main source of RNNs' expressiveness, we show that single-layer RACs also harness high expressiveness from meaningful word vector embeddings. Our work sheds light on the phenomenology of learning in RACs and, more generally, on the explainability of RNNs for NLP, using tools from many-body quantum physics.
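For readers unfamiliar with the von Neumann entanglement entropy used as the information-propagation proxy above, the following is a minimal sketch of how the EE across a bipartition of a pure state is computed from its Schmidt (singular) values. The state and cut chosen here are purely illustrative and are not taken from the paper's models or data.

```python
import numpy as np

def entanglement_entropy(psi, dim_left):
    """Von Neumann EE of a normalized pure state across a left/right bipartition.

    psi is the flattened state vector; dim_left is the Hilbert-space dimension
    of the left subsystem. The Schmidt coefficients are the singular values of
    the matricized state, and EE = -sum_i p_i * log(p_i) with p_i = s_i**2.
    """
    m = psi.reshape(dim_left, -1)            # matricize: left x right
    s = np.linalg.svd(m, compute_uv=False)   # Schmidt coefficients
    p = s**2
    p = p[p > 1e-12]                         # drop numerical zeros
    return float(-np.sum(p * np.log(p)))     # EE in nats

# Example: Bell state (|00> + |11>)/sqrt(2), a maximally entangled qubit pair,
# gives EE = ln 2; a product state |00> gives EE = 0.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
print(entanglement_entropy(bell, 2))         # ln 2 ≈ 0.6931
```

In an MPS of bond dimension D, the EE across any cut is bounded by ln D (the area law referenced above), which is why a saturating EE implies that further increases in bond dimension add no useful capacity.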