Reasoning and question answering, although basic cognitive functions for humans, remain a great challenge for current artificial intelligence. Although the Differentiable Neural Computer (DNC) model can solve such problems to a certain extent, its development is still limited by high algorithmic complexity, slow convergence, and poor test robustness. Inspired by the learning and memory mechanisms of the brain, this paper proposes a Memory Transformation based Differentiable Neural Computer (MT-DNC) model. MT-DNC incorporates working memory and long-term memory into the DNC and realizes the autonomous transformation of acquired experience between the two memory systems, which helps to effectively extract acquired knowledge and improve reasoning ability. Experimental results on the bAbI question answering task demonstrate that the proposed method achieves superior performance and faster convergence than other existing DNN and DNC models. Ablation studies further indicate that the transformation from working memory to long-term memory plays an essential role in improving the robustness and stability of reasoning. This work explores how brain-inspired memory transformation can be integrated into and applied to complex intelligent dialogue and reasoning systems.
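To make the working-memory-to-long-term-memory transformation more concrete, the following is a minimal conceptual sketch, not the authors' implementation: it assumes a small working-memory buffer whose frequently used rows are consolidated into the least-used slots of a larger long-term memory matrix. All names (consolidate, wm, ltm, usage) and the salience criterion are hypothetical illustrations of the general idea.

```python
import numpy as np


def consolidate(wm: np.ndarray, ltm: np.ndarray, usage: np.ndarray,
                threshold: float = 0.5) -> np.ndarray:
    """Copy frequently used working-memory rows into the least-used long-term slots."""
    salient = np.where(usage > threshold)[0]            # working-memory rows worth keeping
    if salient.size == 0:
        return ltm
    # Overwrite the long-term rows with the smallest L2 norm,
    # used here as a crude stand-in for "least used" slots.
    victim = np.argsort(np.linalg.norm(ltm, axis=1))[: salient.size]
    ltm[victim] = wm[salient]
    return ltm


# Toy usage: 4 working-memory slots of width 8, 16 long-term slots.
rng = np.random.default_rng(0)
wm = rng.normal(size=(4, 8))
ltm = np.zeros((16, 8))
usage = np.array([0.9, 0.1, 0.7, 0.2])                 # e.g. per-slot read-weight statistics
ltm = consolidate(wm, ltm, usage)
```

In an actual DNC-style model the salience signal would more plausibly come from the controller's read/write weightings rather than a fixed threshold, but the sketch captures the consolidation step the abstract refers to.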