Associative memory plays an important role in human intelligence and its mechanisms have been linked to attention in machine learning. While the machine learning community's interest in associative memories has recently been rekindled, most work has focused on memory recall ($read$) over memory learning ($write$). In this paper, we present BayesPCN, a hierarchical associative memory capable of performing continual one-shot memory writes without meta-learning. Moreover, BayesPCN is able to gradually forget past observations ($forget$) to free its memory. Experiments show that BayesPCN can recall corrupted i.i.d. high-dimensional data observed hundreds of "timesteps" ago without a significant drop in recall ability compared to the state-of-the-art offline-learned associative memory models.
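To make the three operations named above concrete, here is a minimal toy sketch of the $write$/$read$/$forget$ interface an associative memory exposes. This is NOT BayesPCN itself: the store is a plain list and recall is nearest-neighbor matching, whereas BayesPCN implements these operations with a hierarchical predictive coding network. The class and method names are illustrative assumptions.

```python
class ToyAssociativeMemory:
    """Toy associative memory illustrating the write/read/forget interface.

    Not BayesPCN: storage is a list of vectors and recall is
    nearest-neighbor matching by squared Euclidean distance.
    """

    def __init__(self):
        self.patterns = []  # stored patterns, oldest first

    def write(self, pattern):
        """One-shot write: store a single observation immediately,
        with no meta-learning or gradient-based re-training."""
        self.patterns.append(tuple(pattern))

    def read(self, query):
        """Recall: return the stored pattern closest to the (possibly
        corrupted) query, by squared Euclidean distance."""
        return min(
            self.patterns,
            key=lambda p: sum((a - b) ** 2 for a, b in zip(p, query)),
        )

    def forget(self, n=1):
        """Free memory by dropping the n oldest stored patterns
        (BayesPCN instead forgets gradually, not by hard deletion)."""
        self.patterns = self.patterns[n:]
```

For example, after writing two patterns, a query corrupted by small noise still recalls the nearest stored pattern, and a subsequent `forget()` drops the oldest one.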