Dense Associative Memories, or modern Hopfield networks, permit the storage and reliable retrieval of an exponentially large (in the dimension of the feature space) number of memories. At the same time, their naive implementation is non-biological, since it seemingly requires the existence of many-body synaptic junctions between the neurons. We show that these models are effective descriptions of a more microscopic theory (written in terms of biological degrees of freedom) that has additional (hidden) neurons and requires only two-body interactions between them. For this reason, our proposed microscopic theory is a valid model of large associative memory with a degree of biological plausibility. The dynamics of our network and its reduced-dimensional equivalent both minimize energy (Lyapunov) functions. When certain dynamical variables (hidden neurons) are integrated out from our microscopic theory, one can recover many of the models that were previously discussed in the literature, e.g. the model presented in the "Hopfield Networks is All You Need" paper. We also provide an alternative derivation of the energy function and the update rule proposed in the aforementioned paper, and clarify the relationships between the various models of this class.
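To make the relationship between the two descriptions concrete, here is a minimal sketch, not the authors' reference code, of the two retrieval dynamics the abstract refers to: a two-layer network of feature and hidden neurons coupled only by pairwise weights, and the reduced softmax update rule of the "Hopfield Networks is All You Need" model that results when the hidden neurons are integrated out. All names (beta, memories, weights, the toy data) are illustrative assumptions, not taken from the paper.

    # Minimal sketch, assuming the standard softmax retrieval update and a
    # two-layer (feature + hidden) network with only two-body (pairwise) weights.
    import numpy as np

    def softmax(x):
        x = x - x.max()          # stabilize the exponentials
        e = np.exp(x)
        return e / e.sum()

    def effective_update(state, memories, beta=1.0):
        """One step of the reduced dynamics (hidden neurons integrated out):
        state <- memories.T @ softmax(beta * memories @ state)."""
        return memories.T @ softmax(beta * memories @ state)

    def two_layer_update(state, weights, beta=1.0):
        """One step of the microscopic two-body dynamics: hidden neurons respond
        to the feature layer through the pairwise weight matrix, then feed back."""
        hidden = softmax(beta * weights @ state)   # hidden-layer activations
        return weights.T @ hidden                  # feedback to feature neurons

    # Toy usage: when the rows of the weight matrix store the memories,
    # the two update rules coincide.
    rng = np.random.default_rng(0)
    memories = rng.standard_normal((5, 16))        # 5 stored patterns, 16 features
    query = memories[2] + 0.3 * rng.standard_normal(16)
    for _ in range(3):
        query = effective_update(query, memories, beta=4.0)
    print(np.argmax(memories @ query))             # retrieves pattern 2

Under these assumptions, each update step decreases an energy (Lyapunov) function of the kind discussed in the abstract, which is why repeated application of either rule converges to one of the stored patterns.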