Dense Associative Memories, or Modern Hopfield Networks, have many appealing properties of associative memory. They can perform pattern completion, store a large number of memories, and can be described by a recurrent neural network with a degree of biological plausibility and rich feedback between the neurons. At the same time, up until now all models of this class have had only one hidden layer, and have only been formulated with densely connected network architectures, two aspects that hinder their machine learning applications. This paper tackles this gap and describes a fully recurrent model of associative memory with an arbitrarily large number of layers, some of which can be locally connected (convolutional), and a corresponding energy function that decreases along the dynamical trajectory of the neurons' activations. The memories of the full network are dynamically "assembled" using primitives encoded in the synaptic weights of the lower layers, with the "assembling rules" encoded in the synaptic weights of the higher layers. In addition to the bottom-up propagation of information, typical of commonly used feedforward neural networks, the model described has rich top-down feedback from higher layers that helps the lower-layer neurons to decide on their response to the input stimuli.
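The energy-descent dynamics sketched in the abstract can be illustrated with a toy layered network. The following is a minimal sketch, not the paper's exact model: it assumes three layers of binary ±1 units, symmetric weights between adjacent layers (so bottom-up and top-down interactions have equal strength), sign activations, and a clamped input layer; all layer sizes and names are illustrative. Under these assumptions, asynchronous updates that align each unit with its local field (feedforward input plus feedback) never increase the energy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer sizes: visible, hidden, top
n0, n1, n2 = 8, 6, 4
W1 = rng.standard_normal((n1, n0))  # couplings between layers 0 and 1
W2 = rng.standard_normal((n2, n1))  # couplings between layers 1 and 2

def energy(s0, s1, s2):
    # Energy couples only adjacent layers; symmetric weights mean the
    # same synapse mediates bottom-up and top-down interactions.
    return -(s1 @ W1 @ s0) - (s2 @ W2 @ s1)

def update(s0, s1, s2):
    # Asynchronous sign updates: each unit aligns with its local field
    # (bottom-up drive + top-down feedback), so the energy cannot rise.
    for i in range(n1):
        field = W1[i] @ s0 + W2[:, i] @ s2  # feedforward + feedback
        s1[i] = 1.0 if field >= 0 else -1.0
    for j in range(n2):
        s2[j] = 1.0 if W2[j] @ s1 >= 0 else -1.0
    return s1, s2

s0 = np.sign(rng.standard_normal(n0))  # clamped input pattern
s1 = np.sign(rng.standard_normal(n1))
s2 = np.sign(rng.standard_normal(n2))

energies = [energy(s0, s1, s2)]
for _ in range(5):
    s1, s2 = update(s0, s1, s2)
    energies.append(energy(s0, s1, s2))

# Energy is non-increasing along the trajectory
assert all(a >= b - 1e-9 for a, b in zip(energies, energies[1:]))
```

The top-down term `W2[:, i] @ s2` in the hidden-layer field is what distinguishes this from a purely feedforward pass: higher-layer activity feeds back into the lower layer's decision, as described in the abstract.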