Kernel-based learning methods can dramatically increase the storage capacity of Hopfield networks, yet the dynamical mechanisms behind this enhancement remain poorly understood. We address this gap by combining a geometric characterization of the attractor landscape with the spectral theory of kernel machines. Using a novel metric, which we term Pinnacle Sharpness, we empirically uncover a rich phase diagram of attractor stability and identify a Ridge of Optimization along which the network achieves maximal robustness under high-load conditions. Phenomenologically, the ridge is characterized by a Force Antagonism, in which a strong driving force is counterbalanced by a collective feedback force. We interpret this behavior theoretically as the consequence of a specific reorganization of the weight spectrum, which we term Spectral Concentration. Our analysis shows that, rather than undergoing a simple rank-1 collapse, the network on the ridge self-organizes into a critical regime: the leading eigenvalue is amplified to enhance global stability (Direct Force), while the trailing eigenvalues remain finite to sustain high memory capacity (Indirect Force). Together, these results suggest a spectral mechanism by which learning reconciles stability and capacity in high-dimensional associative memory models.
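The spectral claim above, a leading eigenvalue amplified for global stability while the trailing eigenvalues stay finite to carry memory capacity, can be probed numerically. The following is a minimal sketch only: it uses a plain Hebbian rule as a stand-in for the paper's kernel-based learning (which is not specified here), and the network size, pattern count, and "concentration ratio" diagnostic are illustrative assumptions, not quantities from the paper.

```python
import numpy as np

# Minimal sketch: inspect the eigenvalue spectrum of a Hopfield weight
# matrix. The Hebbian rule below is a stand-in for the paper's
# kernel-based learning rule, which is not reproduced here.
rng = np.random.default_rng(0)
N, P = 200, 40                             # neurons, stored patterns (hypothetical sizes)
xi = rng.choice([-1.0, 1.0], size=(P, N))  # random binary patterns

W = (xi.T @ xi) / N                        # Hebbian weights
np.fill_diagonal(W, 0.0)                   # no self-coupling

eigvals = np.linalg.eigvalsh(W)[::-1]      # W is symmetric; sort descending

# Toy "spectral concentration" diagnostic: compare the leading eigenvalue
# against the trailing, memory-bearing part of the spectrum.
leading = eigvals[0]
bulk_mean = eigvals[1:P].mean()
print(f"leading eigenvalue:        {leading:.3f}")
print(f"mean trailing eigenvalue:  {bulk_mean:.3f}")
print(f"concentration ratio:       {leading / bulk_mean:.2f}")
```

Under the abstract's picture, a network on the Ridge of Optimization would show a large but finite concentration ratio: the leading mode dominates (Direct Force) without the trailing spectrum collapsing to zero (Indirect Force), which is what distinguishes Spectral Concentration from a rank-1 collapse.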