We consider dense associative neural networks trained without supervision and investigate their computational capabilities analytically, via a statistical-mechanics approach, and numerically, via Monte Carlo simulations. In particular, we obtain a phase diagram summarizing their performance as a function of control parameters such as the quality and quantity of the training dataset and the network storage, valid in the limit of large network size and structureless datasets. Moreover, we establish a bridge between macroscopic observables standardly used in statistical mechanics and loss functions typically used in machine learning. As technical remarks, on the analytical side we implement large-deviations and stability analyses within Guerra's interpolation to tackle the non-Gaussian distributions involved in the post-synaptic potentials, while, on the computational side, we embed the Plefka approximation in the Monte Carlo scheme to speed up the evaluation of the synaptic tensors, overall obtaining a novel and broad approach to investigate neural networks in general.
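As an illustrative sketch only (not the paper's actual implementation), the two ingredients mentioned above can be seen side by side on a toy pairwise Hopfield-like model: a network stores noisy, unlabeled examples of a few archetypes through a Hebbian rule, a Monte Carlo (Glauber) dynamics probes retrieval, and a first-order Plefka (naive mean-field) fixed-point iteration recovers the same macroscopic overlap far more cheaply. All parameter values, variable names, and the pairwise (rather than dense p-spin) interaction are simplifying assumptions made for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy sizes: N neurons, K archetypes, M noisy examples each.
N, K, M = 60, 3, 40
r = 0.9  # dataset quality: each example bit matches its archetype w.p. (1 + r) / 2

# Unsupervised dataset: noisy copies eta of K random binary archetypes xi.
xi = rng.choice([-1, 1], size=(K, N))
flips = rng.choice([1, -1], p=[(1 + r) / 2, (1 - r) / 2], size=(K, M, N))
eta = xi[:, None, :] * flips

# Hebbian couplings built from the (unlabeled) examples alone.
J = np.einsum('kmi,kmj->ij', eta, eta) / (N * M)
np.fill_diagonal(J, 0.0)

def glauber_sweep(s, J, beta):
    """One Monte Carlo sweep of Glauber dynamics at inverse temperature beta."""
    for i in rng.permutation(len(s)):
        h = J[i] @ s  # post-synaptic field on neuron i
        s[i] = 1.0 if rng.random() < 1.0 / (1.0 + np.exp(-2.0 * beta * h)) else -1.0
    return s

def plefka_mf(J, beta, m0, n_iter=200, damping=0.5):
    """First-order Plefka (naive mean-field) iteration of m_i = tanh(beta * sum_j J_ij m_j)."""
    m = m0.copy()
    for _ in range(n_iter):
        m = (1 - damping) * m + damping * np.tanh(beta * (J @ m))
    return m

beta = 3.0
s = xi[0].astype(float).copy()  # initialize near the first archetype
for _ in range(50):
    glauber_sweep(s, J, beta)
mc_overlap = abs(s @ xi[0]) / N  # Mattis overlap from Monte Carlo

m = plefka_mf(J, beta, 0.1 * xi[0].astype(float))
mf_overlap = abs(m @ xi[0]) / N  # same observable from the mean-field fixed point
```

In this low-load regime both estimates of the archetype overlap should roughly agree, which is the sense in which a mean-field shortcut can replace part of the sampling; the paper's actual scheme operates on dense (higher-order) synaptic tensors, where the saving is correspondingly larger.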