The recently discovered Neural Collapse (NC) phenomenon occurs pervasively in today's deep net training paradigm of driving cross-entropy (CE) loss towards zero. During NC, last-layer features collapse to their class-means, both classifiers and class-means collapse to the same Simplex Equiangular Tight Frame, and classifier behavior collapses to the nearest-class-mean decision rule. Recent works demonstrated that deep nets trained with mean squared error (MSE) loss perform comparably to those trained with CE. Through experiments on three canonical networks and five benchmark datasets, we empirically establish that NC emerges in such MSE-trained deep nets as well. We provide, in a Google Colab notebook, PyTorch code for reproducing MSE-NC and CE-NC: https://colab.research.google.com/github/neuralcollapse/neuralcollapse/blob/main/neuralcollapse.ipynb. The analytically tractable MSE loss offers more mathematical opportunities than the hard-to-analyze CE loss, inspiring us to leverage MSE loss towards the theoretical investigation of NC. We develop three main contributions: (I) We show a new decomposition of the MSE loss into (A) terms that are directly interpretable through the lens of NC and that assume the last-layer classifier is exactly the least-squares classifier; and (B) a term capturing the deviation from this least-squares classifier. (II) We exhibit experiments on canonical datasets and networks demonstrating that term (B) is negligible during training. This motivates us to introduce a new theoretical construct: the central path, where the linear classifier stays MSE-optimal for the feature activations throughout the dynamics. (III) By studying renormalized gradient flow along the central path, we derive exact dynamics that predict NC.
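To make the collapse properties concrete, here is a minimal PyTorch sketch (illustrative, separate from the linked Colab notebook) of two NC diagnostics described above: within-class variability collapse (NC1) and convergence of the recentered class-means to a Simplex Equiangular Tight Frame (NC2). The tensors `features` and `labels` and the class count `C` are placeholders for last-layer activations and targets from any trained network.

```python
import torch

def nc_diagnostics(features: torch.Tensor, labels: torch.Tensor, C: int):
    """features: (N, d) last-layer activations; labels: (N,) integer labels in [0, C)."""
    global_mean = features.mean(dim=0)
    class_means = torch.stack([features[labels == c].mean(dim=0) for c in range(C)])

    # NC1: within-class scatter shrinks relative to between-class scatter,
    # measured by tr(Sigma_W @ pinv(Sigma_B)) / C.
    centered = features - class_means[labels]
    Sigma_W = centered.T @ centered / features.shape[0]
    M = class_means - global_mean
    Sigma_B = M.T @ M / C
    nc1 = torch.trace(Sigma_W @ torch.linalg.pinv(Sigma_B)) / C

    # NC2: pairwise cosines of the recentered, renormalized class-means
    # approach -1/(C-1), the Simplex ETF value; report the mean deviation.
    M_norm = M / M.norm(dim=1, keepdim=True)
    cosines = M_norm @ M_norm.T
    off_diag = cosines[~torch.eye(C, dtype=torch.bool)]
    nc2 = (off_diag + 1.0 / (C - 1)).abs().mean()

    return nc1.item(), nc2.item()
```

Both quantities are expected to decrease towards zero as training drives the loss towards zero on the canonical networks and datasets studied.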
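The decomposition in contribution (I) can be illustrated numerically. The sketch below (a hedged illustration, not the paper's notation; the bias term is omitted for brevity, and all names are placeholders) checks the Pythagorean split of the MSE loss into term (A), the loss incurred by the least-squares classifier fit to the current features, and term (B), the deviation of the actual classifier from that least-squares classifier.

```python
import torch

N, d, C = 512, 64, 10
H = torch.randn(N, d, dtype=torch.float64)                       # last-layer features (rows = samples)
Y = torch.eye(C, dtype=torch.float64)[torch.randint(0, C, (N,))]  # one-hot targets
W = torch.randn(d, C, dtype=torch.float64)                       # an arbitrary linear classifier

# Least-squares classifier for the current features: argmin_W ||H W - Y||_F^2.
W_ls = torch.linalg.lstsq(H, Y).solution

mse    = (H @ W - Y).pow(2).sum() / (2 * N)        # full MSE loss
term_A = (H @ W_ls - Y).pow(2).sum() / (2 * N)     # loss assuming the least-squares classifier
term_B = (H @ (W - W_ls)).pow(2).sum() / (2 * N)   # deviation from the least-squares classifier

# The split is exact because the least-squares residual is orthogonal
# to the column space of H (normal equations: H^T (H W_ls - Y) = 0).
assert torch.allclose(mse, term_A + term_B, rtol=1e-6)
```

Under this split, the empirical observation in contribution (II) is that term_B stays negligible during training, which motivates the central path, along which the classifier is taken to be exactly W_ls at every point of the dynamics.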