This paper introduces a new loss function induced by a Fourier-based metric. This metric is equivalent to the Wasserstein distance but can be computed very efficiently using the Fast Fourier Transform (FFT) algorithm. We prove that the Fourier loss function is twice differentiable, and we provide explicit formulas for both its gradient and its Hessian matrix. More importantly, we show that minimising the Fourier loss function is equivalent to maximising the likelihood of the data under Gaussian noise in the frequency space. We apply our loss function to multi-class classification tasks on the MNIST, Fashion-MNIST, and CIFAR10 datasets. The computational results show that, while its accuracy is competitive with that of other state-of-the-art loss functions, the Fourier loss function is significantly more robust to noisy data.
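To illustrate the idea of comparing distributions in the frequency domain, the following is a minimal sketch, not the paper's exact definition: it assumes the loss compares the discrete Fourier transforms of the predicted and target probability vectors via a simple squared distance, with the function name `fourier_loss` and the weighting chosen purely for illustration.

```python
import numpy as np

def fourier_loss(p, q):
    """Illustrative Fourier-based loss between two discrete probability
    vectors p (prediction) and q (target). The DFTs are computed with the
    FFT in O(n log n); the squared-L2 comparison is an assumed choice and
    need not match the weighting used in the paper."""
    P = np.fft.fft(p)
    Q = np.fft.fft(q)
    return np.sum(np.abs(P - Q) ** 2) / len(p)

# Hypothetical usage: softmax prediction vs. one-hot target for 3 classes.
pred = np.array([0.1, 0.7, 0.2])
target = np.array([0.0, 1.0, 0.0])
print(fourier_loss(pred, target))
```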