Sharpness-aware minimization (SAM) and related adversarial deep-learning methods can drastically improve generalization, but their underlying mechanisms are not yet fully understood. Here, we establish SAM as a relaxation of the Bayes objective where the expected negative loss is replaced by the optimal convex lower bound, obtained by using the so-called Fenchel biconjugate. The connection enables a new Adam-like extension of SAM to automatically obtain reasonable uncertainty estimates, while sometimes also improving its accuracy. By connecting adversarial and Bayesian methods, our work opens a new path to robustness.
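To make the adversarial structure of SAM concrete, the following is a minimal sketch of one SAM update on a toy quadratic loss: an ascent step to the worst-case point within an L2 ball of radius rho, followed by a descent step using the gradient at that perturbed point. The toy loss, the values of `rho`, and `lr` are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

def loss(w):
    # Toy quadratic loss, chosen only so the sketch is self-contained.
    return 0.5 * np.sum(w ** 2)

def grad(w):
    # Gradient of the toy loss above.
    return w

def sam_step(w, rho=0.05, lr=0.1):
    """One SAM update: perturb adversarially, then descend.

    rho: radius of the L2 perturbation ball (hypothetical value).
    lr: learning rate of the inner SGD step (hypothetical value).
    """
    g = grad(w)
    # Ascent step: move to the (linearized) worst-case point in the rho-ball.
    eps = rho * g / (np.linalg.norm(g) + 1e-12)
    # Descent step: apply the gradient evaluated at the perturbed weights.
    g_adv = grad(w + eps)
    return w - lr * g_adv

w = np.array([1.0, -2.0])
w_new = sam_step(w)
```

On this convex toy problem the perturbed gradient still points downhill, so the loss decreases after the step; the point of SAM, however, is that on non-convex deep-learning losses the perturbed gradient biases the iterates toward flatter minima.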