We present a PAC-Bayes-style generalization bound that enables the replacement of the KL-divergence with a variety of Integral Probability Metrics (IPMs). We provide instances of this bound with the IPM being the total variation metric and the Wasserstein distance. A notable feature of the obtained bounds is that they naturally interpolate between classical uniform convergence bounds in the worst case (when the prior and posterior are far from each other) and preferable bounds in better cases (when the posterior and prior are close). This illustrates the possibility of reinforcing classical generalization bounds with algorithm- and data-dependent components, making them more suitable for analyzing algorithms that use a large hypothesis space.
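As an illustration (not part of the paper itself), the two IPMs named above can be computed for simple discrete distributions. The sketch below uses NumPy and SciPy's `wasserstein_distance`; the example distributions `p` and `q` are hypothetical stand-ins for a prior and posterior over three hypotheses.

```python
import numpy as np
from scipy.stats import wasserstein_distance

# Hypothetical prior and posterior over a three-point support
support = np.array([0.0, 1.0, 2.0])
p = np.array([0.5, 0.3, 0.2])  # "prior"
q = np.array([0.4, 0.4, 0.2])  # "posterior"

# Total variation distance: TV(p, q) = (1/2) * sum_i |p_i - q_i|
tv = 0.5 * np.abs(p - q).sum()

# Wasserstein-1 distance between the two weighted distributions
w1 = wasserstein_distance(support, support, u_weights=p, v_weights=q)
```

When the posterior is close to the prior (as here), both metrics are small, which is the regime in which the bounds in the paper improve over worst-case uniform convergence.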