Some of the tightest information-theoretic generalization bounds depend on the average information between the learned hypothesis and a single training example. However, these sample-wise bounds were derived only for the expected generalization gap. We show that even for the expected squared generalization gap, no such sample-wise information-theoretic bounds exist. The same is true for PAC-Bayes and single-draw bounds. Remarkably, PAC-Bayes, single-draw, and expected squared generalization gap bounds that depend on information in pairs of examples do exist.
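For context, a representative sample-wise bound of the kind referred to here is the expected-generalization-gap bound of Bu, Zou, and Veeravalli (2020); the display below is an illustrative sketch, assuming the loss is $\sigma$-subgaussian for every fixed hypothesis:
\[
\left| \mathbb{E}\bigl[ L_{\mathcal{D}}(W) - L_{S}(W) \bigr] \right|
\;\le\; \frac{1}{n} \sum_{i=1}^{n} \sqrt{2\sigma^{2}\, I(W; Z_i)},
\]
where $W$ is the hypothesis learned from the training sample $S = (Z_1, \dots, Z_n)$, $L_{\mathcal{D}}$ and $L_{S}$ denote the population and empirical risks, and $I(W; Z_i)$ is the mutual information between the hypothesis and the $i$-th example. The negative results stated above show that bounds of this per-example form cannot be extended to the expected squared gap, PAC-Bayes, or single-draw settings, whereas analogues depending on $I(W; Z_i, Z_j)$ for pairs of examples can.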