Generalization error bounds are essential to understanding machine learning algorithms. This paper presents novel expected generalization error upper bounds based on the average joint distribution between the output hypothesis and each input training sample. Multiple generalization error upper bounds based on different information measures are provided, including Wasserstein distance, total variation distance, KL divergence, and Jensen-Shannon divergence. Due to the convexity of the information measures, the proposed bounds in terms of Wasserstein distance and total variation distance are shown to be tighter than their counterparts based on individual samples in the literature. An example is provided to demonstrate the tightness of the proposed generalization error bounds.
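For context, a minimal sketch of the kind of bound being tightened, assuming a \(\sigma\)-sub-Gaussian loss; this is the individual-sample mutual-information bound commonly cited in this literature, not the bound proposed in the paper:
\[
  \bigl|\mathbb{E}[\mathrm{gen}(W,S)]\bigr| \;\le\; \frac{1}{n}\sum_{i=1}^{n}\sqrt{2\sigma^{2}\, I(W;Z_i)},
\]
where \(W\) is the output hypothesis and \(Z_i\) the \(i\)-th training sample. The abstract's claim follows from convexity: writing each bound through a discrepancy \(D(P_{W,Z_i},\, P_W \otimes P_Z)\), replacing the per-sample joint distributions with their average \(\tfrac{1}{n}\sum_{i=1}^{n} P_{W,Z_i}\) and applying Jensen's inequality gives \(D\bigl(\tfrac{1}{n}\sum_i P_{W,Z_i},\, P_W \otimes P_Z\bigr) \le \tfrac{1}{n}\sum_i D(P_{W,Z_i},\, P_W \otimes P_Z)\) for convex measures such as the Wasserstein and total variation distances, so the average-distribution bound can only be tighter.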