This paper extends standard results from learning theory with independent data to sequences of dependent data. In contrast to most of the literature, we rely neither on mixing arguments nor on sequential measures of complexity; instead, we derive uniform risk bounds using classical proof patterns and capacity measures. In particular, we show that the standard classification risk bounds based on the VC-dimension hold in exactly the same form for dependent data, and we further provide Rademacher-complexity-based bounds that remain unchanged compared to the standard results for the independently and identically distributed (i.i.d.) case. Finally, we show how to apply these results in the context of scenario-based optimization to compute the sample complexity of random programs with dependent constraints.
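For reference, the classical i.i.d. bounds that the abstract claims carry over unchanged have the following standard form (a textbook statement, not a result quoted from this paper): with probability at least $1-\delta$ over a sample of size $n$, uniformly over all hypotheses $h$ in a class $\mathcal{H}$,

$$
R(h) \;\le\; \widehat{R}_n(h) \;+\; 2\,\mathfrak{R}_n(\mathcal{H}) \;+\; \sqrt{\frac{\log(1/\delta)}{2n}},
$$

where $R(h)$ is the true risk, $\widehat{R}_n(h)$ the empirical risk, and $\mathfrak{R}_n(\mathcal{H})$ the Rademacher complexity of $\mathcal{H}$; for a class of VC-dimension $d$, the complexity term is in turn of order $\sqrt{d \log(n/d)/n}$. The paper's contribution, as stated, is that such bounds hold in the same form when the sample is a dependent sequence.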