Offset Rademacher complexities have been shown to imply sharp, data-dependent upper bounds for the square loss in a broad class of problems including improper statistical learning and online learning. We show that in the statistical setting, the offset complexity upper bound can be generalized to any loss satisfying a certain uniform convexity condition. Amazingly, this condition is shown to also capture exponential concavity and self-concordance, uniting several apparently disparate results. By a unified geometric argument, these bounds translate directly to improper learning in a non-convex class using Audibert's "star algorithm." As applications, we recover the optimal rates for proper and improper learning with the $p$-loss, $1 < p < \infty$, and show that improper variants of empirical risk minimization can attain fast rates for logistic regression and other generalized linear models.
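For context, a standard form of the empirical offset Rademacher complexity of a function class $\mathcal{F}$ (as introduced for the square loss; the generalization to uniformly convex losses is the subject of this work) uses an offset parameter $c > 0$ and i.i.d. Rademacher signs $\epsilon_1, \dots, \epsilon_n$:
\[
\widehat{\mathcal{R}}_n^{\mathrm{off}}(\mathcal{F}; c) \;=\; \mathbb{E}_{\epsilon}\, \sup_{f \in \mathcal{F}} \frac{1}{n} \sum_{i=1}^{n} \bigl[ \epsilon_i f(x_i) - c\, f(x_i)^2 \bigr].
\]
The negative quadratic offset term is what yields localized, data-dependent bounds without a separate localization step. Audibert's star algorithm, invoked above for improper learning over a non-convex class $\mathcal{F}$, first computes an empirical risk minimizer $\hat g \in \mathcal{F}$ and then minimizes the empirical risk again over the star hull $\mathrm{star}(\mathcal{F}, \hat g) = \{\lambda f + (1-\lambda)\hat g : f \in \mathcal{F},\ \lambda \in [0,1]\}$; the resulting predictor may lie outside $\mathcal{F}$, which is what makes the procedure improper.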