The spectacular success of deep generative models calls for quantitative tools to measure their statistical performance. Divergence frontiers have recently been proposed as an evaluation framework for generative models, owing to their ability to measure the quality-diversity trade-off inherent to deep generative modeling. We establish non-asymptotic bounds on the sample complexity of divergence frontiers. We also introduce frontier integrals, which provide summary statistics of divergence frontiers. We show how smoothed estimators, such as Good-Turing or Krichevsky-Trofimov, can overcome the missing-mass problem and lead to faster rates of convergence. We illustrate the theoretical results with numerical examples from natural language processing and computer vision.
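To make the objects in the abstract concrete, the following minimal Python sketch estimates two discrete distributions from samples, applies Krichevsky-Trofimov (add-1/2) smoothing so that no bin receives zero mass, and traces a divergence frontier. It assumes the frontier is parameterized by the KL divergences of the mixture R_λ = λP + (1−λ)Q to each of P and Q, and the scalar computed at the end is an illustrative trapezoidal aggregate over λ, a stand-in for a frontier-integral-style summary rather than the paper's exact definition. All function names here are hypothetical.

```python
import numpy as np


def kt_estimate(counts):
    """Krichevsky-Trofimov (add-1/2) smoothing of empirical counts.

    Assigns strictly positive probability to every bin, which sidesteps
    the missing-mass problem when some symbols are unobserved.
    """
    counts = np.asarray(counts, dtype=float)
    return (counts + 0.5) / (counts.sum() + 0.5 * counts.size)


def kl(p, q):
    """KL divergence KL(p || q) for discrete distributions on a common support."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))


def divergence_frontier(p, q, num_points=99):
    """Trace frontier points (KL(r || q), KL(r || p)) for mixtures
    r = lam * p + (1 - lam) * q, with lam on a grid in (0, 1)."""
    lams = np.linspace(0.0, 1.0, num_points + 2)[1:-1]
    pts = []
    for lam in lams:
        r = lam * p + (1.0 - lam) * q
        pts.append((kl(r, q), kl(r, p)))
    return lams, np.array(pts)


# Toy usage: draw samples from two distributions over k bins, smooth the
# empirical counts, and trace the frontier between the smoothed estimates.
rng = np.random.default_rng(0)
k = 10
weights = np.linspace(1.0, 2.0, k)
p_counts = np.bincount(rng.integers(0, k, size=200), minlength=k)
q_counts = np.bincount(rng.choice(k, size=200, p=weights / weights.sum()),
                       minlength=k)
p_hat, q_hat = kt_estimate(p_counts), kt_estimate(q_counts)

lams, frontier = divergence_frontier(p_hat, q_hat)

# Trapezoidal rule over lambda: one illustrative scalar summary of the
# frontier (not the paper's exact frontier integral).
vals = frontier[:, 0] + frontier[:, 1]
summary = float(np.sum(0.5 * (vals[1:] + vals[:-1]) * np.diff(lams)))
print(summary)
```

Because the smoothed estimates are strictly positive, every KL term along the frontier is finite; with unsmoothed empirical frequencies, unobserved symbols would make some divergences infinite, which is precisely the missing-mass issue the abstract refers to.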