Multiclass neural networks are a common tool in modern unsupervised domain adaptation, yet an appropriate theoretical description of their non-uniform sample complexity is lacking in the adaptation literature. To fill this gap, we propose the first PAC-Bayesian adaptation bounds for multiclass learners. We facilitate practical use of our bounds by also proposing the first approximation techniques for the multiclass distribution divergences we consider. For divergences that depend on a Gibbs predictor, we propose additional PAC-Bayesian adaptation bounds which remove the need for inefficient Monte Carlo estimation. Empirically, we test the efficacy of our proposed approximation techniques as well as some novel design concepts included in our bounds. Finally, we apply our bounds to analyze a common adaptation algorithm that uses neural networks.