We develop a contrastive framework for learning better prior distributions for Bayesian Neural Networks (BNNs) using unlabelled data. With this framework, we propose a practical BNN algorithm that offers the label-efficiency of self-supervised learning and the principled uncertainty estimates of Bayesian methods. Finally, we demonstrate the advantages of our approach for data-efficient learning in semi-supervised and low-budget active learning problems.