In active learning, the size and complexity of the training dataset change over time. Simple models that are well specified by the amount of data available at the start of active learning may suffer from bias as more points are actively sampled. Flexible models that may be well suited to the full dataset can overfit early in active learning. We tackle this problem using Depth Uncertainty Networks (DUNs), a BNN variant in which the depth of the network, and thus its complexity, is inferred. We find that DUNs outperform other BNN variants on several active learning tasks. Importantly, we show that on the tasks where DUNs perform best, they exhibit notably less overfitting than the baselines.
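To make the mechanism concrete, below is a minimal, hypothetical PyTorch sketch of a DUN for classification. The layer sizes, the residual-block architecture, and the helper names (`DUN`, `elbo_loss`) are illustrative assumptions, not the authors' implementation. The key idea it demonstrates is the one stated above: depth is treated as a categorical latent variable, every intermediate activation is passed through a shared output head, and predictions marginalise over a learned variational posterior q(d) on depth rather than committing to a single network complexity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DUN(nn.Module):
    """Minimal Depth Uncertainty Network sketch: predictions are
    marginalised over a categorical posterior q(d) on network depth."""

    def __init__(self, in_dim, hidden_dim, out_dim, max_depth):
        super().__init__()
        self.input_layer = nn.Linear(in_dim, hidden_dim)
        # One residual block per candidate depth (illustrative architecture).
        self.blocks = nn.ModuleList(
            [nn.Linear(hidden_dim, hidden_dim) for _ in range(max_depth)]
        )
        # A single shared output head applied to every intermediate activation.
        self.output_layer = nn.Linear(hidden_dim, out_dim)
        # Unnormalised log q(d): variational posterior over depths 0..max_depth,
        # learned jointly with the network weights.
        self.q_logits = nn.Parameter(torch.zeros(max_depth + 1))

    def forward(self, x):
        h = F.relu(self.input_layer(x))
        logits_per_depth = [self.output_layer(h)]  # depth 0
        for block in self.blocks:
            h = h + F.relu(block(h))               # residual connection
            logits_per_depth.append(self.output_layer(h))
        # Shape: (n_depths, batch, out_dim)
        return torch.stack(logits_per_depth)

    def predict(self, x):
        """Marginal predictive: log sum_d q(d) * p(y | x, d)."""
        log_q = F.log_softmax(self.q_logits, dim=0)
        log_p = F.log_softmax(self.forward(x), dim=-1)
        return torch.logsumexp(log_q[:, None, None] + log_p, dim=0)

def elbo_loss(model, x, y, n_train):
    """Negative per-batch ELBO: expected log-likelihood under q(d)
    minus KL(q(d) || uniform prior), rescaled to the full dataset."""
    log_q = F.log_softmax(model.q_logits, dim=0)
    log_p = F.log_softmax(model(x), dim=-1)        # (n_depths, batch, classes)
    ll_per_depth = log_p.gather(
        -1, y.expand(log_p.shape[0], -1).unsqueeze(-1)
    ).squeeze(-1).sum(dim=1)                       # (n_depths,)
    expected_ll = (log_q.exp() * ll_per_depth).sum()
    n_depths = log_q.shape[0]
    kl = (log_q.exp() * (log_q + torch.log(torch.tensor(float(n_depths))))).sum()
    # Batch likelihood is scaled up to the dataset; KL is counted once.
    return -(expected_ll * n_train / x.shape[0] - kl)
```

Because the posterior over depth is inferred from the data, early active learning rounds with few labelled points can concentrate q(d) on shallow depths, while later rounds with more data can shift mass toward deeper, more flexible configurations, which is how the complexity adapts as the dataset grows.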