Dropout is conventionally used during the training phase as a regularization method and for quantifying uncertainty in deep learning. We propose to use dropout during both training and inference, averaging multiple predictions to improve accuracy while reducing and quantifying uncertainty. The results are evaluated for fractional anisotropy (FA) and mean diffusivity (MD) maps obtained from scans with only three diffusion directions. With our method, accuracy can be improved significantly compared to network outputs without dropout, especially when the training dataset is small. Moreover, confidence maps are generated, which may aid in the diagnosis of unseen pathology or artifacts.
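To make the inference procedure concrete, the following is a minimal sketch of dropout-enabled prediction averaging, assuming a PyTorch network that contains dropout layers; the function name `mc_dropout_predict` and the sample count are illustrative choices, not taken from the paper, and the per-voxel standard deviation stands in for the confidence map described above.

```python
import torch
import torch.nn as nn

def mc_dropout_predict(model: nn.Module, x: torch.Tensor, n_samples: int = 20):
    """Average several stochastic forward passes with dropout kept active;
    return the mean prediction and a per-voxel uncertainty (std) map."""
    model.eval()
    # Re-enable only the dropout layers, leaving e.g. batch-norm in eval mode.
    for m in model.modules():
        if isinstance(m, (nn.Dropout, nn.Dropout2d, nn.Dropout3d)):
            m.train()
    with torch.no_grad():
        samples = torch.stack([model(x) for _ in range(n_samples)], dim=0)
    return samples.mean(dim=0), samples.std(dim=0)
```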