COVID-19, due to its accelerated spread, has created the need for assistive tools that enable faster diagnosis alongside typical lab swab testing. Chest X-rays of COVID cases tend to show changes in the lungs, such as ground-glass opacities and peripheral consolidations, which can be detected by deep neural networks. However, traditional convolutional networks produce point-estimate predictions and fail to capture uncertainty, which makes them less reliable for clinical adoption. Several works have predicted COVID-positive cases from chest X-rays, but little has been explored on quantifying the uncertainty of these predictions, interpreting that uncertainty, and decomposing it into model and data uncertainty. To address these needs, we develop a visualization framework for interpreting uncertainty and its components, with predictive uncertainty computed by a Bayesian convolutional neural network. The framework aims to explain the contribution of individual features in a chest X-ray image to the predictive uncertainty. Provided as an assistive tool, it can help the radiologist understand why the model arrived at a prediction and whether the regions of interest captured by the model for that prediction are of diagnostic significance. We demonstrate the usefulness of the tool for chest X-ray interpretation through several test cases from a benchmark dataset.
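The decomposition of predictive uncertainty into model (epistemic) and data (aleatoric) components is commonly obtained from Monte Carlo samples of a Bayesian network's predictive distribution (e.g. MC dropout). The abstract does not spell out the exact formulation used, but a minimal sketch of the standard entropy-based split, with an illustrative `decompose_uncertainty` helper over assumed per-sample class probabilities, is:

```python
import numpy as np

def decompose_uncertainty(mc_probs):
    """Split predictive uncertainty into aleatoric (data) and
    epistemic (model) components from Monte Carlo samples.

    mc_probs: array of shape (T, C) -- T stochastic forward passes
    (e.g. MC dropout) of a Bayesian CNN over C classes for one image.
    """
    eps = 1e-12  # numerical guard for log(0)
    mean_p = mc_probs.mean(axis=0)
    # Total uncertainty: entropy of the averaged predictive distribution.
    total = -np.sum(mean_p * np.log(mean_p + eps))
    # Aleatoric (data) uncertainty: expected entropy of each sample.
    aleatoric = -np.mean(np.sum(mc_probs * np.log(mc_probs + eps), axis=1))
    # Epistemic (model) uncertainty: the remainder (mutual information).
    epistemic = total - aleatoric
    return total, aleatoric, epistemic

# Two confident but disagreeing samples: each pass is nearly certain,
# so data uncertainty is low, while the disagreement between passes
# drives the model-uncertainty term up.
disagreeing = np.array([[0.95, 0.05], [0.05, 0.95]])
t, a, e = decompose_uncertainty(disagreeing)
```

Under this split, an image whose samples all agree but are individually diffuse yields mostly aleatoric uncertainty, whereas disagreement across samples (as above) surfaces as epistemic uncertainty, which the visualization framework can then attribute to image regions.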