Deep learning methods, in particular convolutional neural networks, have emerged as a powerful tool in medical image computing tasks. While these complex models provide excellent performance, their black-box nature may hinder real-world adoption in high-stakes decision-making. In this paper, we propose an interactive system that leverages state-of-the-art interpretability techniques to assist radiologists with breast cancer screening. Our system integrates a deep learning model into the radiologists' workflow and provides novel interactions to promote understanding of the model's decision-making process. Moreover, we demonstrate that our system can progressively leverage user interactions to provide finer-grained explainability reports with little labeling overhead. Owing to the generic nature of the adopted interpretability technique, our system is domain-agnostic and can be applied to many different medical image computing tasks. It thus presents a novel perspective on how visual analytics can transform originally static interpretability techniques to augment human decision-making and promote the adoption of medical AI.