Fetal standard scan plane detection during 2-D mid-pregnancy examinations is a highly complex task that requires extensive medical knowledge and years of training. Although deep neural networks (DNNs) can assist inexperienced operators in these tasks, their lack of transparency and interpretability limits their application. While some researchers have been committed to visualizing the decision process of DNNs, most focus only on pixel-level features and do not take medical prior knowledge into account. In this work, we propose an interpretable framework based on key medical concepts, which provides explanations from the perspective of clinicians' cognition. Moreover, we utilize a concept-based graph convolutional network (GCN) to model the relationships between key medical concepts. Extensive experimental analysis on a private dataset shows that the proposed method provides clinicians with easy-to-understand insights into its reasoning results.
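The abstract only names the concept-based GCN; the sketch below is a minimal, hypothetical illustration (not the authors' implementation) of what such a component could look like, assuming each image is reduced to a fixed set of key-concept nodes with feature vectors and an adjacency matrix encoding concept relationships. The class names, dimensions, and pooling choice are all assumptions for illustration.

```python
# Minimal sketch of a concept-based GCN (assumed design, not the paper's code):
# concept nodes -> two Kipf & Welling style graph convolutions -> plane label.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ConceptGCNLayer(nn.Module):
    """One graph-convolution step over concept nodes."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x:   (num_concepts, in_dim) concept feature vectors
        # adj: (num_concepts, num_concepts) concept-relationship matrix
        adj = adj + torch.eye(adj.size(0))               # add self-loops
        deg_inv_sqrt = adj.sum(dim=1).pow(-0.5)          # D^{-1/2}
        norm_adj = deg_inv_sqrt.unsqueeze(1) * adj * deg_inv_sqrt.unsqueeze(0)
        return F.relu(self.linear(norm_adj @ x))         # aggregate + transform


class ConceptGCNClassifier(nn.Module):
    """Stacks two GCN layers and pools concept embeddings into a plane label."""

    def __init__(self, in_dim: int, hidden_dim: int, num_classes: int):
        super().__init__()
        self.gcn1 = ConceptGCNLayer(in_dim, hidden_dim)
        self.gcn2 = ConceptGCNLayer(hidden_dim, hidden_dim)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        h = self.gcn2(self.gcn1(x, adj), adj)
        return self.classifier(h.mean(dim=0))            # mean-pool over concepts


# Toy usage: 8 concepts, 64-dim features, 5 candidate standard planes.
x = torch.randn(8, 64)
adj = (torch.rand(8, 8) > 0.5).float()
logits = ConceptGCNClassifier(64, 32, 5)(x, adj)
```

Keeping the concept nodes explicit in this way is what allows the framework to attribute a prediction to individual medical concepts and their relationships rather than to raw pixels.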