Despite the progress in utilizing deep learning to automate chest radiograph interpretation and disease diagnosis tasks, monitoring change between sequential Chest X-rays (CXRs) has received limited attention. Tracking the progression of pathologies visualized through chest imaging poses several challenges: anatomical motion estimation and image registration, i.e., spatially aligning the two images, and modeling temporal dynamics in change detection. In this work, we propose CheXRelNet, a neural model that can track longitudinal pathology change relations between two CXRs. CheXRelNet incorporates local and global visual features, utilizes inter-image and intra-image anatomical information, and learns dependencies between anatomical region attributes to accurately predict disease change for a pair of CXRs. Experimental results on the Chest ImaGenome dataset show increased downstream performance compared to baselines. Code is available at https://github.com/PLAN-Lab/ChexRelNet.
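To make the architecture description concrete, below is a minimal PyTorch sketch of how the named components could fit together: shared encoders for global (whole-image) and local (anatomical-region) features from both time points, an attention layer over region nodes standing in for the intra-/inter-image dependency modeling, and a classifier over the fused representation. All module choices, shapes, and the three-way change label are illustrative assumptions, not the authors' implementation; see the repository linked above for the actual model.

```python
import torch
import torch.nn as nn


class CheXRelNetSketch(nn.Module):
    """Illustrative sketch of pairwise CXR change prediction.

    Hypothetical stand-ins: small CNN encoders and a single
    self-attention layer over region features. Not the released
    CheXRelNet code.
    """

    def __init__(self, num_regions: int, feat_dim: int = 256, num_classes: int = 3):
        super().__init__()
        # Global encoder: one shared CNN applied to each full image.
        self.global_encoder = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, feat_dim),
        )
        # Local encoder: shared across all anatomical-region crops.
        self.local_encoder = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, feat_dim),
        )
        # Attention over region nodes: lets regions interact within and
        # across the two images (a stand-in for graph-style dependency
        # modeling between anatomical region attributes).
        self.region_attn = nn.MultiheadAttention(feat_dim, num_heads=4, batch_first=True)
        self.classifier = nn.Linear(4 * feat_dim, num_classes)

    def forward(self, img_prev, img_curr, regions_prev, regions_curr):
        # img_*:     (B, 1, H, W) full CXRs at the two time points.
        # regions_*: (B, R, 1, h, w) anatomical-region crops per image.
        B, R = regions_prev.shape[:2]
        g_prev = self.global_encoder(img_prev)   # (B, D)
        g_curr = self.global_encoder(img_curr)   # (B, D)
        # Encode every region crop from both time points with one encoder.
        loc = torch.cat([regions_prev, regions_curr], dim=1)   # (B, 2R, 1, h, w)
        loc = self.local_encoder(loc.flatten(0, 1)).view(B, 2 * R, -1)
        # Regions attend to each other (intra- and inter-image).
        loc, _ = self.region_attn(loc, loc, loc)
        l_prev = loc[:, :R].mean(dim=1)           # (B, D)
        l_curr = loc[:, R:].mean(dim=1)           # (B, D)
        # Fuse global and local features from both time points.
        fused = torch.cat([g_prev, g_curr, l_prev, l_curr], dim=-1)
        # e.g., a three-way improved / no change / worsened label (assumed).
        return self.classifier(fused)
```

A usage call would pass a prior and a current CXR plus their region crops, e.g. `model(img_t0, img_t1, regions_t0, regions_t1)`, yielding per-pair change logits; the key design point the abstract emphasizes is that local region features and global image features are combined, rather than relying on whole-image comparison alone.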