Evaluating lesion progression and treatment response via longitudinal lesion tracking plays a critical role in clinical practice. Automated approaches for this task are motivated by the prohibitive labor cost and time consumption of manual lesion matching. Previous methods typically fail to integrate local and global information. In this work, we propose a transformer-based approach, termed Transformer Lesion Tracker (TLT). Specifically, we design a Cross Attention-based Transformer (CAT) to capture and combine both global and local information to enhance feature extraction. We also develop a Registration-based Anatomical Attention Module (RAAM) to introduce anatomical information to CAT so that it can focus on useful features. A Sparse Selection Strategy (SSS) is presented to select features and reduce the memory footprint of Transformer training. In addition, we use a global regression to further improve model performance. We conduct experiments on a public dataset to show the superiority of our method and find that our model reduces the average Euclidean center error by at least 14.3% (6 mm vs. 7 mm) compared with the state-of-the-art (SOTA). Code is available at https://github.com/TangWen920812/TLT.
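To make the core idea of CAT concrete, below is a minimal PyTorch sketch of cross-attention fusion between local and global feature tokens, in which local (lesion-centered) queries attend to global (whole-image) keys and values so the fused output carries both scales. This is an illustrative assumption, not the authors' implementation; the class name `CrossAttentionFusion`, its parameters, and the token shapes are hypothetical, and the official repository should be consulted for the actual model.

```python
# Minimal sketch (not the authors' code) of cross-attention fusion:
# local tokens query global tokens so the output combines both scales.
import torch
import torch.nn as nn

class CrossAttentionFusion(nn.Module):
    def __init__(self, dim: int, num_heads: int = 8):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, local_tokens: torch.Tensor, global_tokens: torch.Tensor) -> torch.Tensor:
        # local_tokens:  (B, N_local, dim)  -- features around the lesion
        # global_tokens: (B, N_global, dim) -- features from the whole image
        fused, _ = self.attn(query=local_tokens,
                             key=global_tokens,
                             value=global_tokens)
        # Residual connection keeps the original local features in the output.
        return self.norm(local_tokens + fused)

# Usage: fuse 64 local tokens with 512 global tokens of width 256.
fusion = CrossAttentionFusion(dim=256)
out = fusion(torch.randn(2, 64, 256), torch.randn(2, 512, 256))  # (2, 64, 256)
```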