Overall survival (OS) time is one of the most important evaluation indices for patients with glioma. Multi-modal Magnetic Resonance Imaging (MRI) scans play an important role in the study of glioma prognosis and OS time. Several deep learning-based methods have been proposed for OS time prediction from multi-modal MRI. However, these methods usually fuse multi-modal information either at the beginning or at the end of the network and lack fusion of features at different scales. In addition, fusion at the end of the network always pairs global with global features (e.g., a fully connected layer after concatenating global average pooling outputs) or local with local features (e.g., bilinear pooling), which loses the interaction between local and global information. In this paper, we propose a novel method for multi-modal OS time prediction of brain tumor patients, which introduces an improved non-local feature fusion module at different scales. Our method obtains a relative 8.76% improvement over the current state-of-the-art method (0.6989 vs. 0.6426 in accuracy). Extensive testing demonstrates that our method can also adapt to situations with missing modalities. The code is available at https://github.com/TangWen920812/mmmna-net.
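To make the "local with global" fusion idea concrete, the following is a minimal PyTorch-style sketch of a cross-modal non-local (attention) block, in which every local position of one MRI modality attends over all positions of another modality. This is not the authors' implementation (see the linked repository for the actual MMMNA-Net code); the module and variable names (`CrossModalNonLocal`, `feat_a`, `feat_b`) are illustrative assumptions.

```python
# Hypothetical sketch of cross-modal non-local fusion; not the MMMNA-Net code.
import torch
import torch.nn as nn


class CrossModalNonLocal(nn.Module):
    """Attention-style fusion between two 3D feature maps of the same scale."""

    def __init__(self, channels: int, reduced: int = None):
        super().__init__()
        reduced = reduced or channels // 2
        self.query = nn.Conv3d(channels, reduced, kernel_size=1)  # from modality A
        self.key = nn.Conv3d(channels, reduced, kernel_size=1)    # from modality B
        self.value = nn.Conv3d(channels, reduced, kernel_size=1)  # from modality B
        self.out = nn.Conv3d(reduced, channels, kernel_size=1)

    def forward(self, feat_a: torch.Tensor, feat_b: torch.Tensor) -> torch.Tensor:
        b, c, d, h, w = feat_a.shape
        q = self.query(feat_a).flatten(2).transpose(1, 2)  # (B, N, C')
        k = self.key(feat_b).flatten(2)                    # (B, C', N)
        v = self.value(feat_b).flatten(2).transpose(1, 2)  # (B, N, C')

        # Each local position of modality A is re-weighted by its similarity
        # to all positions of modality B, i.e. local features are fused with
        # global cross-modal context rather than only global-global or
        # local-local pairings.
        attn = torch.softmax(q @ k / (q.shape[-1] ** 0.5), dim=-1)  # (B, N, N)
        fused = (attn @ v).transpose(1, 2).reshape(b, -1, d, h, w)
        return feat_a + self.out(fused)  # residual connection


if __name__ == "__main__":
    block = CrossModalNonLocal(channels=32)
    t1 = torch.randn(1, 32, 8, 16, 16)
    flair = torch.randn(1, 32, 8, 16, 16)
    print(block(t1, flair).shape)  # torch.Size([1, 32, 8, 16, 16])
```

In the paper's setting such a block would be applied at several scales of the encoder so that multi-modal features are fused throughout the network rather than only at its beginning or end.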