Automatic tumor or lesion segmentation is a crucial step in medical image analysis for computer-aided diagnosis. Although existing methods based on Convolutional Neural Networks (CNNs) have achieved state-of-the-art performance, many challenges remain in medical tumor segmentation. This is because, while the human visual system can detect symmetries in 2D images effectively, regular CNNs exploit only translation invariance, overlooking further inherent symmetries in medical images such as rotations and reflections. To address this problem, we propose a novel group equivariant segmentation framework that encodes these inherent symmetries to learn more precise representations. First, kernel-based equivariant operations are devised for each orientation, effectively bridging the gap in learning symmetries left by existing approaches. Then, to keep the segmentation network globally equivariant, we design distinctive group layers with layer-wise symmetry constraints. Finally, extensive experiments on real-world clinical data demonstrate that the resulting Group Equivariant Res-UNet (named GER-UNet) outperforms its regular CNN-based counterpart and state-of-the-art segmentation methods on hepatic tumor segmentation, COVID-19 lung infection segmentation, and retinal vessel detection. More importantly, the newly built GER-UNet also shows potential for reducing sample complexity and filter redundancy, upgrading current segmentation CNNs, and delineating organs in other medical imaging modalities.
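The "kernel-based equivariant operations on each orientation" described above follow the general idea of group convolutions: correlating the input with every rotated copy of a kernel so that rotating the input produces a predictably rotated-and-permuted output. The following is a minimal NumPy sketch of a p4 (90° rotation) lifting convolution with a numerical check of that equivariance property; the function names and the simplified single-channel form are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def corr2d(f, k):
    """Valid 2D cross-correlation of image f with kernel k."""
    kh, kw = k.shape
    H, W = f.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(f[i:i + kh, j:j + kw] * k)
    return out

def p4_lifting_conv(f, k):
    """Correlate f with all four 90-degree rotations of k,
    producing one feature map per orientation (a p4 lifting layer)."""
    return np.stack([corr2d(f, np.rot90(k, s)) for s in range(4)])

rng = np.random.default_rng(0)
f = rng.standard_normal((7, 7))   # toy single-channel image
k = rng.standard_normal((3, 3))   # toy kernel

out = p4_lifting_conv(f, k)
out_rot = p4_lifting_conv(np.rot90(f), k)  # rotate the input by 90 degrees

# Equivariance: rotating the input rotates each feature map and
# cyclically shifts the orientation channels.
for s in range(4):
    assert np.allclose(out_rot[s], np.rot90(out[(s - 1) % 4]))
```

Stacking such layers while preserving this orientation-channel structure is what the layer-wise symmetry constraints in the proposed group layers enforce, keeping the whole network globally equivariant.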