Traditional change detection methods based on convolutional neural networks (CNNs) face challenges from speckle noise and deformation sensitivity in synthetic aperture radar (SAR) images. To mitigate these issues, we propose a Multiscale Capsule Network (Ms-CapsNet) to extract discriminative information between changed and unchanged pixels. On the one hand, the capsule module is employed to exploit the spatial relationships among features, so that equivariant properties can be achieved by aggregating features from different positions. On the other hand, an adaptive fusion convolution (AFC) module is designed for the proposed Ms-CapsNet to capture higher-level semantic features for the primary capsules. Features extracted by the AFC module significantly improve robustness to speckle noise. The effectiveness of the proposed Ms-CapsNet is verified on three real SAR datasets, and comparison experiments with four state-of-the-art methods demonstrate the efficiency of the proposed method. Our code is available at https://github.com/summitgao/SAR_CD_MS_CapsNet.
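To make the two components named above concrete, the following is a minimal PyTorch-style sketch of (1) an adaptive fusion convolution block that merges multiscale features with learned soft weights, and (2) a primary-capsule layer that groups feature maps into capsule vectors via the squash nonlinearity. The module names (AdaptiveFusionConv, PrimaryCapsules), kernel choices, and layer sizes are illustrative assumptions, not the authors' exact configuration from the repository above.

```python
# Sketch of an AFC-style multiscale fusion block and a primary-capsule layer.
# Hyperparameters and structure are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F


def squash(s, dim=-1, eps=1e-8):
    """Squash nonlinearity: preserves vector orientation, maps norm into [0, 1)."""
    norm_sq = (s ** 2).sum(dim=dim, keepdim=True)
    return (norm_sq / (1.0 + norm_sq)) * s / torch.sqrt(norm_sq + eps)


class AdaptiveFusionConv(nn.Module):
    """Multiscale convolution branches fused by learned softmax weights."""

    def __init__(self, in_ch, out_ch, kernel_sizes=(1, 3, 5)):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv2d(in_ch, out_ch, k, padding=k // 2) for k in kernel_sizes
        )
        # One scalar gate per scale, normalized at forward time.
        self.gates = nn.Parameter(torch.zeros(len(kernel_sizes)))

    def forward(self, x):
        w = torch.softmax(self.gates, dim=0)
        return sum(w[i] * F.relu(branch(x)) for i, branch in enumerate(self.branches))


class PrimaryCapsules(nn.Module):
    """Converts a feature map into capsule vectors of dimension `caps_dim`."""

    def __init__(self, in_ch, num_caps=8, caps_dim=8):
        super().__init__()
        self.caps_dim = caps_dim
        self.conv = nn.Conv2d(in_ch, num_caps * caps_dim, kernel_size=3, padding=1)

    def forward(self, x):
        u = self.conv(x)                      # (B, num_caps*caps_dim, H, W)
        u = u.view(u.size(0), -1, self.caps_dim)  # (B, num_caps*H*W, caps_dim)
        return squash(u)


if __name__ == "__main__":
    # A 2-channel input, e.g. a stacked pair of co-registered SAR patches.
    patch = torch.randn(4, 2, 32, 32)
    feats = AdaptiveFusionConv(in_ch=2, out_ch=32)(patch)
    caps = PrimaryCapsules(in_ch=32)(feats)
    print(caps.shape)  # torch.Size([4, 8192, 8])
```

In this sketch the capsule vectors would subsequently be routed to class capsules whose lengths indicate "changed" versus "unchanged"; that routing stage and the multiscale aggregation across positions follow the standard capsule-network formulation and are omitted here for brevity.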