Deep learning-based pavement crack detection methods often require large-scale labels with detailed crack location information to learn accurate predictions. In practice, however, crack locations are difficult to annotate manually due to the diverse visual patterns of pavement cracks. In this paper, we propose a Deep Domain Adaptation-based Crack Detection Network (DDACDN), which learns to exploit source-domain knowledge to predict multi-category crack location information in the target domain, where only image-level labels are available. Specifically, DDACDN first extracts crack features from both the source and target domains with a two-branch, weight-shared backbone network. To achieve cross-domain adaptation, an intermediate domain is then constructed by aggregating three-scale features from the feature space of each domain, adapting the crack features from the source domain to the target domain. Finally, the network incorporates the knowledge of both domains and is trained to recognize and localize pavement cracks. To facilitate accurate training and validation for domain adaptation, we use two challenging pavement crack datasets, CQU-BPDD and RDD2020. Furthermore, we construct a new large-scale Bituminous Pavement Multi-label Disease Dataset, named CQU-BPMDD, which contains 38,994 high-resolution pavement disease images, to further evaluate the robustness of our model. Extensive experiments demonstrate that DDACDN outperforms state-of-the-art pavement crack detection methods in predicting crack locations in the target domain.
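The two core ideas sketched in the abstract — a weight-shared backbone applied to both domains, and an intermediate domain built by aggregating multi-scale features — can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the dimensions, the linear-map stand-in for the CNN backbone, and the averaging-based aggregation are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def shared_backbone(x, weights):
    # Weight-shared feature extractor: the SAME weights are applied to
    # source- and target-domain inputs (a linear map per scale stands
    # in for the CNN stages here).
    return [np.tanh(x @ w) for w in weights]  # three feature scales

# Hypothetical dimensions: 3 scales, input dim 16, feature dim 8 per scale.
weights = [rng.standard_normal((16, 8)) * 0.1 for _ in range(3)]

src = rng.standard_normal((4, 16))  # source-domain batch (location-labeled)
tgt = rng.standard_normal((4, 16))  # target-domain batch (image-level labels)

src_feats = shared_backbone(src, weights)
tgt_feats = shared_backbone(tgt, weights)

def intermediate_domain(fs, ft):
    # Aggregate the three-scale features of both domains into one
    # intermediate representation bridging source and target
    # (simple per-scale averaging, an assumed aggregation rule).
    per_scale = [0.5 * (a + b) for a, b in zip(fs, ft)]
    return np.concatenate(per_scale, axis=1)

bridge = intermediate_domain(src_feats, tgt_feats)
print(bridge.shape)  # (4, 24): batch of 4, three 8-dim scales concatenated
```

Because the backbone weights are shared, gradients from both domains would update the same parameters during training, which is what allows source-domain localization knowledge to transfer to the image-level-labeled target domain.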