Recently, fast arbitrary-shaped text detection has become an attractive research topic. However, most existing methods are non-real-time, which limits their deployment in intelligent systems. Although a few real-time text detection methods have been proposed, their detection accuracy lags far behind that of non-real-time methods. To improve detection accuracy and speed simultaneously, we propose a novel fast and accurate text detection framework, namely CM-Net, which is constructed based on a new text representation method and a multi-perspective feature (MPF) module. The former can fit arbitrary-shaped text contours with a concentric mask (CM) in an efficient and robust way. The latter encourages the network to learn more CM-related discriminative features from multiple perspectives and introduces no extra computational cost. Benefiting from the advantages of CM and MPF, the proposed CM-Net only needs to predict one CM per text instance to rebuild the text contour, and achieves the best balance between detection accuracy and speed compared with previous works. Moreover, to ensure that multi-perspective features are effectively learned, a multi-factor constraints loss is proposed. Extensive experiments demonstrate that the proposed CM is efficient and robust for fitting arbitrary-shaped text instances, and also validate the effectiveness of the MPF module and the constraints loss for learning discriminative text features. Furthermore, experimental results show that the proposed CM-Net is superior to existing state-of-the-art (SOTA) real-time text detection methods in both detection speed and accuracy on the MSRA-TD500, CTW1500, Total-Text, and ICDAR2015 datasets.
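To make the contour-rebuilding idea concrete, the following is a minimal, hypothetical sketch of how a single predicted concentric mask could be expanded back into a text contour. It assumes the CM is a shrunken binary region of the text instance and that a per-instance expansion distance is available; the function name, the Chebyshev-distance dilation, and the boundary-tracing step are illustrative assumptions, not the actual CM-Net reconstruction procedure described in the paper.

```python
import numpy as np

def rebuild_contour_from_cm(cm_mask: np.ndarray, expand_dist: int) -> np.ndarray:
    """Hypothetical sketch: dilate a predicted concentric (shrunken) mask by
    `expand_dist` pixels to approximate the full text region, then trace its
    boundary. The real CM-Net rebuilding step may differ."""
    h, w = cm_mask.shape
    # Brute-force dilation: a pixel belongs to the text region if any
    # mask pixel lies within `expand_dist` (Chebyshev distance).
    ys, xs = np.nonzero(cm_mask)
    region = np.zeros_like(cm_mask, dtype=bool)
    for y, x in zip(ys, xs):
        y0, y1 = max(0, y - expand_dist), min(h, y + expand_dist + 1)
        x0, x1 = max(0, x - expand_dist), min(w, x + expand_dist + 1)
        region[y0:y1, x0:x1] = True
    # Boundary pixels: region pixels with at least one non-region 4-neighbour.
    padded = np.pad(region, 1, constant_values=False)
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                padded[1:-1, :-2] & padded[1:-1, 2:])
    boundary = region & ~interior
    return np.argwhere(boundary)  # (N, 2) array of (row, col) contour points

# Toy usage: a small elliptical "concentric mask" expanded by 3 pixels.
yy, xx = np.mgrid[0:40, 0:80]
cm = ((xx - 40) ** 2 / 25 ** 2 + (yy - 20) ** 2 / 6 ** 2) < 1.0
contour_pts = rebuild_contour_from_cm(cm, expand_dist=3)
print(contour_pts.shape)
```

Because only one mask per instance is needed, such a reconstruction avoids the multi-stage post-processing used by many segmentation-based detectors, which is consistent with the speed advantage claimed above.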