In object detection, non-maximum suppression (NMS) methods are widely adopted to remove horizontal duplicates among dense detection boxes and generate the final object instances. However, due to the degraded quality of dense detection boxes and the lack of explicit exploration of context information, existing NMS methods based on simple intersection-over-union (IoU) metrics tend to underperform on multi-oriented and long objects. In contrast to general NMS methods that rely on duplicate removal, we propose a novel graph fusion network, named GFNet, for multi-oriented object detection. Our GFNet is extensible and adaptively fuses dense detection boxes to detect more accurate and holistic multi-oriented object instances. Specifically, we first adopt a locality-aware clustering algorithm to group dense detection boxes into different clusters, and construct an instance sub-graph for the detection boxes belonging to each cluster. We then propose a graph-based fusion network built on a Graph Convolutional Network (GCN) that learns to reason over and fuse the detection boxes into final instance boxes. Extensive experiments on publicly available multi-oriented text datasets (MSRA-TD500, ICDAR2015, ICDAR2017-MLT) and a multi-oriented object dataset (DOTA) verify the effectiveness and robustness of our method against general NMS methods in multi-oriented object detection.
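To make the described pipeline concrete, the following is a minimal Python sketch of the three stages outlined above: clustering dense boxes, building a per-cluster instance sub-graph, and fusing boxes with a GCN layer. It is an illustrative approximation under simplifying assumptions, not the authors' implementation: the greedy IoU-based grouping, the fully connected sub-graph, and the single-layer GCN with mean pooling (`locality_aware_clustering`, `build_subgraph`, `GraphFusion`) are all hypothetical stand-ins for the components named in the abstract.

```python
# Illustrative sketch only; component names and rules are assumptions, not GFNet itself.
import numpy as np
import torch
import torch.nn as nn


def box_iou(a: np.ndarray, b: np.ndarray) -> float:
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)


def locality_aware_clustering(boxes: np.ndarray, iou_thr: float = 0.3) -> list:
    """Greedily group dense detection boxes: a box joins a cluster if it
    overlaps any member above iou_thr (stand-in for locality-aware clustering)."""
    clusters = []
    for i, box in enumerate(boxes):
        for cluster in clusters:
            if any(box_iou(box, boxes[j]) > iou_thr for j in cluster):
                cluster.append(i)
                break
        else:
            clusters.append([i])
    return clusters


def build_subgraph(cluster: list) -> torch.Tensor:
    """Symmetrically normalized adjacency (with self-loops) of one instance
    sub-graph, fully connecting the boxes of a cluster."""
    n = len(cluster)
    adj = torch.ones(n, n)
    deg_inv_sqrt = adj.sum(dim=1).pow(-0.5)
    return deg_inv_sqrt.unsqueeze(1) * adj * deg_inv_sqrt.unsqueeze(0)


class GraphFusion(nn.Module):
    """One GCN layer followed by mean pooling that regresses a fused
    instance box from the per-box features of a sub-graph."""

    def __init__(self, in_dim: int = 4, hidden: int = 32):
        super().__init__()
        self.gcn = nn.Linear(in_dim, hidden)
        self.head = nn.Linear(hidden, 4)  # fused (x1, y1, x2, y2)

    def forward(self, feats: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        h = torch.relu(adj @ self.gcn(feats))  # propagate over the sub-graph
        return self.head(h.mean(dim=0))        # pool to one instance box


# Usage: dense boxes -> clusters -> per-cluster graph fusion.
dense = np.array([[10, 10, 50, 30], [12, 11, 52, 31], [200, 40, 260, 70]],
                 dtype=np.float32)
model = GraphFusion()
for cluster in locality_aware_clustering(dense):
    adj = build_subgraph(cluster)
    feats = torch.from_numpy(dense[cluster])
    fused_box = model(feats, adj)
```

In this sketch the per-box features are simply the box coordinates; in practice richer features (e.g. geometry plus appearance) would be fed to the fusion network, and the fused output would be a multi-oriented instance box rather than an axis-aligned one.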