Learning pyramidal feature representations is crucial for recognizing object instances at different scales. Feature Pyramid Network (FPN) is the classic architecture for building a feature pyramid with high-level semantics at every level. However, intrinsic defects in feature extraction and fusion prevent FPN from aggregating more discriminative features. In this work, we propose the Attention Aggregation based Feature Pyramid Network (A^2-FPN), which improves multi-scale feature learning through attention-guided feature aggregation. In feature extraction, it extracts discriminative features by collecting and distributing multi-level global context features, and mitigates the semantic information loss caused by drastic channel reduction. In feature fusion, it aggregates complementary information from adjacent features to generate location-wise reassembly kernels for content-aware sampling, and employs channel-wise reweighting to enhance semantic consistency before element-wise addition. A^2-FPN yields consistent gains across different instance segmentation frameworks. Replacing FPN with A^2-FPN in Mask R-CNN improves mask AP by 2.1% and 1.6% with ResNet-50 and ResNet-101 backbones, respectively. Moreover, A^2-FPN achieves improvements of 2.0% and 1.4% mask AP when integrated into strong baselines such as Cascade Mask R-CNN and Hybrid Task Cascade.
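To make the fusion step concrete, below is a minimal PyTorch sketch of the channel-wise reweighting idea mentioned above: per-channel weights are predicted from pooled global context and applied to the upsampled top-down feature before element-wise addition with the lateral feature. The module and parameter names (ChannelReweightedFusion, reduction) are illustrative assumptions, not the authors' implementation, and the location-wise reassembly kernels are omitted for brevity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ChannelReweightedFusion(nn.Module):
    """Hypothetical sketch: channel-wise reweighting before element-wise addition."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        # Squeeze-and-excitation style gate that predicts per-channel weights
        # from the globally pooled sum of the two input features.
        self.gate = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, top_down: torch.Tensor, lateral: torch.Tensor) -> torch.Tensor:
        # Upsample the coarser top-down feature to the lateral resolution.
        top_down = F.interpolate(top_down, size=lateral.shape[-2:], mode="nearest")
        # Pooled global context guides channel-wise reweighting.
        context = (top_down + lateral).mean(dim=(2, 3))            # (N, C)
        weights = self.gate(context).unsqueeze(-1).unsqueeze(-1)   # (N, C, 1, 1)
        # Reweight for semantic consistency, then fuse by element-wise addition.
        return weights * top_down + lateral

if __name__ == "__main__":
    p5 = torch.randn(2, 256, 16, 16)   # coarser pyramid level
    c4 = torch.randn(2, 256, 32, 32)   # lateral feature
    fused = ChannelReweightedFusion(256)(p5, c4)
    print(fused.shape)  # torch.Size([2, 256, 32, 32])
```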