This paper proposes an Agile Aggregating Multi-Level feaTure framework (Agile Amulet) for salient object detection. The Agile Amulet builds on previous works that predict saliency maps from multi-level convolutional features. Compared to those works, Agile Amulet employs several key innovations that improve training and testing speed while also increasing prediction accuracy. More specifically, we first introduce a contextual attention module that can rapidly highlight the most salient objects or regions with contextual pyramids. It thus effectively guides the learning of low-layer convolutional features and tells the backbone network where to look. The contextual attention module is a fully convolutional mechanism that simultaneously learns complementary features and predicts saliency scores at each pixel. In addition, we propose a novel method to aggregate multi-level deep convolutional features. As a result, we are able to use the integrated side-output features of pre-trained convolutional networks alone, which significantly reduces the number of model parameters, yielding a model size of 67 MB, about half that of Amulet. Compared to other deep-learning-based saliency methods, Agile Amulet is much more lightweight, runs faster (30 fps, in real time) and achieves higher performance on seven public benchmarks in terms of both quantitative and qualitative evaluation. A minimal sketch of the two core ideas follows.
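To make the abstract's two ideas concrete, below is a minimal, hypothetical PyTorch sketch of a contextual attention module built on a pooling pyramid, used to reweight low-layer features. The pool scales, channel widths, and module names are illustrative assumptions, not the paper's actual configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ContextualAttention(nn.Module):
    """Hypothetical sketch: build a contextual pyramid over a high-level
    feature map and predict a per-pixel attention (saliency) map.
    Pool scales and channel widths are guesses, not the paper's values."""
    def __init__(self, in_ch, pool_scales=(1, 2, 4, 8)):
        super().__init__()
        # One branch per pyramid scale: pool, reduce channels, activate.
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.AdaptiveAvgPool2d(s),
                nn.Conv2d(in_ch, in_ch // 4, kernel_size=1),
                nn.ReLU(inplace=True),
            )
            for s in pool_scales
        ])
        # Fuse the original features with the upsampled pyramid context,
        # then predict a single-channel per-pixel saliency score.
        self.fuse = nn.Conv2d(in_ch + (in_ch // 4) * len(pool_scales),
                              in_ch, kernel_size=3, padding=1)
        self.score = nn.Conv2d(in_ch, 1, kernel_size=1)

    def forward(self, x):
        h, w = x.shape[-2:]
        ctx = [F.interpolate(b(x), size=(h, w), mode='bilinear',
                             align_corners=False) for b in self.branches]
        fused = F.relu(self.fuse(torch.cat([x] + ctx, dim=1)))
        return torch.sigmoid(self.score(fused))  # attention in [0, 1]

# Usage sketch: the attention map "tells the network where to look" by
# reweighting a low-layer side-output feature before aggregation.
high = torch.randn(1, 256, 14, 14)   # high-level backbone feature
low = torch.randn(1, 64, 112, 112)   # low-layer side-output feature
att = ContextualAttention(256)(high)
att_up = F.interpolate(att, size=low.shape[-2:], mode='bilinear',
                       align_corners=False)
low_guided = low * att_up            # attention-guided low-level feature
```

Aggregation of such guided side-output features across levels (e.g., by upsampling to a common resolution and concatenating) is what lets the framework rely on the pre-trained backbone's features alone, which is the source of the parameter savings claimed above.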