In current salient object detection networks, the most popular approach is the U-shape structure. However, its massive number of parameters consumes considerable computing and storage resources, making it infeasible to deploy on memory-limited devices. Shallower networks do not maintain the same accuracy as the U-shape structure, while deeper networks with more parameters converge slowly to a global minimum of the loss. To overcome these disadvantages, we propose a new deep convolutional network architecture with three contributions: (1) using smaller convolutional neural networks (CNNs) in our improved salient object features compression and reinforcement extraction module (ISFCREM) to compress the model and reduce its number of parameters; (2) introducing a channel attention mechanism in ISFCREM that weighs different channels to improve the ability of feature representation; (3) applying a new optimizer that accumulates long-term gradient information during training to adaptively tune the learning rate. Results on six widely used salient object detection datasets demonstrate that the proposed method can compress the model to nearly 1/3 of its original size without losing accuracy, while converging faster and more smoothly than other models. Our code is published at https://gitee.com/binzhangbinzhangbin/code-a-novel-attention-based-network-for-fast-salient-object-detection.git
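Contribution (2) names a channel attention mechanism, but the abstract does not specify its design. The following is a minimal sketch, assuming a squeeze-and-excitation style block in PyTorch; the class name `ChannelAttention` and the reduction ratio `r` are hypothetical, illustrating only the general per-channel reweighting idea, not the exact ISFCREM design.

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """SE-style channel attention sketch: learn a weight per feature channel.

    Assumption: the abstract only says channels are weighed; this block
    is one common way to do that, not necessarily the paper's design.
    """
    def __init__(self, channels: int, r: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)      # squeeze: global average pool per channel
        self.fc = nn.Sequential(                 # excitation: two-layer bottleneck
            nn.Linear(channels, channels // r),
            nn.ReLU(inplace=True),
            nn.Linear(channels // r, channels),
            nn.Sigmoid(),                        # per-channel weights in (0, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w                             # reweigh channels of the feature map
```

In such a block, informative channels receive weights near 1 and uninformative ones are suppressed, which matches the stated goal of improving feature representation.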
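Contribution (3) describes an optimizer that accumulates long-term gradient information to adaptively tune the learning rate; the abstract does not give its update rule. Below is a minimal RMSProp-style sketch in PyTorch under the assumption that "long-term gradient information" means an exponential moving average of squared gradients; the class name `LongTermAdaptiveSGD` and the hyper-parameters `beta` and `eps` are hypothetical, not the paper's actual method.

```python
import torch

class LongTermAdaptiveSGD(torch.optim.Optimizer):
    """Sketch: scale each parameter's step by accumulated gradient statistics.

    Assumption: keeps a running average of squared gradients (RMSProp-like)
    as a stand-in for the paper's unspecified long-term accumulation.
    """
    def __init__(self, params, lr=1e-3, beta=0.999, eps=1e-8):
        super().__init__(params, dict(lr=lr, beta=beta, eps=eps))

    @torch.no_grad()
    def step(self, closure=None):
        for group in self.param_groups:
            for p in group["params"]:
                if p.grad is None:
                    continue
                state = self.state[p]
                if "avg_sq" not in state:
                    state["avg_sq"] = torch.zeros_like(p)
                avg_sq = state["avg_sq"]
                # accumulate long-term (exponentially averaged) squared gradients
                avg_sq.mul_(group["beta"]).addcmul_(p.grad, p.grad, value=1 - group["beta"])
                # adaptively scaled update: large accumulated gradients shrink the step
                p.addcdiv_(p.grad, avg_sq.sqrt().add_(group["eps"]), value=-group["lr"])
```

Because the effective step size shrinks where gradients have historically been large, such accumulation tends to smooth convergence, consistent with the behavior the abstract reports.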