Recently, alpha matting has received considerable attention because of its usefulness in mobile applications such as selfie editing. Consequently, the limited computational resources of commercial portable devices have created a demand for lightweight alpha matting models. To this end, we propose a distillation-based channel pruning method for alpha matting networks. In the pruning step, we remove the channels of a student network that contribute least to mimicking the knowledge of a teacher network. The pruned lightweight student network is then trained with the same distillation loss. The lightweight alpha matting model produced by the proposed method outperforms existing lightweight methods. To demonstrate the superiority of our algorithm, we provide extensive quantitative and qualitative experiments with in-depth analyses. Furthermore, we demonstrate the versatility of the proposed distillation-based channel pruning method by applying it to semantic segmentation.
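As a rough illustration of the two steps summarized above (scoring channels by their contribution to mimicking the teacher, then retraining the pruned student with the same distillation loss), the following PyTorch sketch ranks the output channels of one student layer by how much zeroing each channel increases a feature-distillation loss against a fixed teacher. The zeroing-based importance score, the L2 feature loss, and all layer names are illustrative assumptions, not the paper's actual criterion.

```python
# Minimal sketch of distillation-aware channel pruning (assumed scoring rule,
# not the paper's exact method): a channel is important if zeroing it makes the
# student's features harder to match to the teacher's.
import torch
import torch.nn as nn

def distill_loss(student_feat, teacher_feat):
    # Simple L2 feature-matching loss; the paper's distillation loss may differ.
    return (student_feat - teacher_feat).pow(2).mean()

@torch.no_grad()
def channel_importance(student_feat, teacher_feat):
    """Importance of each output channel = increase in the distillation loss
    when that channel's activation is zeroed out."""
    base = distill_loss(student_feat, teacher_feat)
    scores = []
    for c in range(student_feat.shape[1]):
        masked = student_feat.clone()
        masked[:, c] = 0.0
        scores.append(distill_loss(masked, teacher_feat) - base)
    return torch.stack(scores)

# Toy layers standing in for one stage of the student and teacher networks.
student_conv = nn.Conv2d(3, 16, 3, padding=1)
teacher_conv = nn.Conv2d(3, 16, 3, padding=1)
x = torch.randn(4, 3, 64, 64)

scores = channel_importance(student_conv(x), teacher_conv(x))
keep = torch.argsort(scores, descending=True)[:8]  # keep the 8 most useful channels

# Build the pruned layer by copying only the surviving filters.
pruned_conv = nn.Conv2d(3, len(keep), 3, padding=1)
pruned_conv.weight.data = student_conv.weight.data[keep].clone()
pruned_conv.bias.data = student_conv.bias.data[keep].clone()

# The pruned student would then be fine-tuned with the same distillation loss
# (with a projection layer if the student/teacher channel counts no longer match).
```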