This paper proposes content relationship distillation (CRD) to compress over-parameterized generative adversarial networks (GANs) so that they become serviceable on cutting-edge devices. In contrast to traditional instance-level distillation, we design a novel form of knowledge oriented toward GAN compression: we slice the contents of teacher outputs into multiple fine-grained granularities, such as row/column strips (global information) and image patches (local information), model the relationships among them, such as pairwise distances and triplet-wise angles, and encourage the student to capture these relationships within its own output contents. Built upon the proposed content-level distillation, we also deploy an online teacher discriminator, which keeps updating when co-trained with the teacher generator and is frozen when co-trained with the student generator, yielding better adversarial training. Extensive experiments on three benchmark datasets show that CRD achieves the largest complexity reduction on GANs while obtaining the best performance among existing methods. For example, we reduce the MACs of CycleGAN by around 40x and its parameters by over 80x, while obtaining an FID of 46.61 compared with 51.92 for the current state of the art. Code for this project is available at https://github.com/TheKernelZ/CRD.
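The content-relationship idea sketched in the abstract can be made concrete as follows. The sketch below is a minimal NumPy illustration under our own assumptions: the slicing scheme, the mean-normalized pairwise distances, the triplet-angle cosines, and the L1 matching loss are illustrative choices in the spirit of relational distillation, not the authors' actual implementation (function names such as `slice_contents` and `crd_loss` are hypothetical).

```python
import numpy as np

def slice_contents(img, patch=2):
    """Slice an image (C, H, W) into flattened content vectors:
    row strips and column strips (global) plus square patches (local)."""
    C, H, W = img.shape
    rows = [img[:, i, :].ravel() for i in range(H)]
    cols = [img[:, :, j].ravel() for j in range(W)]
    patches = [img[:, i:i + patch, j:j + patch].ravel()
               for i in range(0, H, patch)
               for j in range(0, W, patch)]
    return rows, cols, patches

def pairwise_distance_relation(vectors):
    """Pairwise Euclidean distances among content vectors,
    normalized by their mean (scale-invariant relation)."""
    V = np.stack(vectors)
    d = np.linalg.norm(V[:, None, :] - V[None, :, :], axis=-1)
    return d / (d[d > 0].mean() + 1e-8)

def triplet_angle_relation(vectors):
    """Cosines of the angles formed at each anchor vector by every
    (anchor, i, j) triplet of content vectors."""
    V = np.stack(vectors)
    diff = V[None, :, :] - V[:, None, :]              # diff[a, i] = v_i - v_a
    unit = diff / (np.linalg.norm(diff, axis=-1, keepdims=True) + 1e-8)
    return np.einsum('aid,ajd->aij', unit, unit)      # cosine at anchor a

def crd_loss(teacher_img, student_img, patch=2):
    """Match teacher/student distance and angle relations over each
    granularity (rows, columns, patches) with an L1 penalty."""
    loss = 0.0
    for t_set, s_set in zip(slice_contents(teacher_img, patch),
                            slice_contents(student_img, patch)):
        loss += np.abs(pairwise_distance_relation(t_set)
                       - pairwise_distance_relation(s_set)).mean()
        loss += np.abs(triplet_angle_relation(t_set)
                       - triplet_angle_relation(s_set)).mean()
    return loss
```

Because only relations among slices are matched (not raw pixels), the student is free to differ from the teacher in absolute values as long as its output contents preserve the same relative structure; the loss is exactly zero when student and teacher outputs coincide.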