In this paper, we investigate strong lottery tickets in generative models: subnetworks that achieve good generative performance without any weight update. Neural network pruning is a cornerstone of model compression, reducing the costs of computation and memory. Unfortunately, pruning generative models has not been extensively explored, and existing pruning algorithms suffer from excessive weight-training costs, performance degradation, limited generalizability, or complicated training procedures. To address these problems, we propose to find a strong lottery ticket via moment-matching scores. Our experimental results show that the discovered subnetwork can perform similarly to or better than the trained dense model even when only 10% of the weights remain. To the best of our knowledge, we are the first to show the existence of strong lottery tickets in generative models and to provide an algorithm for finding them stably. Our code and supplementary materials are publicly available.
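A strong lottery ticket is selected by scoring weights and keeping only the highest-scoring fraction, while the weights themselves stay at their random initialization. The following is a minimal generic sketch of this score-based top-k masking step; the score values here are random placeholders, not the paper's moment-matching scores, and all names are illustrative.

```python
import random

def strong_ticket_mask(scores, keep_ratio=0.1):
    """Binary mask keeping the top keep_ratio fraction of weights by score.

    Only the scores are learned during search; the underlying weights
    are never updated, which is what makes the ticket "strong".
    """
    k = max(1, round(keep_ratio * len(scores)))
    threshold = sorted(scores, reverse=True)[k - 1]
    return [1.0 if s >= threshold else 0.0 for s in scores]

random.seed(0)
weights = [random.gauss(0, 1) for _ in range(100)]  # frozen random init
scores = [random.gauss(0, 1) for _ in range(100)]   # stand-in importance scores
mask = strong_ticket_mask(scores, keep_ratio=0.1)   # keep 10% of weights
subnetwork = [w * m for w, m in zip(weights, mask)] # masked subnetwork
```

The masked subnetwork retains exactly 10% of the weights; in the paper's setting, this sparse, untrained subnetwork is what matches or exceeds the trained dense model.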