The recently proposed Neural Local Lossless Compression (NeLLoC), which is based on a local autoregressive model, has achieved state-of-the-art (SOTA) out-of-distribution (OOD) generalization performance in the image compression task. In addition to encouraging OOD generalization, the local model also permits parallel inference in the decoding stage. In this paper, we propose a parallelization scheme for local autoregressive models. We discuss the practicalities of implementing this scheme, and provide experimental evidence of significant gains in compression runtime compared to the previous, non-parallel implementation.
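To make the parallelism concrete, the following is a minimal sketch, not the paper's exact scheme, of how a local receptive field enables wavefront (diagonal) parallel decoding. It assumes a local autoregressive model whose receptive field spans the previous row up to `w` columns to the right and the current row to the left of the target pixel; the names `wavefront_schedule`, `predict_pixel`, and the window parameter `w` are illustrative assumptions, not identifiers from the paper.

```python
import numpy as np

def wavefront_schedule(height, width, w):
    """Group pixel coordinates by the step t = i * (w + 1) + j.

    Under the locality assumption above, every pixel that (i, j) depends
    on has a strictly smaller step: a previous-row dependency (i-1, j')
    with j' <= j + w has step (i-1)*(w+1) + j' <= i*(w+1) + j - 1, and a
    same-row dependency has j' < j. Hence all pixels sharing a step can
    be decoded simultaneously.
    """
    steps = {}
    for i in range(height):
        for j in range(width):
            steps.setdefault(i * (w + 1) + j, []).append((i, j))
    return [steps[t] for t in sorted(steps)]

def predict_pixel(image, i, j):
    # Placeholder for the trained local autoregressive model; a real
    # decoder would evaluate the network on the pixel's local context.
    return 0

def decode_parallel(height, width, w):
    image = np.zeros((height, width), dtype=np.uint8)
    for group in wavefront_schedule(height, width, w):
        # In a real decoder the predictions within a group would run
        # concurrently, e.g. batched on a GPU; we loop here for clarity.
        for (i, j) in group:
            image[i, j] = predict_pixel(image, i, j)
    return image
```

Under this schedule the number of sequential decoding steps is roughly H * (w + 1) + W for an H x W image, rather than the H * W steps of fully sequential raster-scan decoding, which is the source of the runtime gains when the local window w is small.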