The recently proposed Neural Local Lossless Compression (NeLLoC), which is based on a local autoregressive model, has achieved state-of-the-art (SOTA) out-of-distribution (OOD) generalization performance in the image compression task. Besides encouraging OOD generalization, the local model also allows parallel inference in the decoding stage. In this paper, we propose two parallelization schemes for local autoregressive models. We discuss the practicalities of implementing the schemes and provide experimental evidence of significant gains in compression runtime compared to the previous, non-parallel implementation.
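To illustrate why a local autoregressive model admits parallel decoding, here is a minimal sketch of a wavefront (anti-diagonal) schedule, not the paper's exact schemes: because each pixel's context is bounded by a local horizon, pixel (i, j) can be assigned step i * (h + 1) + j, and every pixel it depends on lands in a strictly earlier step, so all pixels sharing a step can be decoded simultaneously. The horizon parameter `h` and the function name are assumptions for illustration.

```python
def wavefront_schedule(height, width, h):
    """Group pixel coordinates into parallel decoding steps for a local
    autoregressive model whose context extends at most `h` columns to the
    right on previous rows (hypothetical horizon parameter).

    Pixel (i, j) gets step i * (h + 1) + j; all of its dependencies fall
    in strictly earlier steps, so pixels sharing a step are independent.
    """
    steps = {}
    for i in range(height):
        for j in range(width):
            steps.setdefault(i * (h + 1) + j, []).append((i, j))
    return [steps[s] for s in sorted(steps)]

# Example: a 4x6 image with horizon h = 1 decodes in
# (4 - 1) * (1 + 1) + 6 = 12 wavefront steps instead of 24 sequential ones.
for t, group in enumerate(wavefront_schedule(4, 6, h=1)):
    print(t, group)
```

The key design point the sketch captures is that the number of decoding steps grows as O(H + W) rather than O(H * W), which is the source of the runtime gains the paper reports.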