Deep learning techniques have shown promising results in image compression, achieving competitive bitrates and reconstruction quality from compressed latents. However, while learned image compression has progressed toward higher peak signal-to-noise ratio (PSNR) and fewer bits per pixel (bpp), its robustness to adversarial images has received little attention. In this work, we investigate, for the first time, the robustness of image compression systems, where an imperceptible perturbation of the input image can cause a significant increase in the bitrate of its compressed latent. To characterize the robustness of state-of-the-art learned image compression, we mount white-box and black-box attacks. Our white-box attack applies the fast gradient sign method (FGSM) to the entropy estimate of the bitstream, which serves as a differentiable approximation of the bitrate. For the black-box attack, we propose DCT-Net, an architecturally simple, lightweight-to-train substitute model that simulates JPEG compression and enables fast adversarial transferability. Our results on six image compression models, each at six bitrate qualities (thirty-six models in total), show that they are surprisingly fragile: the white-box attack achieves up to a 56.326x increase in bpp and the black-box attack up to 1.947x. To improve robustness, we propose a novel compression architecture, factorAtn, which incorporates attention modules and a basic factorized entropy model, yielding a promising trade-off between rate-distortion performance and robustness to adversarial attacks that surpasses existing learned image compressors.
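The white-box attack described above can be sketched in miniature. The snippet below is a hypothetical toy, not the paper's implementation: it replaces the learned entropy model with a simple differentiable bitrate proxy `bitrate_proxy` (a linear function with an analytic gradient, standing in for autodiff through an encoder and entropy model) and applies one FGSM step in the direction that increases the estimated bitrate while keeping the perturbation within an epsilon ball.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a learned compressor's entropy model:
# a differentiable bitrate proxy R(x) = w . x. Real attacks would
# backpropagate through the encoder and entropy model instead.
w = rng.normal(size=64)

def bitrate_proxy(x):
    """Toy scalar 'bitrate estimate' for image x."""
    return float(w @ x)

def bitrate_grad(x):
    # Analytic gradient of the linear proxy; in practice obtained
    # via automatic differentiation.
    return w

def fgsm_bitrate_attack(x, eps=0.01):
    """One FGSM step that *increases* the estimated bitrate:
    x_adv = clip(x + eps * sign(dR/dx))."""
    x_adv = x + eps * np.sign(bitrate_grad(x))
    return np.clip(x_adv, 0.0, 1.0)  # keep a valid image range

x = rng.uniform(0.2, 0.8, size=64)   # toy "image" in [0, 1]
x_adv = fgsm_bitrate_attack(x)
print(bitrate_proxy(x), bitrate_proxy(x_adv))
```

Because the step follows the sign of the bitrate gradient, the proxy bitrate strictly increases while every pixel moves by at most `eps`, mirroring the imperceptibility constraint in the attack formulation.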