Adversarial attacks can mislead deep learning models into making false predictions by implanting small perturbations, imperceptible to the human eye, into the original input, which poses a serious security threat to computer vision systems based on deep learning. Compared with digital adversarial attacks, physical adversarial attacks are more realistic, because the perturbation is introduced before the input is captured and digitized inside the vision system. In this paper, we focus on physical adversarial attacks and further classify them into invasive and non-invasive attacks. Optical-based physical adversarial attack techniques (e.g., using light irradiation) belong to the non-invasive category. Because such perturbations closely resemble effects produced by natural environments in the real world, they are easily overlooked by humans; they are therefore highly inconspicuous and practical to execute, and can pose significant or even lethal threats to real systems. This paper surveys optical-based physical adversarial attack techniques against computer vision systems, with emphasis on introducing and discussing these techniques.
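To make the threat model concrete, the attack objective described above can be stated in the standard norm-bounded form widely used in the adversarial-attack literature (the notation below is a common convention assumed for illustration, not a formulation taken from this paper): given a model f, a clean input x with ground-truth label y, a classification loss \mathcal{L}, and a perturbation budget \epsilon, the adversary seeks

\[
\delta^{\ast} \;=\; \operatorname*{arg\,max}_{\|\delta\|_{p} \,\le\, \epsilon} \; \mathcal{L}\bigl(f(x + \delta),\, y\bigr),
\]

where the \ell_p bound \epsilon enforces imperceptibility in the digital setting. In the physical setting considered here, \delta is instead realized in the scene itself (e.g., by light irradiation) before the camera captures and digitizes the image, so the effective perturbation is further shaped by the imaging pipeline.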