Benefiting from the development of Deep Neural Networks, Multi-Object Tracking (MOT) has achieved remarkable progress. Currently, real-time Joint-Detection-Tracking (JDT) based MOT trackers are gaining increasing attention and have spawned many excellent models. However, the robustness of JDT trackers has rarely been studied, and attacking an MOT system is challenging because its mature association algorithms are designed to be robust against errors during tracking. In this work, we analyze the weaknesses of JDT trackers and propose a novel adversarial attack method, called Tracklet-Switch (TraSw), against the complete tracking pipeline of MOT. Specifically, a push-pull loss and a center leaping optimization are designed to generate adversarial examples for both the re-ID features and the object detections. By attacking only a few frames, TraSw can fool the tracker into losing the targets in subsequent frames. We evaluate our method on advanced deep trackers (i.e., FairMOT, JDE, and ByteTrack) using the MOT-Challenge datasets (i.e., 2DMOT15, MOT17, and MOT20). Experiments show that TraSw achieves a success rate of over 95% by attacking only five frames on average for the single-target attack, and a reasonably high success rate of over 80% for the multiple-target attack. The code is available at https://github.com/DerryHub/FairMOT-attack .
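The push-pull idea can be illustrated with a minimal PyTorch sketch. The function name, margin value, and cosine-similarity formulation below are illustrative assumptions rather than the paper's exact loss: the intent is to push the adversarial re-ID embedding away from the original tracklet's feature while pulling it toward a neighboring tracklet's feature, so that the association step switches the two identities.

```python
import torch
import torch.nn.functional as F

def push_pull_loss(adv_feat, target_feat, neighbor_feat, margin=0.5):
    """Hypothetical push-pull loss on re-ID embeddings (illustrative sketch).

    adv_feat:      embedding extracted from the adversarially perturbed frame
    target_feat:   the attacked target's original tracklet embedding (push away)
    neighbor_feat: a nearby tracklet's embedding (pull toward)
    """
    # Cosine similarity on L2-normalized embeddings.
    adv_feat = F.normalize(adv_feat, dim=-1)
    target_feat = F.normalize(target_feat, dim=-1)
    neighbor_feat = F.normalize(neighbor_feat, dim=-1)

    push = (adv_feat * target_feat).sum(dim=-1)    # similarity to drive down
    pull = (adv_feat * neighbor_feat).sum(dim=-1)  # similarity to drive up

    # Triplet-style margin: loss is zero once the adversarial embedding is
    # closer to the neighbor than to the original target by at least `margin`.
    return torch.clamp(push - pull + margin, min=0.0).mean()
```

Minimizing such a loss over the image perturbation (e.g., with projected gradient steps) would encourage the data-association stage to assign the target's identity to the neighboring tracklet, which matches the tracklet-switch behavior described above.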