Advanced Persistent Threats (APTs) are difficult to detect due to their long-term latency and covert, slow, multi-stage attack patterns. To tackle these issues, we propose TBDetector, a transformer-based method for APT attack detection. Considering that provenance graphs provide rich historical information and a powerful ability to correlate attack activities over time for identifying anomalous behavior, TBDetector employs provenance analysis for APT detection: it summarizes long-running system execution in a space-efficient manner and utilizes a transformer with a self-attention-based encoder-decoder to extract long-term contextual features of system states, enabling the detection of slow-acting attacks. Furthermore, we introduce anomaly scores to assess the abnormality of different system states, where each state is assigned an anomaly score computed from its similarity score and isolation score. To evaluate the effectiveness of the proposed method, we conducted experiments on five public datasets, i.e., streamspot, cadets, shellshock, clearscope, and wget_baseline. Experimental results and comparisons with state-of-the-art methods demonstrate the superior performance of our proposed method.
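To make the scoring idea concrete, the following is a minimal sketch of how a per-state anomaly score combining a similarity score and an isolation score might be computed. The function names, the use of cosine similarity against the encoder-decoder reconstruction, the IsolationForest-based isolation score, and the weighting parameter alpha are all illustrative assumptions, not the paper's exact formulation.

```python
# Hypothetical sketch: combine a similarity score and an isolation score into a
# per-state anomaly score. Not the paper's exact formulation.
import numpy as np
from sklearn.ensemble import IsolationForest


def similarity_score(state_embedding: np.ndarray, reconstruction: np.ndarray) -> float:
    """Cosine-distance between a state embedding and its transformer reconstruction
    (assumed proxy for how poorly the encoder-decoder models the state)."""
    cos = np.dot(state_embedding, reconstruction) / (
        np.linalg.norm(state_embedding) * np.linalg.norm(reconstruction) + 1e-12
    )
    return 1.0 - cos  # larger => less similar => more anomalous


def isolation_scores(embeddings: np.ndarray) -> np.ndarray:
    """Isolation-based anomaly scores for a batch of state embeddings."""
    forest = IsolationForest(random_state=0).fit(embeddings)
    # score_samples returns higher values for inliers; negate so larger => more anomalous
    return -forest.score_samples(embeddings)


def anomaly_scores(embeddings: np.ndarray, reconstructions: np.ndarray,
                   alpha: float = 0.5) -> np.ndarray:
    """Combine the two components into one anomaly score per system state.
    alpha is an assumed weighting hyperparameter."""
    sim = np.array([similarity_score(e, r)
                    for e, r in zip(embeddings, reconstructions)])
    iso = isolation_scores(embeddings)
    return alpha * sim + (1.0 - alpha) * iso
```

States whose score exceeds a chosen threshold would then be flagged as potentially part of an APT campaign; how the threshold and weighting are set depends on the concrete method.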