Detecting payment fraud in real-world banking streams requires models that can exploit both the order of events and the irregular time gaps between them. We introduce FraudTransformer, a sequence model that augments a vanilla GPT-style architecture with (i) a dedicated time encoder that embeds either absolute timestamps or inter-event gaps, and (ii) a learned positional encoder that preserves relative order. Experiments on a large industrial dataset spanning tens of millions of transactions and auxiliary events show that FraudTransformer surpasses three strong classical baselines (Logistic Regression, XGBoost, and LightGBM) as well as transformer ablations that omit either the time or the positional component. On the held-out test set it delivers the highest AUROC and PRAUC.
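As a minimal sketch of the kind of input layer the abstract describes, the PyTorch module below sums a token embedding, a learned positional embedding, and a time encoding obtained by linearly projecting a scalar time value (an absolute timestamp or an inter-event gap). All names, hyperparameters, and the linear time projection are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of a time-aware GPT input embedding; the real
# FraudTransformer components are not specified in this abstract.
import torch
import torch.nn as nn

class TimeAwareEmbedding(nn.Module):
    """Token + learned positional + time embeddings for a GPT-style stack."""
    def __init__(self, vocab_size: int, max_len: int, d_model: int):
        super().__init__()
        self.token_emb = nn.Embedding(vocab_size, d_model)  # event-type embedding
        self.pos_emb = nn.Embedding(max_len, d_model)       # learned positional encoder
        self.time_proj = nn.Linear(1, d_model)              # time encoder: projects a scalar
                                                            # timestamp or inter-event gap

    def forward(self, tokens: torch.Tensor, times: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, seq_len) int64 event ids
        # times:  (batch, seq_len) float, e.g. log(1 + gap_seconds) between events
        positions = torch.arange(tokens.size(1), device=tokens.device)
        return (
            self.token_emb(tokens)
            + self.pos_emb(positions)               # broadcasts over the batch dim
            + self.time_proj(times.unsqueeze(-1))   # embeds irregular time values
        )

# Usage: feed the summed embeddings into a causal transformer decoder.
emb = TimeAwareEmbedding(vocab_size=1000, max_len=512, d_model=64)
x = emb(torch.randint(0, 1000, (2, 16)), torch.rand(2, 16))
print(x.shape)  # torch.Size([2, 16, 64])
```

Summing the three embeddings keeps the backbone unchanged, so the time and positional components can be ablated independently, which matches the ablation comparison the abstract reports.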