Trajectory-User Linking (TUL), which links trajectories to the users who generate them, is a challenging problem due to the sparsity of check-in mobility data. Existing methods neglect either historical data or the rich contextual features in check-in records, resulting in poor performance on the TUL task. In this paper, we propose MainTUL, a novel mutual distillation learning network that solves the TUL problem for sparse check-in mobility data. Specifically, MainTUL consists of a Recurrent Neural Network (RNN) trajectory encoder that models the sequential patterns of the input trajectory and a temporal-aware Transformer trajectory encoder that captures long-term temporal dependencies in the corresponding augmented historical trajectories. The knowledge learned on historical trajectories is then transferred between the two trajectory encoders, guiding the learning of both to achieve mutual distillation of information. Experimental results on two real-world check-in mobility datasets demonstrate the superiority of MainTUL over state-of-the-art baselines. The source code of our model is available at https://github.com/Onedean/MainTUL.
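To make the mutual distillation scheme concrete, the following is a minimal PyTorch sketch of the dual-encoder setup described above: an RNN encoder for the input trajectory, a Transformer encoder for the augmented historical trajectory, and a symmetric distillation loss that lets each encoder's soft predictions guide the other. The class and function names, the temperature `T`, and the weighting `alpha` are illustrative assumptions, not the authors' exact implementation; see the released code for the actual model.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RNNTrajectoryEncoder(nn.Module):
    """Models sequential patterns of the input check-in trajectory (sketch)."""
    def __init__(self, input_dim, hidden_dim, num_users):
        super().__init__()
        self.lstm = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, num_users)

    def forward(self, x):                      # x: (batch, seq_len, input_dim)
        _, (h, _) = self.lstm(x)
        return self.classifier(h[-1])          # user logits

class TransformerTrajectoryEncoder(nn.Module):
    """Captures long-term dependencies over the augmented historical trajectory (sketch)."""
    def __init__(self, input_dim, num_users, nhead=4, num_layers=2):
        super().__init__()
        # input_dim must be divisible by nhead
        layer = nn.TransformerEncoderLayer(d_model=input_dim, nhead=nhead,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.classifier = nn.Linear(input_dim, num_users)

    def forward(self, x):                      # x: (batch, seq_len, input_dim)
        h = self.encoder(x).mean(dim=1)        # mean-pool over time steps
        return self.classifier(h)

def mutual_distillation_loss(logits_rnn, logits_tf, labels, T=2.0, alpha=0.5):
    """Hard-label cross-entropy for both encoders, plus symmetric KL terms
    so each encoder is also supervised by the other's softened predictions
    (illustrative loss; weighting and temperature are assumptions)."""
    ce = F.cross_entropy(logits_rnn, labels) + F.cross_entropy(logits_tf, labels)
    kl_r2t = F.kl_div(F.log_softmax(logits_tf / T, dim=-1),
                      F.softmax(logits_rnn.detach() / T, dim=-1),
                      reduction="batchmean") * T * T
    kl_t2r = F.kl_div(F.log_softmax(logits_rnn / T, dim=-1),
                      F.softmax(logits_tf.detach() / T, dim=-1),
                      reduction="batchmean") * T * T
    return ce + alpha * (kl_r2t + kl_t2r)
```

In this reading, `detach()` stops gradients through the "teacher" side of each KL term, so the distillation signal flows in both directions without either encoder chasing its own moving target.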