Predictive business process monitoring focuses on predicting future characteristics of a running process using event logs. Such foresight into process execution promises great potential for efficient operations, better resource management, and effective customer service. Deep learning-based approaches have been widely adopted in process mining to address the limitations of classical algorithms on multiple problems, especially the next-event and remaining-time prediction tasks. Nevertheless, designing a deep neural architecture that performs competitively across various tasks is challenging, as existing methods fail to capture long-range dependencies in the input sequences and perform poorly on lengthy process traces. In this paper, we propose ProcessTransformer, an approach for learning high-level representations from event logs with an attention-based network. Our model incorporates long-range memory and relies on a self-attention mechanism to establish dependencies between a multitude of event sequences and corresponding outputs. We evaluate the applicability of our technique on nine real event logs. We demonstrate that the transformer-based model outperforms several baselines from prior techniques, achieving an average accuracy above 80% on the task of predicting the next activity. Our method also performs competitively with the baselines on the tasks of predicting the event time and the remaining time of a running case.
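To make the described approach concrete, the following is a minimal sketch of a transformer-style next-activity predictor over tokenized event traces, written with TensorFlow/Keras. It is not the authors' exact architecture: the vocabulary size, prefix length, number of heads, and layer sizes are illustrative assumptions, and only a single self-attention block with a classification head is shown.

```python
# Minimal sketch (illustrative, not the authors' exact configuration) of a
# transformer-based next-activity predictor for event-log prefixes.
import tensorflow as tf
from tensorflow.keras import layers

NUM_ACTIVITIES = 32   # assumed activity vocabulary size (incl. padding/end tokens)
MAX_PREFIX_LEN = 40   # assumed maximum length of a running-case prefix
EMBED_DIM, NUM_HEADS, FF_DIM = 64, 4, 128


class TokenAndPositionEmbedding(layers.Layer):
    """Sum of learned activity embeddings and learned positional embeddings."""

    def __init__(self, maxlen, vocab_size, embed_dim):
        super().__init__()
        self.token_emb = layers.Embedding(vocab_size, embed_dim)
        self.pos_emb = layers.Embedding(maxlen, embed_dim)

    def call(self, x):
        positions = tf.range(start=0, limit=tf.shape(x)[-1], delta=1)
        return self.token_emb(x) + self.pos_emb(positions)


def build_next_activity_model():
    inputs = layers.Input(shape=(MAX_PREFIX_LEN,), dtype="int32")
    x = TokenAndPositionEmbedding(MAX_PREFIX_LEN, NUM_ACTIVITIES, EMBED_DIM)(inputs)

    # One self-attention block: every event in the prefix attends to every other
    # event, which is how long-range dependencies across the trace are modeled.
    attn = layers.MultiHeadAttention(num_heads=NUM_HEADS, key_dim=EMBED_DIM)(x, x)
    x = layers.LayerNormalization(epsilon=1e-6)(x + attn)
    ffn = layers.Dense(FF_DIM, activation="relu")(x)
    ffn = layers.Dense(EMBED_DIM)(ffn)
    x = layers.LayerNormalization(epsilon=1e-6)(x + ffn)

    # Pool the sequence representation and classify the next activity.
    x = layers.GlobalAveragePooling1D()(x)
    outputs = layers.Dense(NUM_ACTIVITIES, activation="softmax")(x)

    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model


if __name__ == "__main__":
    build_next_activity_model().summary()
```

For the event-time and remaining-time tasks described above, the same encoder could be reused with a regression head (e.g. a linear output unit trained with a mean-absolute-error loss) instead of the softmax classifier.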