The neural Hawkes process (Mei & Eisner, 2017) is a generative model of irregularly spaced sequences of discrete events. To handle complex domains with many event types, Mei et al. (2020a) further consider a setting in which each event in the sequence updates a deductive database of facts (via domain-specific pattern-matching rules); future events are then conditioned on the database contents. They show how to convert such a symbolic system into a neuro-symbolic continuous-time generative model, in which each database fact and possible event has a time-varying embedding that is derived from its symbolic provenance. In this paper, we modify both models, replacing their recurrent LSTM-based architectures with flatter attention-based architectures (Vaswani et al., 2017), which are simpler and more parallelizable. This does not appear to hurt our accuracy, which is comparable to or better than that of the original models as well as (where applicable) previous attention-based methods (Zuo et al., 2020; Zhang et al., 2020a).
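To make the architectural idea concrete, the following is a minimal sketch (not the paper's exact model) of how an attention-based continuous-time event model can produce a time-varying intensity: past events are embedded, combined with a sinusoidal encoding of their timestamps, and attended over by a query built from the time at which we wish to evaluate the intensity. All weight names (`W_q`, `W_k`, `W_v`, `w_out`) and the single-head, single-layer structure are illustrative assumptions.

```python
import numpy as np

def softplus(x):
    # maps any real number to a positive value, so the intensity is valid
    return np.log1p(np.exp(x))

def time_encoding(t, d):
    # sinusoidal encoding of a continuous timestamp, in the style of Vaswani et al. (2017)
    freqs = 1.0 / (10000 ** (np.arange(0, d, 2) / d))
    enc = np.zeros(d)
    enc[0::2] = np.sin(t * freqs)
    enc[1::2] = np.cos(t * freqs)
    return enc

def intensity(query_time, event_times, event_embs, W_q, W_k, W_v, w_out):
    """Intensity lambda(t): attend over strictly past events, squash with softplus.

    event_embs: (n, d) array of learned embeddings for the n observed events.
    W_q, W_k, W_v: (d, d) projection matrices; w_out: (d,) readout vector.
    """
    d = event_embs.shape[1]
    mask = event_times < query_time                 # causal: only past events are visible
    if not mask.any():
        return softplus(0.0)                        # assumed base rate before any events
    # keys/values: event embeddings plus encodings of their occurrence times
    hist = event_embs[mask] + np.stack([time_encoding(s, d) for s in event_times[mask]])
    q = W_q @ time_encoding(query_time, d)          # query depends on the evaluation time t
    scores = (hist @ W_k.T) @ q / np.sqrt(d)
    attn = np.exp(scores - scores.max())
    attn /= attn.sum()                              # softmax over past events
    h = attn @ (hist @ W_v.T)                       # time-varying summary of the history
    return softplus(w_out @ h)                      # positive intensity lambda(t)
```

Because the query is rebuilt from scratch at every evaluation time, intensities at many times (or many training sequences) can be computed in parallel, which is the "flatter, more parallelizable" property the abstract contrasts with a sequential LSTM.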