Most research on pseudo relevance feedback (PRF) has been done in vector space and probabilistic retrieval models. This paper shows that Transformer-based rerankers can also benefit from the extra context that PRF provides. It presents PGT, a graph-based Transformer that sparsifies attention between graph nodes to enable PRF while avoiding the high computational complexity of most Transformer architectures. Experiments show that PGT improves upon a non-PRF Transformer reranker, and that it is at least as accurate as Transformer PRF models that use full attention, but with lower computational costs.
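To make the idea of sparsified attention over graph nodes concrete, the following is a minimal illustrative sketch, not PGT's actual design: it assumes a toy node layout with one query node, one candidate-document node, and k PRF feedback-document nodes, and builds a block-sparse attention mask in which feedback nodes attend to the query and the candidate but not to each other. The node grouping, masking pattern, and function name are illustrative assumptions.

```python
# Illustrative sketch (not PGT's exact design): a block-sparse attention mask
# over graph nodes -- one query node, one candidate-document node, and k
# feedback-document nodes obtained from PRF. Feedback nodes attend to the
# query and the candidate but not to each other, so the number of attention
# links grows linearly in k instead of quadratically.
import numpy as np

def sparse_prf_mask(num_feedback: int) -> np.ndarray:
    """Return a boolean mask; True = attention allowed between two nodes.

    Assumed node layout for this sketch:
      index 0          -> query node
      index 1          -> candidate-document node
      indices 2..k+1   -> PRF feedback-document nodes
    """
    n = 2 + num_feedback
    mask = np.zeros((n, n), dtype=bool)
    mask[0, :] = True            # query attends to every node
    mask[:, 0] = True            # every node attends to the query
    mask[1, :] = True            # candidate attends to every node
    mask[:, 1] = True            # every node attends to the candidate
    np.fill_diagonal(mask, True) # self-attention is always allowed
    return mask

if __name__ == "__main__":
    m = sparse_prf_mask(num_feedback=3)
    print(m.astype(int))
    # Full attention over n nodes needs n*n links; this mask keeps only the
    # query/candidate rows and columns plus the diagonal.
    print("allowed links:", int(m.sum()), "of", m.size)
```

In a full model, such a mask would be applied inside each attention layer so that masked node pairs receive zero attention weight, which is where the computational savings over full attention come from.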