PGT: Pseudo Relevance Feedback Using a Graph-Based Transformer
arXiv - CS - Information Retrieval. Pub Date: 2021-01-20, DOI: arxiv-2101.07918
HongChien Yu, Zhuyun Dai, Jamie Callan

Most research on pseudo relevance feedback (PRF) has been done in vector space and probabilistic retrieval models. This paper shows that Transformer-based rerankers can also benefit from the extra context that PRF provides. It presents PGT, a graph-based Transformer that sparsifies attention between graph nodes to enable PRF while avoiding the high computational complexity of most Transformer architectures. Experiments show that PGT improves upon a non-PRF Transformer reranker, and that it is at least as accurate as Transformer PRF models that use full attention, but at lower computational cost.
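The abstract's key idea is attention sparsified over a graph rather than computed over all node pairs. As a rough illustration only, and not the paper's actual architecture, the NumPy sketch below restricts scaled dot-product attention to graph edges via an adjacency mask; the topology, in which node 0 stands for the query-candidate pair and nodes 1-3 for pseudo-relevance-feedback documents, is a hypothetical assumption for the example.

```python
# Minimal sketch of graph-sparsified attention: each node attends only to
# its neighbors in an adjacency mask, instead of full attention over all
# node pairs. Illustrative only; not the PGT implementation.
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def graph_sparse_attention(Q, K, V, adj):
    """Scaled dot-product attention restricted to graph edges.

    Q, K, V: (n_nodes, d) arrays of node representations.
    adj:     (n_nodes, n_nodes) boolean adjacency matrix; True = edge.
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)          # (n, n) pairwise scores
    scores = np.where(adj, scores, -1e9)   # mask out non-neighbors
    return softmax(scores, axis=-1) @ V    # weighted sum over neighbors only

# Toy example: node 0 is the query-candidate pair; nodes 1-3 are feedback
# documents linked only to it (a hypothetical star-shaped graph).
rng = np.random.default_rng(0)
n, d = 4, 8
Q = K = V = rng.standard_normal((n, d))
adj = np.array([[1, 1, 1, 1],
                [1, 1, 0, 0],
                [1, 0, 1, 0],
                [1, 0, 0, 1]], dtype=bool)
out = graph_sparse_attention(Q, K, V, adj)
print(out.shape)  # (4, 8)
```

Because each node only attends to its neighbors, the cost scales with the number of edges rather than quadratically with the number of nodes, which is the efficiency argument the abstract makes for PGT.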

Updated: 2021-01-21