BERT-QE: Contextualized Query Expansion for Document Re-ranking
arXiv - CS - Artificial Intelligence. Pub Date: 2020-09-15, identifier: arXiv:2009.07258
Zhi Zheng, Kai Hui, Ben He, Xianpei Han, Le Sun, Andrew Yates

Query expansion aims to mitigate the mismatch between the language used in a query and in a document. However, query expansion methods can introduce non-relevant information when expanding the query. To address this problem, and inspired by recent advances in applying contextualized models such as BERT to document retrieval, this paper proposes a novel query expansion model that leverages the strength of BERT to better select relevant information for expansion. In evaluations on the standard TREC Robust04 and GOV2 test collections, the proposed BERT-QE model significantly outperforms BERT-Large models commonly used for document retrieval.
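The abstract's core idea — using a contextualized relevance model to pick expansion evidence from feedback documents rather than expanding with raw terms — can be sketched as a three-phase pipeline. The sketch below is a minimal illustration, not the authors' implementation: the `rel` scorer is a hypothetical word-overlap stand-in for the BERT relevance model the paper uses, and the chunk size, interpolation weight `alpha`, and function names are all assumptions for illustration.

```python
# Sketch of the BERT-QE-style re-ranking pipeline (illustrative only):
# 1) rank documents against the query; 2) split top feedback documents
# into chunks and keep the chunks most relevant to the query;
# 3) re-score every document by interpolating its direct query score
# with its relevance to those selected chunks.

def rel(a: str, b: str) -> float:
    """Placeholder relevance scorer (word-overlap Jaccard); the paper
    would use a fine-tuned BERT model here instead."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / max(len(wa | wb), 1)

def chunks(text: str, size: int = 5) -> list[str]:
    """Split a document into fixed-size word chunks (size is an assumption)."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def bert_qe_rerank(query: str, docs: list[str],
                   k_feedback: int = 2, m_chunks: int = 3,
                   alpha: float = 0.5) -> list[str]:
    # Phase 1: initial ranking by direct query-document relevance.
    scored = sorted(docs, key=lambda d: rel(query, d), reverse=True)
    # Phase 2: gather chunks from the top-k feedback documents and keep
    # the m chunks most relevant to the query (the "expansion" evidence).
    cand = [c for d in scored[:k_feedback] for c in chunks(d)]
    top_chunks = sorted(cand, key=lambda c: rel(query, c), reverse=True)[:m_chunks]
    # Phase 3: final score interpolates the direct query score with
    # chunk evidence, weighting each chunk by its relevance to the query.
    def final(d: str) -> float:
        weights = [rel(query, c) for c in top_chunks]
        z = sum(weights) or 1.0
        expansion = sum(w * rel(c, d) for w, c in zip(weights, top_chunks)) / z
        return alpha * rel(query, d) + (1 - alpha) * expansion
    return sorted(docs, key=final, reverse=True)
```

The key design point mirrored here is that expansion evidence is selected and weighted by a relevance model rather than appended blindly, which is how BERT-QE avoids injecting non-relevant information into the expanded query.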

Updated: 2020-09-16