Assessing the Benefits of Model Ensembles in Neural Re-Ranking for Passage Retrieval
arXiv - CS - Information Retrieval. Pub Date: 2021-01-21, DOI: arxiv-2101.08705
Luís Borges, Bruno Martins, Jamie Callan

Our work aimed at experimentally assessing the benefits of model ensembling within the context of neural methods for passage re-ranking. Starting from relatively standard neural models, we use a previously proposed technique named Fast Geometric Ensembling to generate multiple model instances from particular training schedules, focusing our attention on different types of approaches for combining the results from the multiple model instances (e.g., averaging the ranking scores, using fusion methods from the IR literature, or using supervised learning-to-rank). Tests with the MS-MARCO dataset show that model ensembling can indeed benefit the ranking quality, particularly with supervised learning-to-rank, although also with unsupervised rank aggregation.
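To illustrate the unsupervised combination strategies mentioned in the abstract, the sketch below fuses rankings produced by several model instances through simple score averaging and through reciprocal rank fusion, a standard fusion method from the IR literature. The data structures and names are hypothetical, chosen only to make the idea concrete; they are not taken from the paper's implementation.

```python
from collections import defaultdict
from typing import Dict, List

# Illustrative type: each model instance maps a passage id to a relevance
# score for one query. This representation is an assumption for the sketch.
Run = Dict[str, float]


def average_scores(runs: List[Run]) -> Run:
    """Unsupervised fusion by averaging the ranking scores of all instances."""
    totals: Dict[str, float] = defaultdict(float)
    for run in runs:
        for passage_id, score in run.items():
            totals[passage_id] += score
    return {pid: total / len(runs) for pid, total in totals.items()}


def reciprocal_rank_fusion(runs: List[Run], k: int = 60) -> Run:
    """Rank-based fusion (RRF): each instance contributes 1 / (k + rank)."""
    fused: Dict[str, float] = defaultdict(float)
    for run in runs:
        ranked = sorted(run, key=run.get, reverse=True)
        for rank, passage_id in enumerate(ranked, start=1):
            fused[passage_id] += 1.0 / (k + rank)
    return dict(fused)


# Toy usage: three "model instances" scoring four candidate passages.
runs = [
    {"p1": 0.9, "p2": 0.7, "p3": 0.2, "p4": 0.1},
    {"p1": 0.6, "p2": 0.8, "p3": 0.3, "p4": 0.2},
    {"p1": 0.8, "p2": 0.5, "p3": 0.4, "p4": 0.1},
]
print(sorted(average_scores(runs).items(), key=lambda x: -x[1]))
print(sorted(reciprocal_rank_fusion(runs).items(), key=lambda x: -x[1]))
```

The supervised learning-to-rank alternative mentioned in the abstract would instead train a model on the per-instance scores as features, rather than combining them with a fixed rule as above.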

Updated: 2021-01-22