Open-Domain Conversational Search Assistant with Transformers
arXiv - CS - Information Retrieval Pub Date : 2021-01-20 , DOI: arxiv-2101.08197
Rafael Ferreira, Mariana Leite, David Semedo, Joao Magalhaes

Open-domain conversational search assistants aim to answer user questions about open topics in a conversational manner. In this paper we show how the Transformer architecture achieves state-of-the-art results in key IR tasks, enabling conversational assistants that engage in open-domain conversational search with single, yet informative, answers. In particular, we propose an open-domain abstractive conversational search agent pipeline to address two major challenges: first, conversation context-aware search and, second, abstractive search-answer generation. To address the first challenge, the conversation context is modeled with a query rewriting method that unfolds the context of the conversation up to a specific moment to search for the correct answers. These answers are then passed to a Transformer-based re-ranker to further improve retrieval performance. The second challenge is tackled with recent abstractive Transformer architectures that generate a digest of the most relevant passages. Experiments show that Transformers deliver solid performance across all conversational search tasks, outperforming the best TREC CAsT 2019 baseline.
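The three-stage pipeline described in the abstract (context-aware query rewriting, Transformer-based re-ranking, abstractive answer generation) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the actual Transformer components are replaced by trivial stand-ins (context concatenation, term-overlap scoring, top-k concatenation) so the control flow is runnable without model downloads, and all function names and the toy data are hypothetical.

```python
import re

def tokenize(text):
    """Lowercase word tokens, punctuation stripped."""
    return re.findall(r"\w+", text.lower())

def rewrite_query(history, current_turn):
    """Stand-in for context-aware query rewriting: fold prior turns
    into the current question so it is self-contained."""
    return " ".join(history + [current_turn])

def rerank(query, passages):
    """Stand-in for a Transformer cross-encoder re-ranker: score each
    passage by term overlap with the rewritten query, best first."""
    q_terms = set(tokenize(query))
    return sorted(passages,
                  key=lambda p: len(q_terms & set(tokenize(p))),
                  reverse=True)

def generate_answer(ranked_passages, k=2):
    """Stand-in for abstractive generation: digest the top-k
    passages into a single answer string."""
    return " ".join(ranked_passages[:k])

# Hypothetical conversation and passage pool.
history = ["Tell me about the Transformer architecture"]
turn = "How is it used in search?"
passages = [
    "Transformers are used in search for re-ranking.",
    "The weather today is sunny.",
    "Bananas are yellow fruit.",
]

query = rewrite_query(history, turn)
ranked = rerank(query, passages)
answer = generate_answer(ranked, k=1)
print(answer)  # → Transformers are used in search for re-ranking.
```

In the paper's setting, the rewriter, re-ranker, and generator would each be a trained Transformer model; the sketch only shows how their outputs chain together into a single informative answer.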

Updated: 2021-01-21