Hierarchical neural query suggestion with an attention mechanism
Information Processing & Management (IF 7.4), Pub Date: 2019-05-18, DOI: 10.1016/j.ipm.2019.05.001
Wanyu Chen, Fei Cai, Honghui Chen, Maarten de Rijke

Query suggestions help users of a search engine refine their queries. Previous work on query suggestion has mainly focused on incorporating directly observable features such as query co-occurrence and semantic similarity. The structure of such features is often set manually, and as a result hidden dependencies between queries and users may be ignored. We propose an Attention-based Hierarchical Neural Query Suggestion (AHNQS) model that uses an attention mechanism to automatically capture user preferences. AHNQS combines a session-level neural network and a user-level neural network into a hierarchical structure to model a user's short- and long-term search history. We quantify the improvements of AHNQS over state-of-the-art recurrent neural network-based query suggestion baselines on the AOL query log dataset, with gains of up to 9.66% and 12.51% in Recall@10 and MRR@10, respectively; the improvements are especially pronounced for short sessions and for inactive users with few search sessions.
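To make the described architecture concrete, below is a minimal PyTorch sketch of a hierarchical session-level/user-level encoder with attention, in the spirit of what the abstract describes. It is an illustrative assumption, not the authors' AHNQS implementation: the class name HierarchicalQuerySuggester, the GRU choice, the dimensions, the attention scoring layer, and the softmax-over-vocabulary output head are all hypothetical.

```python
# Hedged sketch: session-level RNN over the current session's queries,
# user-level RNN over past session representations, and an attention
# layer that weights in-session query states by the long-term user state.
import torch
import torch.nn as nn


class HierarchicalQuerySuggester(nn.Module):
    def __init__(self, vocab_size, embed_dim=64, hidden_dim=128):
        super().__init__()
        self.query_embedding = nn.Embedding(vocab_size, embed_dim)
        # Session-level RNN: short-term history (queries within one session).
        self.session_rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        # User-level RNN: long-term history (sequence of past session vectors).
        self.user_rnn = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        # Attention: scores each in-session state against the user state.
        self.attn = nn.Linear(2 * hidden_dim, 1)
        # Scoring head over candidate next queries (hypothetical choice).
        self.out = nn.Linear(2 * hidden_dim, vocab_size)

    def forward(self, session_queries, past_session_states):
        """
        session_queries:     (batch, session_len) ids of the current session's queries
        past_session_states: (batch, n_sessions, hidden_dim) encoded past sessions
        returns:             (batch, vocab_size) scores for the next query
        """
        emb = self.query_embedding(session_queries)            # (B, L, E)
        session_states, _ = self.session_rnn(emb)              # (B, L, H)
        _, user_state = self.user_rnn(past_session_states)     # (1, B, H)
        user_state = user_state.squeeze(0)                     # (B, H)

        # Attention over the current session, conditioned on the user state.
        L = session_states.size(1)
        expanded_user = user_state.unsqueeze(1).expand(-1, L, -1)               # (B, L, H)
        scores = self.attn(torch.cat([session_states, expanded_user], dim=-1))  # (B, L, 1)
        weights = torch.softmax(scores, dim=1)                                  # (B, L, 1)
        session_summary = (weights * session_states).sum(dim=1)                 # (B, H)

        # Combine short-term (attended session) and long-term (user) signals.
        return self.out(torch.cat([session_summary, user_state], dim=-1))


# Toy usage with random data.
model = HierarchicalQuerySuggester(vocab_size=1000)
current_session = torch.randint(0, 1000, (2, 5))   # 2 users, 5 queries each
past_sessions = torch.randn(2, 3, 128)             # 3 earlier session vectors per user
next_query_scores = model(current_session, past_sessions)
print(next_query_scores.shape)                     # torch.Size([2, 1000])
```

In this sketch the attention weights let frequently relevant in-session queries dominate the session summary, while the user-level state carries preferences across sessions; ranking candidate suggestions by the output scores would then favor both signals.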



