Asymmetrical Hierarchical Networks with Attentive Interactions for Interpretable Review-Based Recommendation
arXiv - CS - Information Retrieval. Pub Date: 2019-12-18, DOI: arxiv-2001.04346
Xin Dong, Jingchao Ni, Wei Cheng, Zhengzhang Chen, Bo Zong, Dongjin Song, Yanchi Liu, Haifeng Chen, Gerard de Melo

Recently, recommender systems have been able to emit substantially improved recommendations by leveraging user-provided reviews. Existing methods typically merge all reviews of a given user or item into a long document, and then process user and item documents in the same manner. In practice, however, these two sets of reviews are notably different: users' reviews reflect a variety of items that they have bought and are hence very heterogeneous in their topics, while an item's reviews pertain only to that single item and are thus topically homogeneous. In this work, we develop a novel neural network model that properly accounts for this important difference by means of asymmetric attentive modules. The user module learns to attend to only those signals that are relevant with respect to the target item, whereas the item module learns to extract the most salient contents with regard to properties of the item. Our multi-hierarchical paradigm accounts for the fact that neither are all reviews equally useful, nor are all sentences within each review equally pertinent. Extensive experimental results on a variety of real datasets demonstrate the effectiveness of our method.
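The asymmetry described above can be illustrated with a small sketch: both sides use two levels of attention (over sentences within a review, then over reviews), but the user side is scored against the target item's embedding while the item side uses a learned global query. This is a simplified toy illustration, not the paper's exact architecture; all names, dimensions, and the dot-product scorer are assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attend(values, query):
    """Attention pooling: weight rows of `values` (n, d) by their
    dot-product score against `query` (d,), then sum."""
    weights = softmax(values @ query)
    return weights @ values, weights

def hierarchical_repr(reviews, query):
    """Two-level pooling: sentence-level attention inside each review,
    then review-level attention over the pooled review vectors."""
    review_vecs = np.stack([attend(r, query)[0] for r in reviews])
    doc_vec, review_weights = attend(review_vecs, query)
    return doc_vec, review_weights

# Toy data: 3 user reviews / 5 item reviews, 4 sentences each, dim 8.
rng = np.random.default_rng(0)
d = 8
user_reviews = rng.normal(size=(3, 4, d))  # sentence embeddings per review
item_reviews = rng.normal(size=(5, 4, d))
target_item = rng.normal(size=d)           # target item embedding
item_query = rng.normal(size=d)            # learned query (here random)

# Asymmetry: the user module attends conditioned on the target item,
# so only item-relevant signals in the heterogeneous user reviews survive;
# the item module extracts salient content via its own learned query.
user_vec, user_w = hierarchical_repr(user_reviews, target_item)
item_vec, item_w = hierarchical_repr(item_reviews, item_query)
score = float(user_vec @ item_vec)  # toy rating predictor
```

The review-level weights (`user_w`, `item_w`) are what make such a model interpretable: they indicate which reviews drove the prediction.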

Updated: 2020-01-14