Contrastive Learning for Cold-Start Recommendation
arXiv - CS - Information Retrieval Pub Date : 2021-07-12 , DOI: arxiv-2107.05315
Yinwei Wei, Xiang Wang, Qi Li, Liqiang Nie, Yan Li, Xuanping Li, Tat-Seng Chua

Recommending cold-start items is a long-standing and fundamental challenge in recommender systems. Without any historical interactions on cold-start items, the collaborative filtering (CF) scheme cannot use collaborative signals to infer user preferences for these items. To solve this problem, extensive studies have incorporated side information into the CF scheme. Specifically, they employ modern neural network techniques (e.g., dropout, consistency constraints) to discover and exploit the coalition effect of content features and collaborative representations. However, we argue that these works insufficiently explore the mutual dependencies between content features and collaborative representations and lack sufficient theoretical support, resulting in unsatisfactory performance. In this work, we reformulate cold-start item representation learning from an information-theoretic standpoint: it aims to maximize the mutual dependencies between item content and collaborative signals. Specifically, the representation learning is theoretically lower-bounded by the combination of two terms: the mutual information between the collaborative embeddings of users and items, and the mutual information between the collaborative embeddings and the feature representations of items. To model such a learning process, we devise a new objective function founded upon contrastive learning and develop a simple yet effective Contrastive Learning-based Cold-start Recommendation framework (CLCRec). In particular, CLCRec consists of three components: contrastive pair organization, contrastive embedding, and contrastive optimization modules. It allows us to preserve collaborative signals in the content representations of both warm and cold-start items. Through extensive experiments on four publicly accessible datasets, we observe that CLCRec achieves significant improvements over state-of-the-art approaches in both warm- and cold-start scenarios.
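The contrastive objective described above can be illustrated with an InfoNCE-style loss, a standard contrastive lower-bound estimator of mutual information. This is a minimal sketch under assumed notation, not the paper's actual implementation: the names `info_nce_loss`, `content`, `collab`, and `temperature`, and the use of in-batch negatives, are illustrative assumptions.

```python
import numpy as np

def info_nce_loss(content, collab, temperature=0.1):
    """InfoNCE-style contrastive loss between item content representations
    and collaborative embeddings (rows are matched item pairs)."""
    # L2-normalize both views so similarities are cosine similarities.
    c = content / np.linalg.norm(content, axis=1, keepdims=True)
    v = collab / np.linalg.norm(collab, axis=1, keepdims=True)
    # Pairwise similarity matrix: entry (i, j) compares item i's content
    # with item j's collaborative embedding.
    logits = (c @ v.T) / temperature
    # Positives sit on the diagonal; other items in the batch act as negatives.
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Minimizing this loss maximizes a lower bound on the mutual information
    # between the two representations.
    return -np.mean(np.diag(log_probs))

# Aligned pairs should yield a lower loss than mismatched ones.
rng = np.random.default_rng(0)
collab = rng.normal(size=(8, 16))
aligned = info_nce_loss(collab + 0.01 * rng.normal(size=(8, 16)), collab)
shuffled = info_nce_loss(rng.normal(size=(8, 16)), collab)
```

Because the loss rewards high diagonal similarity relative to in-batch negatives, content representations that track the collaborative embeddings score strictly lower than random ones, which is the sense in which collaborative signals are preserved in the content space.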

Updated: 2021-07-13