Augmenting Sequential Recommendation with Pseudo-Prior Items via Reversely Pre-training Transformer
arXiv - CS - Information Retrieval. Pub Date: 2021-05-02, DOI: arxiv-2105.00522
Zhiwei Liu, Ziwei Fan, Yu Wang, Philip S. Yu

Sequential recommendation characterizes evolving user preferences by modeling item sequences in chronological order; its essential goal is to capture item transition correlations. Recent advances in transformers have inspired the community to design effective sequence encoders, e.g., SASRec and BERT4Rec. However, we observe that these transformer-based models suffer from a cold-start issue, i.e., they perform poorly on short sequences. We therefore propose to augment short sequences while still preserving their original sequential correlations. We introduce a new framework for Augmenting Sequential Recommendation with Pseudo-prior items (ASReP). We first pre-train a transformer on sequences in the reverse direction to predict prior items. Then, we use this transformer to generate fabricated historical items prepended to the beginning of short sequences. Finally, we fine-tune the transformer on these augmented sequences in chronological order to predict the next item. Experiments on two real-world datasets verify the effectiveness of ASReP. The code is available at https://github.com/DyGRec/ASReP.
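To make the three-stage procedure concrete, here is a minimal, self-contained Python sketch of the augmentation idea. The transformer is replaced by a toy next-item model so the snippet runs as-is; every name in it (ToyNextItemModel, augment_with_pseudo_priors, target_len) is an illustrative assumption rather than the authors' API, and the linked repository holds the actual implementation.

```python
# Minimal sketch of the ASReP augmentation pipeline (illustrative only).
# A toy bigram "model" stands in for the transformer so the file runs as-is;
# all names here are hypothetical, not the authors' actual API.
# See https://github.com/DyGRec/ASReP for the real code.

import random
from collections import Counter, defaultdict
from typing import Dict, List


class ToyNextItemModel:
    """Stand-in for the transformer: predicts the most frequent successor."""

    def __init__(self) -> None:
        self.successors: Dict[int, Counter] = defaultdict(Counter)

    def fit(self, sequences: List[List[int]]) -> None:
        # Count item-to-item transitions observed in the training sequences.
        for seq in sequences:
            for a, b in zip(seq, seq[1:]):
                self.successors[a][b] += 1

    def predict_next(self, seq: List[int]) -> int:
        last = seq[-1]
        if self.successors[last]:
            return self.successors[last].most_common(1)[0][0]
        return random.choice(list(self.successors))  # cold fallback


def augment_with_pseudo_priors(
    seq: List[int], reverse_model: ToyNextItemModel, target_len: int
) -> List[int]:
    """Prepend pseudo-prior items until the sequence reaches target_len.

    The model was trained on REVERSED sequences, so its "next item" for a
    reversed prefix is a plausible item that came BEFORE the sequence.
    """
    seq = list(seq)
    while len(seq) < target_len:
        pseudo_prior = reverse_model.predict_next(list(reversed(seq)))
        seq.insert(0, pseudo_prior)
    return seq


if __name__ == "__main__":
    # Toy interaction data: item IDs per user, in chronological order.
    train_seqs = [[1, 2, 3, 4, 5], [2, 3, 4, 6], [1, 2, 3, 6], [7, 2, 3]]

    # Stage 1: pre-train on reversed sequences to learn prior-item transitions.
    reverse_model = ToyNextItemModel()
    reverse_model.fit([list(reversed(s)) for s in train_seqs])

    # Stage 2: augment a short (cold-start) sequence with fabricated history.
    short_seq = [3, 4]
    augmented = augment_with_pseudo_priors(short_seq, reverse_model, target_len=5)
    print("augmented:", augmented)  # pseudo-priors prepended before [3, 4]

    # Stage 3 (not shown): fine-tune the same transformer left-to-right on
    # the augmented sequences to predict the next item, as in the paper.
```

Note that, per the abstract, the same transformer is used in both directions: it is pre-trained on reversed sequences to generate the pseudo-prior items, then fine-tuned on the augmented sequences in chronological order.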

Updated: 2021-05-04