Top Comment or Flop Comment? Predicting and Explaining User Engagement in Online News Discussions
arXiv - CS - Information Retrieval. Pub Date: 2020-03-26, DOI: arxiv-2003.11949
Julian Risch, Ralf Krestel

Comment sections below online news articles enjoy growing popularity among readers. However, the overwhelming number of comments makes it infeasible for the average news consumer to read all of them and hinders engaging discussions. Most platforms display comments in chronological order, which neglects that some of them are more relevant to users and are better conversation starters. In this paper, we systematically analyze user engagement in the form of the upvotes and replies that a comment receives. Based on comment texts, we train a model to distinguish comments that have either a high or low chance of receiving many upvotes and replies. Our evaluation on user comments from TheGuardian.com compares recurrent and convolutional neural network models, and a traditional feature-based classifier. Further, we investigate what makes some comments more engaging than others. To this end, we identify engagement triggers and arrange them in a taxonomy. Explanation methods for neural networks reveal which input words have the strongest influence on our model's predictions. In addition, we evaluate on a dataset of product reviews, which exhibit properties similar to user comments, such as featuring upvotes for helpfulness.
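
The abstract gives no implementation details, so the following is only an illustrative sketch of the kind of convolutional text classifier compared in the paper: comments are tokenized and a binary "top vs. flop" label (derived from upvotes and replies) is predicted from the text. The vocabulary size, sequence length, layer sizes, and dummy training data below are assumptions, not the authors' configuration.

# Hypothetical sketch of a convolutional text classifier for the
# top-vs-flop comment task; all hyperparameters are illustrative assumptions.
import numpy as np
from tensorflow.keras import layers, models

VOCAB_SIZE = 20000   # assumed vocabulary size after tokenization
MAX_LEN = 200        # assumed maximum comment length in tokens

def build_cnn_classifier():
    # Embed word indices, extract local n-gram features with a 1D convolution,
    # pool over the comment, and output P(comment is a "top" comment).
    model = models.Sequential([
        layers.Embedding(VOCAB_SIZE, 128),
        layers.Conv1D(128, kernel_size=5, activation="relu"),
        layers.GlobalMaxPooling1D(),
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    # Dummy data standing in for tokenized comments and top/flop labels.
    x = np.random.randint(1, VOCAB_SIZE, size=(32, MAX_LEN))
    y = np.random.randint(0, 2, size=(32, 1))
    model = build_cnn_classifier()
    model.fit(x, y, epochs=1, batch_size=8)

A recurrent variant would replace the convolution and pooling layers with, for example, an LSTM layer; the feature-based baseline mentioned in the abstract would instead feed hand-crafted text features to a traditional classifier.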

Updated: 2020-03-27