Simulating the Effects of Social Presence on Trust, Privacy Concerns & Usage Intentions in Automated Bots for Finance
arXiv - CS - Human-Computer Interaction. Pub Date: 2020-06-27, DOI: arxiv-2006.15449
Magdalene Ng, Kovila P.L. Coopamootoo, Ehsan Toreini, Mhairi Aitken, Karen Elliot, Aad van Moorsel

FinBots are chatbots built on automated decision technology, aimed at facilitating accessible banking and supporting customers in making financial decisions. Chatbots are increasing in prevalence, sometimes even equipped to mimic human social rules, expectations and norms, decreasing the necessity for human-to-human interaction. As banks and financial advisory platforms move towards creating bots that enhance the current state of consumer trust and adoption rates, we investigated the effects of chatbot vignettes with and without socio-emotional features on intention to use the chatbot for financial support purposes. We conducted a between-subjects online experiment with N = 410 participants. Participants in the control group were provided with a vignette describing a secure and reliable chatbot called XRO23, whereas participants in the experimental group were presented with a vignette describing a secure and reliable chatbot that is more human-like and named Emma. We found that Vignette Emma neither increased participants' trust levels nor lowered their privacy concerns, even though it increased the perception of social presence. However, we found that intention to use the presented chatbot for financial support was positively influenced by perceived humanness and trust in the bot. Participants were also more willing to share financially sensitive information such as account number, sort code and payment information with XRO23 than with Emma, revealing a preference for a technical and mechanical FinBot in information sharing. Overall, this research contributes to our understanding of the intention to use chatbots with different features as financial technology, in particular that socio-emotional support may not be favoured when designed independently of financial function.
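The abstract describes a two-condition between-subjects design with trust, privacy concern, social presence and usage-intention measures. A minimal analysis sketch of such a design is shown below; it assumes a hypothetical dataset file (finbot_vignette_study.csv) and hypothetical column names (group, trust, humanness, intention), since the study's actual data and measures are not included here.

```python
# Sketch of a between-subjects comparison and regression for the design
# described in the abstract. All file and column names are assumptions,
# not the authors' actual dataset.
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

# One row per participant (N = 410); 'group' encodes the vignette condition
# ("XRO23" control vs. "Emma" experimental); other columns are scale composites.
df = pd.read_csv("finbot_vignette_study.csv")  # hypothetical file

control = df[df["group"] == "XRO23"]
emma = df[df["group"] == "Emma"]

# Between-group comparison of trust (Welch's t-test, unequal variances).
t_trust, p_trust = stats.ttest_ind(control["trust"], emma["trust"], equal_var=False)
print(f"Trust by condition: t = {t_trust:.2f}, p = {p_trust:.3f}")

# Regression of usage intention on perceived humanness and trust,
# mirroring the reported positive effects of both predictors.
model = smf.ols("intention ~ humanness + trust", data=df).fit()
print(model.summary())
```

The same pattern would extend to the other reported outcomes (privacy concerns, social presence, willingness to share account details) by swapping the dependent variable.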

Updated: 2020-07-06