Individual differences in expecting coherence relations: Exploring the variability in sensitivity to contextual signals in discourse
Discourse Processes ( IF 2.1 ) Pub Date : 2020-10-02 , DOI: 10.1080/0163853x.2020.1813492
Merel C. J. Scholman 1 , Vera Demberg 2 , Ted J. M. Sanders 3
ABSTRACT

The current study investigated how a contextual list signal influences comprehenders’ inference generation of upcoming discourse relations and whether individual differences in working memory capacity and linguistic experience influence the generation of these inferences. Participants were asked to complete two-sentence stories, the first sentence of which contained an expression of quantity (a few, multiple). Several individual-difference measures were calculated to explore whether individual characteristics can explain the sensitivity to the contextual list signal. The results revealed that participants were sensitive to a contextual list signal (i.e., they provided list continuations), and this sensitivity was modulated by the participants’ linguistic experience, as measured by an author recognition test. The results showed no evidence that working memory affected participants’ responses. These results extend prior research by showing that contextual signals influence participants’ coherence-relation-inference generation. Further, the results of the current study emphasize the importance of individual reader characteristics when it comes to coherence-relation inferences.



Updated: 2020-10-02