Editors' Review and Introduction: Learning Grammatical Structures: Developmental, Cross-Species, and Computational Approaches.
Topics in Cognitive Science (IF 3.265), Pub Date: 2020-03-05, DOI: 10.1111/tops.12493
Carel Ten Cate, Judit Gervain, Clara C. Levelt, Christopher I. Petkov, Willem Zuidema
Human languages all have a grammar, that is, rules that determine how symbols in a language can be combined to create complex meaningful expressions. Despite decades of research, the evolutionary, developmental, cognitive, and computational bases of grammatical abilities are still not fully understood. “Artificial Grammar Learning” (AGL) studies provide important insights into how rules and structured sequences are learned, the relevance of these processes to language in humans, and whether the cognitive systems involved are shared with other animals. AGL tasks can be used to study how human adults, infants, animals, or machines learn artificial grammars of various sorts, consisting of rules typically defined over syllables, sounds, or visual items. In this introduction, we distill some lessons from the nine other papers in this special issue, which review the advances made from this growing body of literature. We provide a critical synthesis, identify the questions that remain open, and recognize the challenges that lie ahead. A key observation across the disciplines is that the limits of human, animal, and machine capabilities have yet to be found. Thus, this interdisciplinary area of research firmly rooted in the cognitive sciences has unearthed exciting new questions and avenues for research, along the way fostering impactful collaborations between traditionally disconnected disciplines that are breaking scientific ground.

Updated: 2020-03-05