Patterns Bit by Bit. An Entropy Model for Rule Induction
Language Learning and Development (IF 1.480), Pub Date: 2019-12-11, DOI: 10.1080/15475441.2019.1695620
Silvia Radulescu, Frank Wijnen, Sergey Avrutin

ABSTRACT

From limited evidence, children track the regularities of their language impressively fast, and they infer generalized rules that apply to novel instances. This study investigated what drives the inductive leap from memorizing specific items and statistical regularities to extracting abstract rules. We propose an innovative entropy model that offers a single, consistent information-theoretic account of both learning the regularities in the input and generalizing to new input. The model predicts that rule induction is an encoding mechanism that arises gradually and automatically from the brain’s sensitivity to input complexity (entropy) interacting with the finite encoding power of the human brain (channel capacity). In two artificial-grammar experiments with adults, we probed the effect of input complexity on rule induction. Results showed that as the input becomes more complex, the tendency to infer abstract rules increases gradually.
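The input complexity the abstract refers to is Shannon entropy of the stimulus set. As a minimal illustrative sketch (not the authors' implementation — the item strings and the comparison are invented here for demonstration), the entropy of an artificial-grammar input grows as probability mass spreads over more distinct items, which on the model's account pushes the learner toward abstract rules:

```python
import math
from collections import Counter

def shannon_entropy(sequence):
    """Shannon entropy (in bits) of the empirical item distribution:
    H = -sum(p_i * log2(p_i)) over the distinct items observed."""
    counts = Counter(sequence)
    total = len(sequence)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical example syllables. A small vocabulary repeated often
# is low-entropy (easy to memorize item by item) ...
low = shannon_entropy(["wo", "fe", "wo", "fe", "wo", "fe"])   # 1.0 bit

# ... while the same number of tokens spread over many distinct
# syllables is higher-entropy (favoring abstraction, on this account).
high = shannon_entropy(["wo", "fe", "ka", "mi", "tu", "ro"])  # log2(6) ≈ 2.58 bits
```

On the model's prediction, once such entropy exceeds what the learner's finite channel capacity can encode verbatim, compressing the input into an abstract rule becomes the cheaper encoding.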


