Natural language instructions induce compositional generalization in networks of neurons
Nature Neuroscience ( IF 25.0 ) Pub Date : 2024-03-18 , DOI: 10.1038/s41593-024-01607-5
Reidar Riveland , Alexandre Pouget

A fundamental human cognitive feat is to interpret linguistic instructions in order to perform novel tasks without explicit task experience. Yet, the neural computations that might be used to accomplish this remain poorly understood. We use advances in natural language processing to create a neural model of generalization based on linguistic instructions. Models are trained on a set of common psychophysical tasks, and receive instructions embedded by a pretrained language model. Our best models can perform a previously unseen task with an average performance of 83% correct based solely on linguistic instructions (that is, zero-shot learning). We found that language scaffolds sensorimotor representations such that activity for interrelated tasks shares a common geometry with the semantic representations of instructions, allowing language to cue the proper composition of practiced skills in unseen settings. We show how this model generates a linguistic description of a novel task it has identified using only motor feedback, which can subsequently guide a partner model to perform the task. Our models offer several experimentally testable predictions outlining how linguistic information must be represented to facilitate flexible and general cognition in the human brain.
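The abstract describes instructions being embedded by a pretrained language model and fed to a sensorimotor network, so that language conditions which practiced skill is composed. A minimal toy sketch of that conditioning pattern is below; it is not the authors' model, and `embed_instruction` is a hypothetical stand-in for a real pretrained language model (a deterministic hash-seeded random vector here).

```python
import hashlib
import numpy as np

EMBED_DIM, INPUT_DIM, HIDDEN_DIM, OUTPUT_DIM = 8, 4, 16, 2
rng = np.random.default_rng(0)

def embed_instruction(text: str) -> np.ndarray:
    """Hypothetical stand-in for a pretrained language model:
    map an instruction string to a fixed embedding vector
    (deterministic hash-seeded noise, purely for illustration)."""
    seed = int.from_bytes(hashlib.sha256(text.encode()).digest()[:4], "big")
    return np.random.default_rng(seed).standard_normal(EMBED_DIM)

# A toy sensorimotor network: the instruction embedding is
# concatenated with sensory input, so the same weights produce
# different behavior under different instructions.
W1 = rng.standard_normal((HIDDEN_DIM, EMBED_DIM + INPUT_DIM)) * 0.1
W2 = rng.standard_normal((OUTPUT_DIM, HIDDEN_DIM)) * 0.1

def act(instruction: str, sensory_input: np.ndarray) -> np.ndarray:
    x = np.concatenate([embed_instruction(instruction), sensory_input])
    return W2 @ np.tanh(W1 @ x)

stimulus = rng.standard_normal(INPUT_DIM)
out_a = act("respond toward the stimulus", stimulus)
out_b = act("respond away from the stimulus", stimulus)
print(out_a.shape)  # (2,)
```

The key design point this illustrates is that the motor output depends jointly on the sensory input and the instruction embedding, which is what allows a novel instruction to cue novel behavior without weight changes.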




Updated: 2024-03-18