An insect-inspired model for acquiring views for homing.
Biological Cybernetics (IF 1.9), Pub Date: 2019-05-10, DOI: 10.1007/s00422-019-00800-1
Patrick Schulte, Jochen Zeil, Wolfgang Stürzl

Wasps and bees perform learning flights when leaving their nest or food locations for the first time, during which they acquire visual information that enables them to return successfully. Here we present and test a set of simple control rules underlying the execution of learning flights that closely mimic those performed by ground-nesting wasps. In the simplest model, we assume that the angle between the flight direction and the nest direction as seen from the position of the insect is constant and only flips sign when the pivoting direction around the nest changes, resulting in a concatenation of piecewise-defined logarithmic spirals. We then added characteristic properties of real learning flights, such as head saccades and the condition that the position of the nest entrance within the visual field is kept nearly constant, to describe the development of a learning flight in a head-centered frame of reference, assuming that the retinal position of the nest is known. We finally implemented a closed-loop simulation of learning flights based on a small set of visual control rules. The visual input for this model consists of rendered views generated from 3D reconstructions of natural wasp nesting sites, and the retinal nest position is controlled by means of simple template-based tracking. We show that naturalistic paths can be generated without knowledge of the absolute distance to the nest or of the flight speed. We demonstrate in addition that nest-tagged views recorded during such simulated learning flights are sufficient for a homing agent to pinpoint the goal by identifying the nest direction when encountering familiar views. We discuss how the information acquired during learning flights close to the nest can be integrated with long-range homing models.
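To make the simplest control rule concrete, below is a minimal Python sketch (not taken from the paper) that generates a path by keeping the angle between the heading and the direction to the nest fixed and flipping its sign between successive arcs, which yields a concatenation of logarithmic-spiral segments. The specific angle, step length, number of arcs, and arc duration are illustrative assumptions, not values from the study.

import numpy as np

def rotate(v, angle):
    # Rotate a 2D vector counterclockwise by `angle` (radians).
    c, s = np.cos(angle), np.sin(angle)
    return np.array([c * v[0] - s * v[1], s * v[0] + c * v[1]])

def learning_flight(alpha_deg=100.0, n_arcs=6, steps_per_arc=150,
                    step=0.01, start=(0.05, 0.0)):
    # Nest at the origin. At every step the heading keeps a fixed angle
    # alpha to the nest direction; the sign of alpha flips at each new arc
    # (change of pivoting direction), giving piecewise logarithmic spirals.
    pos = np.array(start, dtype=float)
    path = [pos.copy()]
    sign = 1.0
    for _ in range(n_arcs):
        for _ in range(steps_per_arc):
            nest_dir = -pos / np.linalg.norm(pos)    # unit vector from agent to nest
            heading = rotate(nest_dir, sign * np.radians(alpha_deg))
            pos = pos + step * heading
            path.append(pos.copy())
        sign = -sign                                 # reverse pivoting direction
    return np.array(path)

if __name__ == "__main__":
    p = learning_flight()
    print("distance to nest after flight: %.2f m" % np.linalg.norm(p[-1]))

With the angle set slightly above 90 degrees, each step has a small outward radial component, so successive arcs back the agent away from the nest along an expanding spiral, as described for the simplest model above.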

Updated: 2019-11-01