Wireless steerable vision for live insects and insect-scale robots
Science Robotics (IF 25.0), Pub Date: 2020-07-15, DOI: 10.1126/scirobotics.abb0839
Vikram Iyer, Ali Najafi, Johannes James, Sawyer Fuller, Shyamnath Gollakota

Vision serves as an essential sensory input for insects but consumes substantial energy resources. The cost to support sensitive photoreceptors has led many insects to develop high visual acuity in only small retinal regions and evolve to move their visual systems independent of their bodies through head motion. By understanding the trade-offs made by insect vision systems in nature, we can design better vision systems for insect-scale robotics in a way that balances energy, computation, and mass. Here, we report a fully wireless, power-autonomous, mechanically steerable vision system that imitates head motion in a form factor small enough to mount on the back of a live beetle or a similarly sized terrestrial robot. Our electronics and actuator weigh 248 milligrams and can steer the camera over 60° based on commands from a smartphone. The camera streams “first person” 160 pixels–by–120 pixels monochrome video at 1 to 5 frames per second (fps) to a Bluetooth radio up to 120 meters away. We mounted this vision system on two species of freely walking live beetles, demonstrating that triggering image capture using an onboard accelerometer achieves operational times of up to 6 hours with a 10–milliamp hour battery. We also built a small, terrestrial robot (1.6 centimeters by 2 centimeters) that can move at up to 3.5 centimeters per second, support vision, and operate for 63 to 260 minutes. Our results demonstrate that steerable vision can enable object tracking and wide-angle views at 26 to 84 times lower energy than moving the whole robot.
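The accelerometer-triggered capture described above is, in essence, a motion-gated duty-cycling scheme: the camera and radio are exercised only while the beetle or robot is actually moving, which is what stretches the 10–milliamp hour battery to hours of operation. Below is a minimal sketch of that idea; it is not the authors' firmware, and the thresholds, frame period, and the read_accel()/capture_frame()/ble_send() helpers are hypothetical placeholders chosen only to stay within the 1 to 5 fps range reported in the abstract.

# Minimal sketch (not the authors' firmware) of accelerometer-gated capture:
# capture and stream frames only while motion is detected, otherwise idle.
# All constants and the read_accel()/capture_frame()/ble_send() callables
# are hypothetical placeholders.

import math
import time

MOTION_THRESHOLD_G = 0.05   # assumed deviation from 1 g that counts as motion
IDLE_TIMEOUT_S = 2.0        # assumed quiet time before capture stops
FRAME_PERIOD_S = 0.5        # ~2 fps, within the reported 1 to 5 fps range

def is_moving(sample):
    # sample = (ax, ay, az) in g; motion if the magnitude deviates from gravity
    magnitude = math.sqrt(sample[0]**2 + sample[1]**2 + sample[2]**2)
    return abs(magnitude - 1.0) > MOTION_THRESHOLD_G

def capture_loop(read_accel, capture_frame, ble_send):
    last_motion = 0.0  # starts idle until the first motion event
    while True:
        if is_moving(read_accel()):
            last_motion = time.monotonic()
        if time.monotonic() - last_motion < IDLE_TIMEOUT_S:
            ble_send(capture_frame())   # stream a frame only while moving
            time.sleep(FRAME_PERIOD_S)
        else:
            time.sleep(0.1)             # otherwise just poll the accelerometer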



Updated: 2020-07-16