Event-Driven Visual-Tactile Sensing and Learning for Robots
arXiv - CS - Robotics, Pub Date: 2020-09-15, DOI: arxiv-2009.07083
Tasbolat Taunyazov, Weicong Sng, Hian Hian See, Brian Lim, Jethro Kuan, Abdul Fatir Ansari, Benjamin C.K. Tee, and Harold Soh

This work contributes an event-driven visual-tactile perception system, comprising a novel biologically inspired tactile sensor and multi-modal spike-based learning. Our neuromorphic fingertip tactile sensor, NeuTouch, scales well with the number of taxels thanks to its event-based nature. Likewise, our Visual-Tactile Spiking Neural Network (VT-SNN) enables fast perception when coupled with event sensors. We evaluate our visual-tactile system (using the NeuTouch and Prophesee event camera) on two robot tasks: container classification and rotational slip detection. On both tasks, we observe good accuracies relative to standard deep learning methods. We have made our visual-tactile datasets freely available to encourage research on multi-modal event-driven robot perception, which we believe is a promising approach towards intelligent, power-efficient robot systems.
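To make the fusion idea concrete, below is a minimal, hypothetical sketch of a two-branch spiking network that consumes tactile and visual spike trains and fuses them before a spiking output layer, written in PyTorch with the snnTorch library. This is not the authors' implementation or training setup; the input dimensions, hidden size, class count, simulation length, and the use of leaky integrate-and-fire neurons with spike-count readout are all illustrative assumptions.

```python
# Hypothetical sketch of a multi-modal (visual-tactile) spiking network.
# Not the paper's VT-SNN; dimensions and hyperparameters are assumptions.
import torch
import torch.nn as nn
import snntorch as snn


class VTSNNSketch(nn.Module):
    def __init__(self, tactile_dim=78, visual_dim=1024,
                 hidden=256, num_classes=20, beta=0.9):
        super().__init__()
        # One branch per modality: linear projection + leaky LIF neurons.
        self.fc_tac = nn.Linear(tactile_dim, hidden)
        self.lif_tac = snn.Leaky(beta=beta)
        self.fc_vis = nn.Linear(visual_dim, hidden)
        self.lif_vis = snn.Leaky(beta=beta)
        # Fusion: concatenate branch spikes, then a spiking output layer.
        self.fc_out = nn.Linear(2 * hidden, num_classes)
        self.lif_out = snn.Leaky(beta=beta)

    def forward(self, tac_spikes, vis_spikes):
        # tac_spikes: [T, batch, tactile_dim], vis_spikes: [T, batch, visual_dim]
        mem_t = self.lif_tac.init_leaky()
        mem_v = self.lif_vis.init_leaky()
        mem_o = self.lif_out.init_leaky()
        out_spikes = []
        for t in range(tac_spikes.shape[0]):
            s_t, mem_t = self.lif_tac(self.fc_tac(tac_spikes[t]), mem_t)
            s_v, mem_v = self.lif_vis(self.fc_vis(vis_spikes[t]), mem_v)
            fused = torch.cat([s_t, s_v], dim=1)
            s_o, mem_o = self.lif_out(self.fc_out(fused), mem_o)
            out_spikes.append(s_o)
        # Readout: total spike count per class neuron over the time window;
        # the predicted class is the neuron that fired most.
        return torch.stack(out_spikes).sum(dim=0)
```

Because both branches operate directly on asynchronous event streams (binned into T time steps here), the network can emit a usable spike-count prediction before the full observation window elapses, which is one route to the fast, power-efficient perception the abstract describes.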

Updated: 2020-09-16