Tactile Model O: Fabrication and Testing of a 3D-Printed, Three-Fingered Tactile Robot Hand
Soft Robotics (IF 7.9), Pub Date: 2021-10-13, DOI: 10.1089/soro.2020.0019
Jasper W. James, Alex Church, Luke Cramphorn, Nathan F. Lepora

Bringing tactile sensation to robotic hands will allow for more effective grasping, along with the wide range of benefits of human-like touch. Here, we present a three-dimensional-printed, three-fingered tactile robot hand comprising an OpenHand Model O customized to house a TacTip soft biomimetic tactile sensor in the distal phalanx of each finger. We expect that combining the grasping capabilities of this underactuated hand with sophisticated tactile sensing will result in an effective platform for robot hand research: the Tactile Model O (T-MO). The design uses three JeVois machine vision systems, each comprising a miniature camera in the tactile fingertip and a processing module in the base of the hand. To evaluate the capabilities of the T-MO, we benchmark its grasping performance using the Gripper Assessment Benchmark on the Yale-CMU-Berkeley object set. Tactile sensing capabilities are evaluated by performing tactile object classification on 26 objects and predicting whether a grasp will successfully lift each object. Results are consistent with the state of the art, taking advantage of advances in deep learning applied to tactile image outputs. Overall, this work demonstrates that the T-MO is an effective platform for robot hand research, and we expect it to open up a range of applications in autonomous object handling.
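The abstract describes applying deep learning to tactile image outputs for 26-way object classification. As a minimal sketch of that general idea (not the network used in the paper), the example below assumes single-channel 128x128 tactile images fed to a small convolutional classifier; the architecture, input size, and all hyperparameters are illustrative assumptions.

```python
# Illustrative sketch only: a small CNN that classifies tactile images into
# 26 object classes. Architecture and image size are assumptions for
# illustration and are not taken from the paper.
import torch
import torch.nn as nn

class TactileClassifier(nn.Module):
    def __init__(self, num_classes: int = 26):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 128), nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: batch of single-channel tactile images, shape (N, 1, 128, 128)
        return self.classifier(self.features(x))

# Dummy forward pass to show the expected input/output shapes.
model = TactileClassifier()
tactile_batch = torch.randn(4, 1, 128, 128)
logits = model(tactile_batch)        # shape (4, 26)
predicted_object = logits.argmax(dim=1)
```

A grasp-success predictor could follow the same pattern with a two-class (or single-logit) output head instead of 26 classes.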

Updated: 2021-10-19