Generation of Tactile Data from 3D Vision and Target Robotic Grasps
IEEE Transactions on Haptics (IF 2.4), Pub Date: 2020-01-01, DOI: 10.1109/toh.2020.3011899
Brayan S. Zapata-Impata, Pablo Gil, Youcef Mezouar, Fernando Torres

Tactile perception is a rich source of information for robotic grasping: it allows a robot to identify a grasped object and assess the stability of a grasp, among other things. However, a tactile sensor must come into contact with the target object in order to produce readings, so tactile data can only be obtained once real contact is made. We propose to overcome this restriction with a method that models the behaviour of a tactile sensor, using 3D vision and grasp information as its stimulus. Our system regresses the quantified tactile response that would be experienced if a given grasp were performed on the object. We experiment with 16 objects and 4 tactile data modalities to show that our proposal learns this task with low error.
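To make the regression setup concrete, the sketch below shows one plausible way to map a local 3D point cloud and a grasp pose to a predicted tactile reading. It is an illustration only, not the authors' architecture: the `TactileRegressor` name, the PointNet-style per-point encoder, the 7-D grasp pose (position plus quaternion), and the 24-taxel output are all assumptions made for this example.

```python
# Hypothetical sketch: regress a tactile response from 3D vision + grasp info.
# Architecture, dimensions, and sensor layout are assumed, not from the paper.
import torch
import torch.nn as nn

class TactileRegressor(nn.Module):
    """Maps (point cloud, grasp parameters) -> quantified tactile response."""

    def __init__(self, grasp_dim=7, num_taxels=24):
        super().__init__()
        # Shared per-point MLP followed by max-pooling (PointNet-style).
        self.point_encoder = nn.Sequential(
            nn.Linear(3, 64), nn.ReLU(),
            nn.Linear(64, 128), nn.ReLU(),
        )
        # Fuse the pooled shape feature with the grasp pose, regress taxel values.
        self.head = nn.Sequential(
            nn.Linear(128 + grasp_dim, 128), nn.ReLU(),
            nn.Linear(128, num_taxels),
        )

    def forward(self, points, grasp):
        # points: (B, N, 3) points near the planned contact; grasp: (B, grasp_dim).
        feat = self.point_encoder(points).max(dim=1).values  # (B, 128) shape code
        return self.head(torch.cat([feat, grasp], dim=1))    # (B, num_taxels)

# Usage with dummy data: the model predicts the tactile reading a grasp would
# produce, and a standard MSE regression loss trains it against real readings.
model = TactileRegressor()
points = torch.randn(4, 256, 3)   # batch of local point clouds from 3D vision
grasp = torch.randn(4, 7)         # e.g., gripper position + orientation
pred = model(points, grasp)       # predicted tactile readings, (4, 24)
loss = nn.functional.mse_loss(pred, torch.zeros_like(pred))
```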

Updated: 2020-01-01