Cross-modal learning for material perception using deep extreme learning machine
International Journal of Machine Learning and Cybernetics (IF 3.1). Pub Date: 2019-05-16. DOI: 10.1007/s13042-019-00962-1
Wendong Zheng , Huaping Liu , Bowen Wang , Fuchun Sun

The material properties of an object's surface are critical for robotic manipulation and for interaction with the surrounding environment. Tactile sensing can provide rich information about the material characteristics of an object's surface, so it is important to convey and interpret tactile information about material properties to users during interaction. In this paper, we propose a visual-tactile cross-modal retrieval framework that conveys tactile information about surface materials for perceptual estimation. In particular, we use the tactile information of a new, unknown surface material to retrieve a perceptually similar surface from an available set of visual surface samples. For this framework, we develop a deep cross-modal correlation learning method that combines the high-level nonlinear representation of a deep extreme learning machine with the class-paired correlation learning of cluster canonical correlation analysis. Experimental results on a publicly available dataset validate the effectiveness of the proposed framework and method.
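The core idea — nonlinear random-projection features from each modality (the extreme learning machine ingredient), then a learned correlated subspace shared by the two modalities (the canonical correlation analysis ingredient) — can be sketched in a minimal form. This is an illustration only, not the authors' implementation: it uses a single ELM-style layer and plain linear CCA on synthetic paired data, whereas the paper uses a deep ELM and cluster CCA on real visual and tactile features. All variable names and data here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy paired data: both "modalities" are driven by a shared latent factor,
# standing in for visual and tactile features of the same surface samples.
n, d_lat = 200, 5
Z = rng.standard_normal((n, d_lat))                                  # shared latent
X_vis = Z @ rng.standard_normal((d_lat, 20)) + 0.1 * rng.standard_normal((n, 20))
X_tac = Z @ rng.standard_normal((d_lat, 15)) + 0.1 * rng.standard_normal((n, 15))

def elm_layer(X, n_hidden, rng):
    """One ELM-style hidden layer: random, fixed input weights + sigmoid."""
    W = rng.standard_normal((X.shape[1], n_hidden)) / np.sqrt(X.shape[1])
    b = rng.standard_normal(n_hidden)
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

def cca(X, Y, k, reg=1e-3):
    """Linear CCA via whitening + SVD; returns k projection directions per view."""
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    m = len(X)
    Cxx = Xc.T @ Xc / m + reg * np.eye(X.shape[1])   # regularized covariances
    Cyy = Yc.T @ Yc / m + reg * np.eye(Y.shape[1])
    Cxy = Xc.T @ Yc / m
    ex, Ux = np.linalg.eigh(Cxx)
    ey, Uy = np.linalg.eigh(Cyy)
    Wx = Ux @ np.diag(ex ** -0.5) @ Ux.T             # whitening transforms
    Wy = Uy @ np.diag(ey ** -0.5) @ Uy.T
    U, s, Vt = np.linalg.svd(Wx @ Cxy @ Wy)
    return Wx @ U[:, :k], Wy @ Vt[:k].T

# Nonlinear features per modality, then a shared correlated subspace.
Hv = elm_layer(X_vis, 64, rng)
Ht = elm_layer(X_tac, 64, rng)
A, B = cca(Hv, Ht, k=3)
Pv = (Hv - Hv.mean(0)) @ A    # visual samples in the shared space
Pt = (Ht - Ht.mean(0)) @ B    # tactile samples in the shared space
```

Retrieval then amounts to projecting a tactile query with `B` and returning the nearest visual gallery entries under cosine similarity in the shared space; the paper's cluster CCA additionally uses class labels so that same-material pairs, not just co-occurring samples, are pulled together.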
