Imitation learning of a wheeled mobile manipulator based on dynamical movement primitives
Industrial Robot (IF 1.8) Pub Date: 2021-06-17, DOI: 10.1108/ir-11-2020-0255
Zeguo Yang, Mantian Li, Fusheng Zha, Xin Wang, Pengfei Wang, Wei Guo

Purpose

This paper aims to introduce an imitation learning framework for a wheeled mobile manipulator based on dynamical movement primitives (DMPs). A novel mobile manipulator with the capability to learn from demonstration is introduced. The study then explains the whole process by which the wheeled mobile manipulator learns a demonstrated task and generalizes it to new situations. Two visual tracking controllers are designed, one for recording human demonstrations and one for monitoring robot operations. The study clarifies how human demonstrations can be learned by a wheeled mobile manipulator and generalized to new situations.
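The abstract does not reproduce the DMP equations, so the following is a minimal single-degree-of-freedom sketch of the standard discrete DMP formulation (transformation system, canonical system, and a forcing term fitted by locally weighted regression), not the authors' implementation. The parameter values (alpha_z, beta_z, alpha_x, number of basis functions) and the sample trajectory are illustrative assumptions; in practice one such primitive would be fitted per coordinate of the demonstrated trajectory.

```python
import numpy as np

class DMP1D:
    """Minimal discrete DMP for one coordinate (illustrative sketch, not the authors' code)."""

    def __init__(self, n_basis=30, alpha_z=25.0, beta_z=6.25, alpha_x=8.0):
        self.n_basis, self.alpha_z, self.beta_z, self.alpha_x = n_basis, alpha_z, beta_z, alpha_x
        # Gaussian basis centres spread along the decaying phase variable x in (0, 1]
        self.c = np.exp(-alpha_x * np.linspace(0.0, 1.0, n_basis))
        self.h = n_basis / self.c                 # heuristic basis widths
        self.w = np.zeros(n_basis)                # forcing-term weights (learned from the demo)

    def fit(self, y_demo, dt):
        """Learn forcing-term weights from one demonstrated trajectory sampled at interval dt."""
        T = len(y_demo)
        self.tau = (T - 1) * dt                   # demonstration duration
        self.y0, self.g = y_demo[0], y_demo[-1]   # start and goal of the demonstration
        yd = np.gradient(y_demo, dt)
        ydd = np.gradient(yd, dt)
        x = np.exp(-self.alpha_x * np.arange(T) * dt / self.tau)   # canonical phase over time
        # Forcing term that would make the transformation system reproduce the demonstration
        f_target = self.tau**2 * ydd - self.alpha_z * (self.beta_z * (self.g - y_demo) - self.tau * yd)
        xi = x * (self.g - self.y0)               # phase- and amplitude-dependent scaling
        for i in range(self.n_basis):             # locally weighted regression, one weight per basis
            psi = np.exp(-self.h[i] * (x - self.c[i]) ** 2)
            self.w[i] = np.sum(psi * xi * f_target) / (np.sum(psi * xi**2) + 1e-10)

    def rollout(self, y0=None, g=None, tau=None, dt=0.01):
        """Reproduce the motion, optionally with a new start, goal or duration (generalization)."""
        y0 = self.y0 if y0 is None else y0
        g = self.g if g is None else g
        tau = self.tau if tau is None else tau
        y, z, x, out = y0, 0.0, 1.0, []
        for _ in range(int(tau / dt)):
            psi = np.exp(-self.h * (x - self.c) ** 2)
            f = (psi @ self.w) / (psi.sum() + 1e-10) * x * (g - y0)
            z += (self.alpha_z * (self.beta_z * (g - y) - z) + f) / tau * dt
            y += z / tau * dt
            x += -self.alpha_x * x / tau * dt
            out.append(y)
        return np.array(out)


# Example: learn a minimum-jerk-like demonstration, then generalize to a new goal.
t = np.linspace(0.0, 1.0, 200)
demo = 10 * t**3 - 15 * t**4 + 6 * t**5           # stand-in for one recorded coordinate
dmp = DMP1D()
dmp.fit(demo, dt=t[1] - t[0])
reproduction = dmp.rollout()                      # reproduces the demonstrated motion
generalized = dmp.rollout(g=1.5)                  # same motion shape, adapted to a new goal
```

Because the goal, start and duration enter the equations explicitly, a trajectory learned once can be replayed toward new targets, which is the generalization property the framework relies on.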

Design/methodology/approach

The kinematic model of the mobile manipulator is analyzed. An RGB-D camera is used to record the demonstration trajectories and to observe robot operations. To keep the human demonstrator within the camera's field of view, a visual tracking controller is designed based on the kinematic model of the mobile manipulator. The demonstration trajectories are then represented by DMPs and learned by the mobile manipulator with the corresponding models. A second tracking controller, also based on the kinematic model of the mobile manipulator, is designed to monitor and modify the robot operations.
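The abstract does not state the form of the visual tracking controllers, so the sketch below is a hypothetical proportional law for a differential-drive base with a forward-facing RGB-D camera, intended only to illustrate how image and depth measurements can be turned into base velocity commands that keep the demonstrator in view. The pinhole focal length fx, the gains k_omega and k_v, and the stand-off distance depth_ref are assumptions, not values from the paper.

```python
import numpy as np

def visual_tracking_cmd(u_px, depth_m, img_width=640, fx=525.0,
                        k_omega=1.5, k_v=0.6, depth_ref=1.5):
    """Hypothetical proportional visual tracking law for a differential-drive base.

    u_px    -- horizontal pixel coordinate of the tracked target (e.g. the demonstrator's hand)
    depth_m -- target depth measured by the RGB-D camera [m]
    Returns (v, omega): forward velocity [m/s] and yaw rate [rad/s] commands that
    re-centre the target in the image and hold a reference stand-off distance.
    """
    bearing = np.arctan2(u_px - img_width / 2.0, fx)  # target bearing w.r.t. the optical axis
    omega = -k_omega * bearing                        # turn the base to re-centre the target
    v = k_v * (depth_m - depth_ref)                   # drive forward/backward to hold the distance
    return v, omega


# Example: target detected 120 px right of the image centre at 2.1 m depth.
v_cmd, omega_cmd = visual_tracking_cmd(u_px=440.0, depth_m=2.1)
```

A controller of this kind can serve both roles mentioned above: following the demonstrator during recording and keeping the end-effector observable while the learned task is executed.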

Findings

To verify the effectiveness of the imitation learning framework, several daily tasks are demonstrated to and learned by the mobile manipulator. The results indicate that the presented approach performs well in enabling a wheeled mobile manipulator to learn tasks from human demonstrations. The only thing a robot user needs to do is provide demonstrations, which greatly facilitates the application of mobile manipulators.

Originality/value

The research fulfills the need for a wheeled mobile manipulator to learn tasks from demonstrations instead of through manual planning. Similar approaches can be applied to mobile manipulators with different architectures.

Updated: 2021-08-19