Autonomous Robots (IF 3.5). Pub Date: 2021-10-28. DOI: 10.1007/s10514-021-10019-4. Stéphane Magnenat, Francis Colas
Programming robots often requires expert knowledge of both the robot itself and the task to be executed. An alternative to direct programming is for a human to show examples of the task execution and have the robot perform the task based on these examples, in a scheme known as learning, or programming, from demonstration. We propose and study a generic and simple learning-from-demonstration framework. Our approach is to combine the demonstrated commands according to the similarity between the demonstrated sensory trajectories and the current replay trajectory. This tracking is performed solely on the basis of sensor values and time, and completely dispenses with the usually expensive step of precomputing an internal model of the task. We analyse the behaviour of the proposed model in several simulated conditions and test it on two different robotic platforms. We show that it can reproduce different capabilities with a limited number of meta-parameters.
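The core idea of combining demonstrated commands by sensory similarity can be sketched in a few lines. The snippet below is a hypothetical illustration, not the paper's actual Bayesian tracker: it weights each demonstrated command with a Gaussian kernel on the distance between its recorded sensor snapshot and the current sensor reading, then blends the commands with the normalised weights. The function name, kernel choice, and `bandwidth` parameter are all assumptions made for this example.

```python
import numpy as np

def blend_commands(demo_sensors, demo_commands, current_sensors, bandwidth=1.0):
    """Blend demonstrated commands by sensory similarity.

    Hypothetical sketch: each demonstrated (sensor, command) pair is
    weighted by a Gaussian kernel on the distance between its sensor
    snapshot and the current reading; the replayed command is the
    weighted average. This is an illustration only, not the paper's
    exact tracking model.
    """
    demo_sensors = np.asarray(demo_sensors, dtype=float)    # (N, S)
    demo_commands = np.asarray(demo_commands, dtype=float)  # (N, C)
    current = np.asarray(current_sensors, dtype=float)      # (S,)

    # Squared sensor-space distance of each demonstration to "now"
    d2 = np.sum((demo_sensors - current) ** 2, axis=1)
    # Gaussian similarity weights, normalised to sum to 1
    weights = np.exp(-d2 / (2.0 * bandwidth ** 2))
    weights /= weights.sum()
    # Similarity-weighted blend of the demonstrated commands
    return weights @ demo_commands
```

With two demonstrations whose sensor snapshots are far apart, a query near one of them recovers essentially that demonstration's command; intermediate readings interpolate smoothly between the two.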
Title: A Bayesian tracker for synthesizing mobile robot behaviour from demonstration