Super Odometry: IMU-centric LiDAR-Visual-Inertial Estimator for Challenging Environments
arXiv - CS - Robotics. Pub Date: 2021-04-30, DOI: arxiv-2104.14938
Shibo Zhao, Hengrui Zhang, Peng Wang, Lucas Nogueira, Sebastian Scherer

We propose Super Odometry, a high-precision multi-modal sensor fusion framework that provides a simple but effective way to fuse multiple sensors, such as LiDAR, camera, and IMU, and achieve robust state estimation in perceptually degraded environments. Unlike traditional sensor-fusion methods, Super Odometry employs an IMU-centric data processing pipeline that combines the advantages of loosely coupled and tightly coupled methods and recovers motion in a coarse-to-fine manner. The proposed framework is composed of three parts: IMU odometry, visual-inertial odometry, and laser-inertial odometry. The visual-inertial odometry and laser-inertial odometry provide pose priors to constrain the IMU bias and receive motion predictions from the IMU odometry. To ensure real-time performance, we apply a dynamic octree that consumes only 10% of the running time of a static KD-tree. The proposed system was deployed on drones and ground robots as part of Team Explorer's entry in the DARPA Subterranean Challenge, where the team won $1^{st}$ and $2^{nd}$ place in the Tunnel and Urban Circuits, respectively.
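The abstract itself contains no code; the following is a minimal C++ sketch, not taken from the paper, of how such an IMU-centric coarse-to-fine loop can be organized: the IMU odometry propagates a coarse motion prediction, and a (heavily simplified, stand-in) visual-inertial / laser-inertial refinement step returns a pose prior that constrains the IMU bias. All type and function names (ImuOdometry, predict, correct) are illustrative assumptions, not the authors' API.

```cpp
#include <iostream>

// Illustrative names only; this is not the Super Odometry code base.
struct Pose    { double x = 0.0, yaw = 0.0; };   // 1-D position + heading, for brevity
struct ImuBias { double gyro = 0.0; };

struct ImuOdometry {
    Pose pose;
    ImuBias bias;
    double vx = 0.0;

    // Coarse prediction: integrate bias-corrected IMU measurements.
    Pose predict(double ax, double wz, double dt) {
        pose.yaw += (wz - bias.gyro) * dt;
        vx       += ax * dt;
        pose.x   += vx * dt;
        return pose;
    }

    // Fine correction: a pose prior from VIO/LIO constrains the gyro bias
    // and replaces the drifting IMU-only pose.
    void correct(const Pose& prior, double dt, double gain) {
        bias.gyro += gain * (pose.yaw - prior.yaw) / dt;
        pose = prior;
    }
};

int main() {
    ImuOdometry imu;
    const double dt = 0.1;
    for (int k = 0; k < 10; ++k) {
        Pose prediction = imu.predict(/*ax=*/0.0, /*wz=*/0.05, dt);
        // Stand-in for scan/feature registration seeded by the prediction:
        Pose prior = prediction;
        prior.yaw *= 0.9;                        // pretend the refinement removes 10% of the drift
        imu.correct(prior, dt, /*gain=*/0.5);
        std::cout << "step " << k << "  yaw=" << imu.pose.yaw
                  << "  gyro bias=" << imu.bias.gyro << "\n";
    }
    return 0;
}
```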

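The dynamic-octree claim refers to incremental map maintenance: new scan points are inserted into an existing tree rather than rebuilding a static KD-tree over the whole map after every update. Below is a minimal, generic point-octree sketch with incremental insertion; it only illustrates the data-structure idea and is not the octree implementation used in the paper (no nearest-neighbor query, no depth limit, no node deletion).

```cpp
#include <array>
#include <cmath>
#include <iostream>
#include <memory>
#include <vector>

struct Point { double x, y, z; };

// Minimal dynamic point octree: leaves hold small buckets and split on demand,
// so each new LiDAR scan is absorbed by insertions only (no global rebuild).
struct OctreeNode {
    Point center;
    double halfSize;
    std::vector<Point> bucket;                         // points stored at a leaf
    std::array<std::unique_ptr<OctreeNode>, 8> child;  // empty while node is a leaf
    static constexpr std::size_t kBucketCap = 8;

    OctreeNode(Point c, double h) : center(c), halfSize(h) {}

    bool isLeaf() const { return child[0] == nullptr; }

    int octant(const Point& p) const {
        return (p.x > center.x ? 1 : 0) | (p.y > center.y ? 2 : 0) | (p.z > center.z ? 4 : 0);
    }

    void insert(const Point& p) {
        if (isLeaf()) {
            if (bucket.size() < kBucketCap) { bucket.push_back(p); return; }
            subdivide();                               // leaf is full: split once, then descend
        }
        child[octant(p)]->insert(p);
    }

    void subdivide() {
        const double q = halfSize / 2.0;
        for (int i = 0; i < 8; ++i) {
            Point c{center.x + ((i & 1) ? q : -q),
                    center.y + ((i & 2) ? q : -q),
                    center.z + ((i & 4) ? q : -q)};
            child[i] = std::make_unique<OctreeNode>(c, q);
        }
        for (const Point& p : bucket) child[octant(p)]->insert(p);
        bucket.clear();
    }
};

int main() {
    OctreeNode map({0.0, 0.0, 0.0}, 100.0);            // map spanning roughly +/-100 m
    // Each "scan" just inserts its points; a static KD-tree would instead be
    // rebuilt from all accumulated points after every scan.
    for (int i = 0; i < 1000; ++i)
        map.insert({50.0 * std::sin(0.1 * i), 50.0 * std::cos(0.1 * i), 0.01 * i});
    std::cout << "inserted 1000 points incrementally\n";
    return 0;
}
```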
Updated: 2021-05-03