The UMA-SAR Dataset: Multimodal data collection from a ground vehicle during outdoor disaster response training exercises
The International Journal of Robotics Research (IF 7.5), Pub Date: 2021-04-06, DOI: 10.1177/02783649211004959
Jesús Morales, Ricardo Vázquez-Martín, Anthony Mandow, David Morilla-Cabello, Alfonso García-Cerezo

This article presents a collection of multimodal raw data captured from a manned all-terrain vehicle in the course of two realistic outdoor search and rescue (SAR) exercises for actual emergency responders conducted in Málaga (Spain) in 2018 and 2019: the UMA-SAR dataset. The sensor suite, applicable to unmanned ground vehicles (UGVs), consisted of overlapping visible light (RGB) and thermal infrared (TIR) forward-looking monocular cameras, a Velodyne HDL-32 three-dimensional (3D) lidar, an inertial measurement unit (IMU), and two global positioning system (GPS) receivers for ground truth. Our mission was to collect a wide range of data from the SAR domain, including persons, vehicles, debris, and SAR activity on unstructured terrain. In particular, four data sequences were collected following closed-loop routes during the exercises, with a total path length of 5.2 km and a total time of 77 min. In addition, we provide three more sequences of the empty site for comparison purposes (an extra 4.9 km and 46 min). Furthermore, the data are offered both in human-readable format and as rosbag files, and two specific software tools are provided for extracting and adapting this dataset to the users' preferences. The review of previously published disaster robotics repositories indicates that this dataset can help fill a gap regarding visual and thermal datasets and can serve as a research tool for cross-cutting areas such as multispectral image fusion, machine learning for scene understanding, person and object detection, and localization and mapping in unstructured environments. The full dataset is publicly available at: www.uma.es/robotics-and-mechatronics/sar-datasets.
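Since the sequences are distributed as rosbag files, the sketch below illustrates how such a bag might be read offline with the standard ROS 1 Python API (`rosbag` plus `cv_bridge`). The bag file name and topic names are placeholders, not taken from the dataset; consult the dataset documentation and the authors' own extraction tools for the actual topics and formats.

```python
# Minimal sketch: dumping RGB and thermal (TIR) frames from a rosbag.
# NOTE: BAG_PATH, RGB_TOPIC, and TIR_TOPIC are hypothetical placeholders;
# check www.uma.es/robotics-and-mechatronics/sar-datasets for real names.
import os
import cv2
import rosbag
from cv_bridge import CvBridge

BAG_PATH = "uma_sar_sequence.bag"      # hypothetical file name
RGB_TOPIC = "/camera/rgb/image_raw"    # hypothetical RGB camera topic
TIR_TOPIC = "/camera/tir/image_raw"    # hypothetical thermal camera topic
OUT_DIR = "extracted_frames"

bridge = CvBridge()
os.makedirs(OUT_DIR, exist_ok=True)

with rosbag.Bag(BAG_PATH) as bag:
    for topic, msg, t in bag.read_messages(topics=[RGB_TOPIC, TIR_TOPIC]):
        # Convert the ROS Image message to an OpenCV array; 'passthrough'
        # keeps the original encoding (e.g. 16-bit for thermal frames).
        frame = bridge.imgmsg_to_cv2(msg, desired_encoding="passthrough")
        label = "rgb" if topic == RGB_TOPIC else "tir"
        cv2.imwrite(os.path.join(OUT_DIR, f"{label}_{t.to_nsec()}.png"), frame)
```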

Updated: 2021-04-08