Introduction

We present five datasets collected with a legged-wheel robot, containing LiDAR data, IMU data, joint sensor data, and ground truth. The datasets cover a range of challenging scenes. To the best of our knowledge, few public datasets collected from legged-wheel robots are available. We hope our datasets will support the development of legged-wheel robot SLAM in the community.

Our datasets are now available:

Sensor setup

The legged-wheel robot used to collect the datasets is shown in Fig. 1. It is equipped with two Livox MID360 LiDARs, a BG-610M RTK-GNSS module, and an Intel RealSense D435i RGB-D camera. Note that the datasets include only the LiDAR and IMU data from the two MID360 LiDARs, the GPS-RTK data from the RTK-GNSS module, and the joint sensor data from the robot's API. Data from the D435i is not included.

Fig. 1. Our data collection platform.
Fig. 2. The extrinsic transformation of LiDAR2 with respect to LiDAR1.
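To use both LiDARs together, points from LiDAR2 must be transformed into the LiDAR1 frame using the extrinsic shown in Fig. 2. A minimal sketch of this transform is below; the rotation and translation values here are placeholders (assumptions, not our calibration), and should be replaced with the values from Fig. 2.

```python
import numpy as np

# Placeholder extrinsic of LiDAR2 w.r.t. LiDAR1 (LiDAR2 -> LiDAR1).
# Substitute the calibrated rotation and translation from Fig. 2.
R = np.eye(3)                   # rotation matrix (placeholder: identity)
t = np.array([0.0, 0.0, 0.3])   # translation in meters (placeholder)

def lidar2_to_lidar1(points):
    """Transform an (N, 3) array of LiDAR2 points into the LiDAR1 frame."""
    return points @ R.T + t

# Example: a single point 1 m in front of LiDAR2.
pts = np.array([[1.0, 0.0, 0.0]])
print(lidar2_to_lidar1(pts))
```

With the placeholder values, the point is simply shifted by the translation; with the real calibration, the rotation is applied first and then the translation.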

Data Collection

We collected the datasets in a variety of challenging scenes on the campus of Shanghai Jiao Tong University. During data collection, the average robot speed was kept under 1 m/s. The scenes are shown below:

Fig. 3. Left: Staircase Scene 1 Dataset; Right: Staircase Scene 2 Dataset
Fig. 4. Artificial Hill Dataset
Fig. 5. Rose Garden Dataset
Fig. 6. Botanical Garden Dataset

License

This dataset is licensed under the Creative Commons Attribution 4.0 International License (CC BY 4.0).