Links for fulltext (may require subscription):
- Publisher Website (DOI): 10.1177/02783649241227968
- Scopus: eid_2-s2.0-85183887891
Article: MARS-LVIG dataset: A multi-sensor aerial robots SLAM dataset for LiDAR-visual-inertial-GNSS fusion
| Title | MARS-LVIG dataset: A multi-sensor aerial robots SLAM dataset for LiDAR-visual-inertial-GNSS fusion |
|---|---|
| Authors | Li, Haotian; Zou, Yuying; Chen, Nan; Lin, Jiarong; Liu, Xiyuan; Xu, Wei; Zheng, Chunran; Li, Rundong; He, Dongjiao; Kong, Fanze; Cai, Yixi; Liu, Zheng; Zhou, Shunbo; Xue, Kaiwen; Zhang, Fu |
| Keywords | aerial robots; camera; Dataset; Global Navigation Satellite System; Inertial Measurement Unit; LiDAR; multi-sensor fusion; Simultaneous Localization and Mapping |
| Issue Date | 2024 |
| Citation | International Journal of Robotics Research, 2024, v. 43, n. 8, p. 1114-1127 |
| Abstract | In recent years, advances in Light Detection and Ranging (LiDAR) technology have made 3D LiDAR sensors more compact, lightweight, and affordable. This progress has spurred interest in integrating LiDAR with sensors such as Inertial Measurement Units (IMUs) and cameras for Simultaneous Localization and Mapping (SLAM) research. Public datasets covering different scenarios, platforms, and viewpoints are crucial for multi-sensor fusion SLAM studies, yet most focus on handheld or vehicle-mounted devices with front-facing or 360-degree views. Data from aerial vehicles with downward-looking views is scarce; the relevant datasets that do exist usually feature low altitudes and are mostly limited to small campus environments. To fill this gap, we introduce the Multi-sensor Aerial Robots SLAM dataset (MARS-LVIG dataset), providing unique aerial downward-looking LiDAR-Visual-Inertial-GNSS data captured from altitudes between 80 m and 130 m. The dataset not only offers new aspects for testing and evaluating existing SLAM algorithms, but also brings new challenges that can facilitate the research and development of more advanced SLAM algorithms. The MARS-LVIG dataset contains 21 sequences acquired across diverse large-area environments, including an aero-model airfield, an island, a rural town, and a valley. Within these sequences, the UAV flies at speeds from 3 m/s to 12 m/s, covers a scanning area of up to 577,000 m², and reaches a maximum path length of 7.148 km in a single flight. The data were collected by a lightweight, hardware-synchronized sensor package that includes a solid-state 3D LiDAR, a global-shutter RGB camera, IMUs, and a raw-message Global Navigation Satellite System (GNSS) receiver. For algorithm evaluation, the dataset releases ground truth for both localization and mapping, acquired by an on-board Real-Time Kinematic (RTK) receiver and a DJI L1 LiDAR (post-processed with its supporting software, DJI Terra), respectively. The dataset can be downloaded from: https://mars.hku.hk/dataset.html. |
| Persistent Identifier | http://hdl.handle.net/10722/367886 |
| ISSN | 0278-3649 (2023 Impact Factor: 7.5; 2023 SCImago Journal Rank: 4.346) |
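The abstract notes that localization ground truth comes from on-board RTK. A standard way to score an estimated trajectory against such a reference is the absolute trajectory error (ATE) after rigid alignment. Below is a minimal NumPy sketch of that computation; it assumes the estimated and ground-truth positions have already been time-associated into matching N×3 arrays (the dataset's on-disk formats and timestamp conventions are not specified in this record):

```python
import numpy as np

def align_umeyama(est, gt):
    """Least-squares rigid alignment (rotation + translation, no scale)
    of estimated positions onto ground-truth positions (Umeyama/Kabsch).
    est, gt: (N, 3) arrays of time-associated positions."""
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    # Cross-covariance between centered ground truth and centered estimate.
    cov = (gt - mu_g).T @ (est - mu_e) / est.shape[0]
    U, _, Vt = np.linalg.svd(cov)
    # Reflection guard: force a proper rotation (det(R) = +1).
    S = np.eye(3)
    if np.linalg.det(U @ Vt) < 0:
        S[2, 2] = -1.0
    R = U @ S @ Vt
    t = mu_g - R @ mu_e
    return R, t

def ate_rmse(est, gt):
    """Absolute trajectory error (RMSE of residuals) after rigid alignment."""
    R, t = align_umeyama(est, gt)
    err = gt - (est @ R.T + t)
    return float(np.sqrt((err ** 2).sum(axis=1).mean()))
```

In practice, trajectory-evaluation tools such as evo implement this metric together with timestamp association and optional scale alignment; the sketch shows only the core computation.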
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Li, Haotian | - |
| dc.contributor.author | Zou, Yuying | - |
| dc.contributor.author | Chen, Nan | - |
| dc.contributor.author | Lin, Jiarong | - |
| dc.contributor.author | Liu, Xiyuan | - |
| dc.contributor.author | Xu, Wei | - |
| dc.contributor.author | Zheng, Chunran | - |
| dc.contributor.author | Li, Rundong | - |
| dc.contributor.author | He, Dongjiao | - |
| dc.contributor.author | Kong, Fanze | - |
| dc.contributor.author | Cai, Yixi | - |
| dc.contributor.author | Liu, Zheng | - |
| dc.contributor.author | Zhou, Shunbo | - |
| dc.contributor.author | Xue, Kaiwen | - |
| dc.contributor.author | Zhang, Fu | - |
| dc.date.accessioned | 2025-12-19T08:00:10Z | - |
| dc.date.available | 2025-12-19T08:00:10Z | - |
| dc.date.issued | 2024 | - |
| dc.identifier.citation | International Journal of Robotics Research, 2024, v. 43, n. 8, p. 1114-1127 | - |
| dc.identifier.issn | 0278-3649 | - |
| dc.identifier.uri | http://hdl.handle.net/10722/367886 | - |
| dc.description.abstract | In recent years, advances in Light Detection and Ranging (LiDAR) technology have made 3D LiDAR sensors more compact, lightweight, and affordable. This progress has spurred interest in integrating LiDAR with sensors such as Inertial Measurement Units (IMUs) and cameras for Simultaneous Localization and Mapping (SLAM) research. Public datasets covering different scenarios, platforms, and viewpoints are crucial for multi-sensor fusion SLAM studies, yet most focus on handheld or vehicle-mounted devices with front-facing or 360-degree views. Data from aerial vehicles with downward-looking views is scarce; the relevant datasets that do exist usually feature low altitudes and are mostly limited to small campus environments. To fill this gap, we introduce the Multi-sensor Aerial Robots SLAM dataset (MARS-LVIG dataset), providing unique aerial downward-looking LiDAR-Visual-Inertial-GNSS data captured from altitudes between 80 m and 130 m. The dataset not only offers new aspects for testing and evaluating existing SLAM algorithms, but also brings new challenges that can facilitate the research and development of more advanced SLAM algorithms. The MARS-LVIG dataset contains 21 sequences acquired across diverse large-area environments, including an aero-model airfield, an island, a rural town, and a valley. Within these sequences, the UAV flies at speeds from 3 m/s to 12 m/s, covers a scanning area of up to 577,000 m², and reaches a maximum path length of 7.148 km in a single flight. The data were collected by a lightweight, hardware-synchronized sensor package that includes a solid-state 3D LiDAR, a global-shutter RGB camera, IMUs, and a raw-message Global Navigation Satellite System (GNSS) receiver. For algorithm evaluation, the dataset releases ground truth for both localization and mapping, acquired by an on-board Real-Time Kinematic (RTK) receiver and a DJI L1 LiDAR (post-processed with its supporting software, DJI Terra), respectively. The dataset can be downloaded from: https://mars.hku.hk/dataset.html. | - |
| dc.language | eng | - |
| dc.relation.ispartof | International Journal of Robotics Research | - |
| dc.subject | aerial robots | - |
| dc.subject | camera | - |
| dc.subject | Dataset | - |
| dc.subject | Global Navigation Satellite System | - |
| dc.subject | Inertial Measurement Unit | - |
| dc.subject | LiDAR | - |
| dc.subject | multi-sensor fusion | - |
| dc.subject | Simultaneous Localization and Mapping | - |
| dc.title | MARS-LVIG dataset: A multi-sensor aerial robots SLAM dataset for LiDAR-visual-inertial-GNSS fusion | - |
| dc.type | Article | - |
| dc.description.nature | link_to_subscribed_fulltext | - |
| dc.identifier.doi | 10.1177/02783649241227968 | - |
| dc.identifier.scopus | eid_2-s2.0-85183887891 | - |
| dc.identifier.volume | 43 | - |
| dc.identifier.issue | 8 | - |
| dc.identifier.spage | 1114 | - |
| dc.identifier.epage | 1127 | - |
| dc.identifier.eissn | 1741-3176 | - |
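The record also states that mapping ground truth is a DJI L1 point cloud post-processed in DJI Terra. A common way to score a reconstructed map against such a reference is a point-to-nearest-point error. A minimal sketch follows, under the assumptions (not specified by this record) that both clouds are already co-registered in the same frame and loaded as N×3 NumPy arrays; the brute-force distance matrix is fine for toy inputs, but full-size maps need a KD-tree:

```python
import numpy as np

def cloud_to_cloud_rmse(est_pts, ref_pts):
    """RMSE of each estimated map point's distance to its nearest
    reference (ground-truth) point.
    est_pts: (N, 3) estimated cloud; ref_pts: (M, 3) reference cloud.
    Brute force, O(N*M) memory: replace with a KD-tree for real maps."""
    # Pairwise squared distances between every estimated and reference point.
    d2 = ((est_pts[:, None, :] - ref_pts[None, :, :]) ** 2).sum(axis=2)
    nearest = np.sqrt(d2.min(axis=1))  # nearest-neighbor distance per point
    return float(np.sqrt((nearest ** 2).mean()))
```

This one-directional error penalizes estimated points far from the reference; evaluating both directions (and the distance distribution, not just the RMSE) gives a fuller picture of map quality.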
