MulRan: Multimodal Range Dataset for Urban Place Recognition

This paper introduces a multimodal range dataset for radio detection and ranging (radar) and light detection and ranging (LiDAR), specifically targeting urban environments. Extending our workshop paper [1] to a larger scale, this dataset focuses on range sensor-based place recognition and provides 6D baseline trajectories of a vehicle as place recognition ground truth. The radar data are provided in both raw and image formats: a set of time-stamped 1D intensity arrays and 360° polar images, respectively. This gives researchers the flexibility to work with either raw data or image data, depending on the purpose of the research. Unlike existing datasets, our focus is on capturing both temporal and structural diversity for range-based place recognition research. For evaluation, we applied our previous location descriptor and its search algorithm [2] and validated that they are highly effective for radar-based place recognition. Furthermore, the results show that radar-based place recognition outperforms LiDAR-based place recognition by exploiting the radar's longer-range measurements. The dataset is available at https://sites.google.com/view/mulran-pr.
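The abstract notes that radar scans are distributed both as time-stamped 1D intensity arrays and as 360° polar images. The sketch below (not part of the paper or the dataset's official tools) shows one common way such a polar image can be resampled into a Cartesian bird's-eye view for visualization or descriptor extraction. The file layout (rows as azimuth bins, columns as range bins), maximum range, and output resolution are illustrative assumptions, not the dataset's specification.

```python
# Minimal sketch: polar radar image -> Cartesian bird's-eye view.
# Assumptions (not from the paper): the polar scan is a grayscale PNG
# with rows = azimuth bins and columns = range bins; max_range_m and
# out_size are placeholder values chosen for illustration only.
import numpy as np
from PIL import Image

def polar_to_cartesian(polar_png_path, out_size=512, max_range_m=200.0):
    """Resample a polar radar intensity image into an out_size x out_size
    Cartesian grid centered on the sensor (nearest-neighbor lookup)."""
    polar = np.asarray(Image.open(polar_png_path).convert("L"), dtype=np.float32)
    num_azimuths, num_range_bins = polar.shape        # assumed layout
    meters_per_pixel = 2.0 * max_range_m / out_size

    # Cartesian pixel coordinates relative to the sensor origin (meters).
    coords = (np.arange(out_size) - out_size / 2.0 + 0.5) * meters_per_pixel
    xx, yy = np.meshgrid(coords, coords)
    rng = np.sqrt(xx**2 + yy**2)                      # range in meters
    azi = np.arctan2(yy, xx) % (2.0 * np.pi)          # azimuth in [0, 2*pi)

    # Map each Cartesian cell back to the nearest polar bin.
    azi_idx = np.clip((azi / (2.0 * np.pi) * num_azimuths).astype(int),
                      0, num_azimuths - 1)
    rng_idx = np.clip((rng / max_range_m * num_range_bins).astype(int),
                      0, num_range_bins - 1)

    cart = polar[azi_idx, rng_idx]
    cart[rng > max_range_m] = 0.0                     # blank cells beyond max range
    return cart
```

A Cartesian view produced this way is only one possible preprocessing step; place recognition methods such as the descriptor cited in [2] may instead operate directly on the polar representation.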
Publisher
Institute of Electrical and Electronics Engineers Inc.
Issue Date
2020-05-31
Language
English
Citation
IEEE International Conference on Robotics and Automation (ICRA), pp. 6246-6253
ISSN
1050-4729
DOI
10.1109/ICRA40945.2020.9197298
URI
http://hdl.handle.net/10203/278859
Appears in Collection
CE-Conference Papers
Files in This Item
There are no files associated with this item.