Low-Level Sensor Fusion for 3D Vehicle Detection Using Radar Range-Azimuth Heatmap and Monocular Image

Robust and accurate detection of the diverse objects found on roads is essential for automated driving. Radar has been employed in commercial advanced driver assistance systems (ADAS) for a decade thanks to its low cost and high reliability. However, radar has been used only in limited driving conditions, such as detecting a few preceding vehicles on highways, because its low resolution and poor classification ability restrict its performance. To fully exploit radar in complex road environments, we propose a learning-based detection network that takes a radar range-azimuth heatmap and a monocular image as inputs. We show that radar-image fusion can overcome the inherent weaknesses of radar by leveraging camera information. The proposed network has a two-stage architecture that combines radar and image feature representations, rather than fusing each sensor's prediction results, to improve detection performance over either sensor alone. To demonstrate the effectiveness of the proposed method, we collected radar, camera, and LiDAR data in driving environments that vary in vehicle speed, lighting conditions, and traffic volume. Experimental results show that the proposed fusion method outperforms both the radar-only and the image-only methods. © 2021, Springer Nature Switzerland AG.
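The key design choice in the abstract is fusing at the feature level rather than the decision level. As a minimal sketch of that idea only (not the authors' network: the backbones, channel sizes, concatenation-based fusion, and single-stage head below are illustrative assumptions, and the paper itself uses a two-stage architecture), a feature-level radar-image fusion model in PyTorch might look like this:

```python
import torch
import torch.nn as nn

# Hypothetical sketch: two convolutional backbones whose feature maps are
# fused by channel-wise concatenation before a shared detection head.
# Layer counts, channel sizes, and the fusion operator are assumptions,
# not the architecture reported in the paper.

def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, stride=2, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )

class FeatureLevelFusionNet(nn.Module):
    def __init__(self, num_anchors=9, num_classes=1):
        super().__init__()
        # Branch 1: radar range-azimuth heatmap (1 channel).
        self.radar_backbone = nn.Sequential(
            conv_block(1, 32), conv_block(32, 64), conv_block(64, 128)
        )
        # Branch 2: monocular RGB image (3 channels).
        self.image_backbone = nn.Sequential(
            conv_block(3, 32), conv_block(32, 64), conv_block(64, 128)
        )
        # Fusion acts on intermediate features, not per-sensor detections.
        self.fusion = conv_block(256, 256)
        # Per-anchor class scores plus 7 box parameters
        # (x, y, z, w, l, h, yaw) for a 3D box -- an assumption here.
        self.head = nn.Conv2d(256, num_anchors * (num_classes + 7), 1)

    def forward(self, radar_heatmap, image):
        r = self.radar_backbone(radar_heatmap)
        i = self.image_backbone(image)
        # Assumes both feature maps can be brought to one spatial grid;
        # in practice one branch would be projected into a common view.
        i = nn.functional.interpolate(
            i, size=r.shape[-2:], mode="bilinear", align_corners=False
        )
        fused = self.fusion(torch.cat([r, i], dim=1))
        return self.head(fused)

# Example shapes: a 1-channel range-azimuth map and a 3-channel image.
net = FeatureLevelFusionNet()
out = net(torch.randn(2, 1, 256, 256), torch.randn(2, 3, 384, 512))
print(out.shape)
```

The point of the sketch is the placement of the fusion step: both sensors contribute learned features before any detection decision is made, which is what lets the camera branch compensate for the radar's low angular resolution and weak classification cues.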
Publisher
Springer Science and Business Media Deutschland GmbH
Issue Date
2020-11-30
Language
English
Citation
15th Asian Conference on Computer Vision (ACCV 2020), pp. 388–402
ISSN
0302-9743
DOI
10.1007/978-3-030-69535-4_24
URI
http://hdl.handle.net/10203/288409
Appears in Collection
GT-Conference Papers (Conference Papers)
