Sensor Fusion by Spatial Encoding for Autonomous Driving

DC Field | Value | Language
dc.contributor.author | Lai-Dang, Quoc-Vinh | ko
dc.contributor.author | Lee, Jihui | ko
dc.contributor.author | Park, Bumgeun | ko
dc.contributor.author | Har, Dongsoo | ko
dc.date.accessioned | 2023-12-20T07:00:11Z | -
dc.date.available | 2023-12-20T07:00:11Z | -
dc.date.created | 2023-11-30 | -
dc.date.issued | 2023-10-29 | -
dc.identifier.citation | IEEE SENSORS 2023 | -
dc.identifier.uri | http://hdl.handle.net/10203/316719 | -
dc.description.abstract | Sensor fusion is critical to perception systems for task domains such as autonomous driving and robotics. Recently, Transformers integrated with CNNs have demonstrated high performance in sensor fusion for various perception tasks. In this work, we introduce a method for fusing data from camera and LiDAR. By employing Transformer modules at multiple resolutions, the proposed method effectively combines local and global contextual relationships (an illustrative sketch of this scheme is given after the metadata record). Its performance is validated by extensive experiments on two adversarial benchmarks with lengthy routes and high-density traffic. The proposed method outperforms previous approaches on the most challenging benchmarks, achieving significantly higher driving and infraction scores. Compared with TransFuser, it achieves 8% and 19% improvements in driving score on the Longest6 and Town05 Long benchmarks, respectively. | -
dc.language | English | -
dc.publisher | IEEE Sensors Council | -
dc.title | Sensor Fusion by Spatial Encoding for Autonomous Driving | -
dc.type | Conference | -
dc.type.rims | CONF | -
dc.citation.publicationname | IEEE SENSORS 2023 | -
dc.identifier.conferencecountry | AT | -
dc.identifier.conferencelocation | Vienna | -
dc.contributor.localauthor | Har, Dongsoo | -
dc.contributor.nonIdAuthor | Lai-Dang, Quoc-Vinh | -
dc.contributor.nonIdAuthor | Lee, Jihui | -
dc.contributor.nonIdAuthor | Park, Bumgeun | -
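
The abstract describes fusing camera and LiDAR features with Transformer modules applied at multiple CNN resolutions, so that self-attention adds global cross-modal context on top of local convolutional features. The following is a minimal sketch of how such a fusion block might look, assuming PyTorch; the names (FusionBlock, cam, lidar) and the assumption that both feature maps share the same spatial size per stage are illustrative, not the authors' actual implementation.

```python
# Hedged sketch of multi-resolution camera-LiDAR fusion with self-attention.
# Assumes PyTorch; FusionBlock and all variable names are hypothetical.
import torch
import torch.nn as nn


class FusionBlock(nn.Module):
    """Fuses camera and LiDAR feature maps of one resolution via joint attention."""

    def __init__(self, channels: int, num_heads: int = 4):
        super().__init__()
        self.attn = nn.TransformerEncoderLayer(
            d_model=channels, nhead=num_heads, batch_first=True
        )

    def forward(self, cam: torch.Tensor, lidar: torch.Tensor):
        # Assumes cam and lidar feature maps have identical (B, C, H, W) shapes,
        # e.g. the LiDAR BEV features were resized to match the camera stream.
        b, c, h, w = cam.shape
        # Flatten each feature map into a token sequence: (B, H*W, C).
        cam_tok = cam.flatten(2).transpose(1, 2)
        lidar_tok = lidar.flatten(2).transpose(1, 2)
        # Attention over the concatenated token set lets every location
        # attend across both modalities, supplying global context.
        fused = self.attn(torch.cat([cam_tok, lidar_tok], dim=1))
        cam_tok, lidar_tok = fused[:, : h * w], fused[:, h * w :]
        # Residual connections preserve the local CNN features.
        cam = cam + cam_tok.transpose(1, 2).reshape(b, c, h, w)
        lidar = lidar + lidar_tok.transpose(1, 2).reshape(b, c, h, w)
        return cam, lidar


# Usage: apply one block per backbone stage, e.g. stride-8 and stride-16 maps.
if __name__ == "__main__":
    blocks = [FusionBlock(64), FusionBlock(128)]
    cam8, lidar8 = torch.randn(1, 64, 32, 32), torch.randn(1, 64, 32, 32)
    cam16, lidar16 = torch.randn(1, 128, 16, 16), torch.randn(1, 128, 16, 16)
    cam8, lidar8 = blocks[0](cam8, lidar8)
    cam16, lidar16 = blocks[1](cam16, lidar16)
```

Repeating the block at several resolutions is what the abstract credits with combining local and global contextual relationships: the residual path keeps fine-grained convolutional detail while the joint attention exchanges information between the two sensor streams at each scale.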
Appears in Collection
GT-Conference Papers (Conference Papers)
Files in This Item
There are no files associated with this item.
