Bird's eye view localization of surrounding vehicles: Longitudinal and lateral distance estimation with partial appearance

Cited 3 times in Web of Science · Cited 3 times in Scopus
  • Hits: 807
  • Downloads: 0
DC Field | Value | Language
dc.contributor.author | Lee, Elijah S. | ko
dc.contributor.author | Choi, Wongun | ko
dc.contributor.author | Kum, Dongsuk | ko
dc.date.accessioned | 2019-03-19T01:05:09Z | -
dc.date.available | 2019-03-19T01:05:09Z | -
dc.date.created | 2019-02-18 | -
dc.date.issued | 2019-02 | -
dc.identifier.citation | ROBOTICS AND AUTONOMOUS SYSTEMS, v.112, pp.178 - 189 | -
dc.identifier.issn | 0921-8890 | -
dc.identifier.uri | http://hdl.handle.net/10203/251501 | -
dc.description.abstract | On-road vehicle detection is essential for perceiving the driving environment, and localizing detected vehicles helps drivers predict possible risks and avoid collisions. However, there is little prior work on vehicle detection with partial appearance, and localization of partially visible vehicles has not been explored. In this paper, a novel framework for vehicle detection and localization with partial appearance is proposed using stereo vision and geometry. First, the original images from the stereo camera are processed to form a v-disparity map. After object detection using v-disparity, vehicle candidates are generated with prior knowledge of possible vehicle locations in the image. Deep learning-based verification completes vehicle detection. For each detected vehicle, a partially visible vehicle tracking algorithm is newly introduced. To track partially visible vehicles, this algorithm detects the vehicle edge on the ground, defined as the grounded edge, and then selects a reference point for Kalman filter tracking. Finally, a rectangular box is drawn on the bird's eye view to represent the vehicle's longitudinal and lateral location. The proposed system successfully performs partially visible vehicle detection and tracking. To test localization performance, datasets from a highway and an urban setting are used, yielding standard deviations of less than 1.5 m in longitudinal error and 0.4 m in lateral error. (C) 2018 Elsevier B.V. All rights reserved. | -
dc.language | English | -
dc.publisher | ELSEVIER SCIENCE BV | -
dc.title | Bird's eye view localization of surrounding vehicles: Longitudinal and lateral distance estimation with partial appearance | -
dc.type | Article | -
dc.identifier.wosid | 000457667300015 | -
dc.identifier.scopusid | 2-s2.0-85058032463 | -
dc.type.rims | ART | -
dc.citation.volume | 112 | -
dc.citation.beginningpage | 178 | -
dc.citation.endingpage | 189 | -
dc.citation.publicationname | ROBOTICS AND AUTONOMOUS SYSTEMS | -
dc.identifier.doi | 10.1016/j.robot.2018.11.008 | -
dc.contributor.localauthor | Kum, Dongsuk | -
dc.contributor.nonIdAuthor | Lee, Elijah S. | -
dc.contributor.nonIdAuthor | Choi, Wongun | -
dc.description.isOpenAccess | N | -
dc.type.journalArticle | Article | -
dc.subject.keywordAuthor | Distance estimation | -
dc.subject.keywordAuthor | Partial appearance | -
dc.subject.keywordAuthor | Vehicle detection | -
dc.subject.keywordAuthor | Grounded edge and reference point | -
dc.subject.keywordAuthor | Bird's eye view localization | -
dc.subject.keywordPlus | ROAD | -
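
The abstract above walks through a stereo pipeline: a v-disparity map for obstacle detection, vehicle candidate generation and deep-learning verification, grounded-edge and reference-point extraction, and Kalman filter tracking on the bird's eye view. As a rough illustration of two of those steps, the sketch below builds a v-disparity map from a dense disparity image and smooths a bird's-eye-view reference point with a constant-velocity Kalman filter. This is not the authors' implementation; all names, the motion model, and the noise settings are assumptions made here for illustration (the measurement noise merely echoes the reported 1.5 m / 0.4 m error figures).

```python
# Illustrative sketch only, not the authors' code. Assumes a dense
# disparity image from a calibrated stereo pair is already available.
import numpy as np


def v_disparity(disparity, max_disp=128):
    """Build a v-disparity map: one disparity histogram per image row.

    Rows of the result correspond to image rows (v); columns correspond to
    integer disparity values. The road surface projects to a slanted line,
    while upright obstacles such as vehicles form near-vertical segments,
    which is what the candidate-generation step exploits.
    """
    h, _ = disparity.shape
    d = np.clip(disparity.astype(np.int32), 0, max_disp - 1)
    vdisp = np.zeros((h, max_disp), dtype=np.int32)
    for v in range(h):
        vdisp[v] = np.bincount(d[v], minlength=max_disp)
    return vdisp


class ReferencePointKF:
    """Constant-velocity Kalman filter for one bird's-eye-view point.

    State: [x, y, vx, vy], with x the longitudinal and y the lateral
    distance in metres. The measurement is the (x, y) position of the
    grounded-edge reference point recovered from stereo geometry.
    """

    def __init__(self, x0, y0, dt=0.1):
        self.x = np.array([x0, y0, 0.0, 0.0])      # state estimate
        self.P = np.eye(4) * 10.0                  # state covariance
        self.F = np.eye(4)                         # constant-velocity model
        self.F[0, 2] = dt
        self.F[1, 3] = dt
        self.H = np.zeros((2, 4))                  # measure position only
        self.H[0, 0] = 1.0
        self.H[1, 1] = 1.0
        self.Q = np.eye(4) * 0.05                  # process noise (assumed)
        self.R = np.diag([1.5 ** 2, 0.4 ** 2])     # measurement noise, echoing
                                                   # the reported error figures

    def step(self, z):
        # Predict with the constant-velocity model, then correct with the
        # measured reference point (x, y).
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        innovation = np.asarray(z, dtype=float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ innovation
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]                          # smoothed (x, y)


# Example: smooth noisy reference-point measurements of a nearby vehicle.
kf = ReferencePointKF(x0=20.0, y0=3.5)
for z in [(20.3, 3.4), (19.8, 3.6), (19.1, 3.5)]:
    print(kf.step(z))
```

A constant-velocity model is only the simplest reasonable choice for a sketch like this; the paper itself may use a different state or measurement model.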
Appears in Collection
GT-Journal Papers (Journal Papers)
Files in This Item
There are no files associated with this item.
