Confidence Analysis of Feature Points for Visual-Inertial Odometry of Urban Vehicles

Cited 2 times in Web of Science; cited 3 times in Scopus
  • Hits: 564
  • Downloads: 0
DC Field: Value (language)
dc.contributor.author: Lee, Chang-Ryeol (ko)
dc.contributor.author: Yoon, Kuk-Jin (ko)
dc.date.accessioned: 2019-07-18T05:31:29Z
dc.date.available: 2019-07-18T05:31:29Z
dc.date.created: 2019-03-08
dc.date.issued: 2019-07
dc.identifier.citation: IET INTELLIGENT TRANSPORT SYSTEMS, v.13, no.7, pp.1130 - 1138
dc.identifier.issn: 1751-9578
dc.identifier.uri: http://hdl.handle.net/10203/263299
dc.description.abstract: Visual-inertial odometry (VIO) is the process of estimating ego-motion using a camera and an inertial measurement unit (IMU). It shows outstanding performance in estimating the ego-motion of a vehicle at the absolute scale thanks to the gyroscope and the accelerometer. However, VIO has some difficulties in estimating the translation in large-scale outdoor environments, because feature points along the motion direction and distant feature points in the images can cause degenerate situations. To resolve these difficulties, the authors propose to infer confidence measures for the feature points and appropriately incorporate them into the Kalman filter-based VIO. The confidence is computed from the motion direction and the displacements of the tracked feature points under the authors' urban canyon prior, and it is applied to the measurement noise covariance of the Kalman filter for ego-motion estimation in situations where the camera is moving forward. Experimental results on the public KITTI dataset show that VIO outperforms monocular and stereo-visual odometries, and the proposed VIO with confidence analysis achieves a translation error of 1.82% and a rotation error of 0.0018 deg/m.
dc.language: English
dc.publisher: INST ENGINEERING TECHNOLOGY-IET
dc.title: Confidence Analysis of Feature Points for Visual-Inertial Odometry of Urban Vehicles
dc.type: Article
dc.identifier.wosid: 000473239700008
dc.identifier.scopusid: 2-s2.0-85068319501
dc.type.rims: ART
dc.citation.volume: 13
dc.citation.issue: 7
dc.citation.beginningpage: 1130
dc.citation.endingpage: 1138
dc.citation.publicationname: IET INTELLIGENT TRANSPORT SYSTEMS
dc.identifier.doi: 10.1049/iet-its.2018.5196
dc.contributor.localauthor: Yoon, Kuk-Jin
dc.contributor.nonIdAuthor: Lee, Chang-Ryeol
dc.description.isOpenAccess: Y
dc.type.journalArticle: Article
dc.subject.keywordAuthor: motion estimation; gyroscopes; inertial navigation; mobile robots; cameras; stereo image processing; distance measurement; robot vision; nonlinear filters; Kalman filters; confidence analysis; visual-inertial odometry; urban vehicles; inertial measurement unit; absolute scale thanks; large-scale outdoor environments; motion direction; distant feature points; confidence measures; Kalman filter-based VIO; tracked feature points; authors; measurement noise covariance; ego-motion estimation; stereo-visual odometries
dc.subject.keywordPlus: ABSOLUTE SCALE; VISION; FUSION; IMU
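The abstract's core idea is that a per-feature confidence scales the measurement noise covariance of the Kalman filter, so low-confidence features (e.g. distant points or points along the motion direction) pull the estimate less. A minimal one-dimensional sketch of that weighting scheme follows; the function names, the inverse-confidence rule `r = r0 / c`, and the scalar state are illustrative assumptions, not the paper's actual formulation.

```python
# Toy sketch: confidence-weighted Kalman measurement updates (1-D state).
# A feature's confidence c in (0, 1] inflates its measurement noise
# variance r = r0 / c, so low-confidence measurements are down-weighted.
# All names and the inverse-confidence rule are illustrative assumptions.

def kalman_update(x, p, z, r):
    """Scalar Kalman update: state x, variance p, measurement z, noise r."""
    k = p / (p + r)          # Kalman gain
    x = x + k * (z - x)      # corrected state
    p = (1.0 - k) * p        # corrected variance
    return x, p

def fuse_features(x, p, measurements, r0=1.0):
    """Sequentially fuse (measurement, confidence) pairs into the state."""
    for z, conf in measurements:
        r = r0 / max(conf, 1e-6)   # low confidence -> large noise variance
        x, p = kalman_update(x, p, z, r)
    return x, p

# A low-confidence outlier (z=10, c=0.05) barely moves the estimate,
# while a confident measurement (z=2, c=0.9) dominates the correction.
x, p = fuse_features(x=0.0, p=1.0, measurements=[(10.0, 0.05), (2.0, 0.9)])
```

The sequential-update form is equivalent to a batch update with a block-diagonal noise covariance, which is presumably how a per-feature weighting would enter a real VIO measurement model.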
Appears in Collection
ME-Journal Papers (저널논문, Journal Papers)
Files in This Item
There are no files associated with this item.