Multi-unmanned Aerial Vehicle Pose Estimation Based on Visual-inertial-range Sensor Fusion

DC Field | Value | Language
dc.contributor.author | Choi, Junho | ko
dc.contributor.author | Marsim, Kevin Christiansen | ko
dc.contributor.author | Jeong, Myeongwoo | ko
dc.contributor.author | Ryoo, Kihwan | ko
dc.contributor.author | Kim, Jeewon | ko
dc.contributor.author | Myung, Hyun | ko
dc.date.accessioned | 2023-11-22T01:00:17Z | -
dc.date.available | 2023-11-22T01:00:17Z | -
dc.date.created | 2023-11-22 | -
dc.date.issued | 2023-11 | -
dc.identifier.citation | Journal of Institute of Control, Robotics and Systems, v.29, no.11, pp.859 - 865 | -
dc.identifier.issn | 1976-5622 | -
dc.identifier.uri | http://hdl.handle.net/10203/315014 | -
dc.description.abstract | Multi-robot state estimation is crucial for real-time and accurate operation, especially in complex environments where a global navigation satellite system cannot be used. Many researchers employ multiple sensor modalities, including cameras, LiDAR, and ultra-wideband (UWB), to achieve real-time state estimation. However, each sensor has specific requirements that might limit its usage. While LiDAR sensors demand a high payload capacity, camera sensors must have matching image features between robots, and UWB sensors require known fixed anchor locations for accurate positioning. This study introduces a robust localization system with a minimal sensor setup that eliminates the need for the previously mentioned requirements. We used an anchor-free UWB setup to establish a global coordinate system, unifying all robots. Each robot performs visual-inertial odometry to estimate its ego-motion in its local coordinate system. By optimizing the local odometry from each robot using inter-robot range measurements, the positions of the robots can be robustly estimated without relying on an extensive sensor setup or infrastructure. Our method offers a simple yet effective solution for achieving accurate and real-time multi-robot state estimation in challenging environments without relying on traditional sensor requirements. | -
dc.language | English | -
dc.publisher | Institute of Control, Robotics and Systems | -
dc.title | Multi-unmanned Aerial Vehicle Pose Estimation Based on Visual-inertial-range Sensor Fusion | -
dc.title.alternative | 영상-관성-거리 센서 융합 기반의 다중 무인기 위치 추정 | -
dc.type | Article | -
dc.identifier.scopusid | 2-s2.0-85175860572 | -
dc.type.rims | ART | -
dc.citation.volume | 29 | -
dc.citation.issue | 11 | -
dc.citation.beginningpage | 859 | -
dc.citation.endingpage | 865 | -
dc.citation.publicationname | Journal of Institute of Control, Robotics and Systems | -
dc.identifier.doi | 10.5302/j.icros.2023.23.0135 | -
dc.identifier.kciid | ART003013342 | -
dc.contributor.localauthor | Myung, Hyun | -
dc.description.isOpenAccess | N | -
dc.type.journalArticle | Article | -
dc.subject.keywordAuthor | multi-robot localization | -
dc.subject.keywordAuthor | relative pose estimation | -
dc.subject.keywordAuthor | sensor fusion | -
dc.subject.keywordAuthor | visual-inertial-range odometry | -
dc.subject.keywordAuthor | swarm robots | -
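
The abstract describes an anchor-free scheme: each UAV runs visual-inertial odometry (VIO) in its own local frame, and inter-robot UWB range measurements are used to optimize the local trajectories into one shared coordinate system. The sketch below is a hypothetical illustration of that idea, not the authors' implementation: it estimates a 4-DoF (x, y, z, yaw) offset between two robots' VIO frames by least-squares on range residuals, assuming roll and pitch are already observable from the IMU's gravity direction. The function names, the synthetic data, and the SciPy-based solver are all illustrative assumptions.

```python
# Minimal sketch (assumed, not the paper's code): align two UAVs' VIO
# trajectories into a shared frame using only inter-robot UWB ranges.
# Roll/pitch are gravity-observable from the IMU, so the unknown
# frame-to-frame offset is 4-DoF: translation (tx, ty, tz) plus yaw.
import numpy as np
from scipy.optimize import least_squares

def range_residuals(params, p1, p2_local, ranges):
    """Predicted minus measured inter-robot distances.

    params:   [tx, ty, tz, yaw] mapping robot 2's local frame into robot 1's.
    p1:       (N, 3) robot 1 VIO positions in its own frame.
    p2_local: (N, 3) robot 2 VIO positions in its own frame.
    ranges:   (N,) UWB range measurements at matching timestamps.
    """
    tx, ty, tz, yaw = params
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    p2_in_1 = p2_local @ R.T + np.array([tx, ty, tz])
    return np.linalg.norm(p1 - p2_in_1, axis=1) - ranges

# Synthetic example with a known ground-truth frame offset.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 200)
p1 = np.stack([np.cos(t), np.sin(t), 0.1 * t], axis=1)   # robot 1's VIO path
yaw_true, t_true = 0.8, np.array([3.0, -1.0, 0.5])
c, s = np.cos(yaw_true), np.sin(yaw_true)
R_true = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
p2_world = np.stack([0.5 * np.sin(2.0 * t), 0.5 * np.cos(t), 0.05 * t], axis=1)
p2_local = (p2_world - t_true) @ R_true                  # robot 2's own frame
ranges = np.linalg.norm(p1 - p2_world, axis=1) + 0.02 * rng.standard_normal(t.size)

# Range-only alignment is nonconvex: it needs excited (non-collinear) motion
# and a coarse initial guess, e.g. from the first few range measurements.
x0 = np.array([2.0, 0.0, 0.0, 0.5])
sol = least_squares(range_residuals, x0, args=(p1, p2_local, ranges))
print("estimated [tx ty tz yaw]:", sol.x)  # expect roughly [3, -1, 0.5, 0.8]
```

In the multi-UAV setting the same residual structure (predicted inter-robot distance minus measured UWB range) would be stacked over all robot pairs, with one frame offset estimated per robot; this batch two-robot alignment is only meant to make the optimization in the abstract concrete.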
Appears in Collection
EE-Journal Papers (저널논문)
Files in This Item
There are no files associated with this item.
