DC Field | Value | Language |
---|---|---|
dc.contributor.author | Choi, Junho | ko |
dc.contributor.author | Marsim, Kevin Christiansen | ko |
dc.contributor.author | Jeong, Myeongwoo | ko |
dc.contributor.author | Ryoo, Kihwan | ko |
dc.contributor.author | Kim, Jeewon | ko |
dc.contributor.author | Myung, Hyun | ko |
dc.date.accessioned | 2023-11-22T01:00:17Z | - |
dc.date.available | 2023-11-22T01:00:17Z | - |
dc.date.created | 2023-11-22 | - |
dc.date.issued | 2023-11 | - |
dc.identifier.citation | Journal of Institute of Control, Robotics and Systems, v.29, no.11, pp.859 - 865 | - |
dc.identifier.issn | 1976-5622 | - |
dc.identifier.uri | http://hdl.handle.net/10203/315014 | - |
dc.description.abstract | Multi-robot state estimation is crucial for real-time and accurate operation, especially in complex environments where a global navigation satellite system cannot be used. Many researchers employ multiple sensor modalities, including cameras, LiDAR, and ultra-wideband (UWB), to achieve real-time state estimation. However, each sensor has specific requirements that might limit its usage. While LiDAR sensors demand a high payload capacity, camera sensors must have matching image features between robots, and UWB sensors require known fixed anchor locations for accurate positioning. This study introduces a robust localization system with a minimal sensor setup that eliminates the need for the previously mentioned requirements. We used an anchor-free UWB setup to establish a global coordinate system, unifying all robots. Each robot performs visual-inertial odometry to estimate its ego-motion in its local coordinate system. By optimizing the local odometry from each robot using inter-robot range measurements, the positions of the robots can be robustly estimated without relying on an extensive sensor setup or infrastructure. Our method offers a simple yet effective solution for achieving accurate and real-time multi-robot state estimation in challenging environments without relying on traditional sensor requirements. | - |
dc.language | English | - |
dc.publisher | Institute of Control, Robotics and Systems | - |
dc.title | Multi-unmanned Aerial Vehicle Pose Estimation Based on Visual-inertial-range Sensor Fusion | - |
dc.title.alternative | 영상-관성-거리 센서 융합 기반의 다중 무인기 위치 추정 (Multi-UAV Position Estimation Based on Visual-Inertial-Range Sensor Fusion) | ko |
dc.type | Article | - |
dc.identifier.scopusid | 2-s2.0-85175860572 | - |
dc.type.rims | ART | - |
dc.citation.volume | 29 | - |
dc.citation.issue | 11 | - |
dc.citation.beginningpage | 859 | - |
dc.citation.endingpage | 865 | - |
dc.citation.publicationname | Journal of Institute of Control, Robotics and Systems | - |
dc.identifier.doi | 10.5302/j.icros.2023.23.0135 | - |
dc.identifier.kciid | ART003013342 | - |
dc.contributor.localauthor | Myung, Hyun | - |
dc.description.isOpenAccess | N | - |
dc.type.journalArticle | Article | - |
dc.subject.keywordAuthor | multi-robot localization | - |
dc.subject.keywordAuthor | relative pose estimation | - |
dc.subject.keywordAuthor | sensor fusion | - |
dc.subject.keywordAuthor | visual-inertial-range odometry | - |
dc.subject.keywordAuthor | swarm robots | - |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
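The abstract describes optimizing each robot's local visual-inertial odometry with anchor-free inter-robot UWB range measurements to express all robots in one global frame. As a minimal sketch of that idea (not the paper's implementation), the example below aligns one robot's local 2-D VIO trajectory to another robot's frame using only the pairwise ranges, solved by nonlinear least squares. The trajectories, the (theta, tx, ty) transform parameterization, and the initial guess are all synthetic assumptions for illustration.

```python
# Sketch: recover the rigid transform from robot B's local VIO frame into
# robot A's frame using only inter-robot UWB range measurements.
import numpy as np
from scipy.optimize import least_squares

# Synthetic trajectories (20 timesteps). traj_a is robot A's VIO output in
# its own frame; traj_b_world is robot B's true path expressed in A's frame.
t = np.linspace(0.0, 1.0, 20)
traj_a = np.stack([t, np.zeros_like(t)], axis=1)
traj_b_world = np.stack([1.0 + t, 0.5 * t], axis=1)

# Ground-truth transform from B's local frame to A's frame (to be recovered).
theta_gt, tx_gt, ty_gt = 0.3, 1.0, -0.5
R_gt = np.array([[np.cos(theta_gt), -np.sin(theta_gt)],
                 [np.sin(theta_gt),  np.cos(theta_gt)]])
# B's own VIO trajectory: the true path mapped back into B's local frame.
traj_b_local = (traj_b_world - np.array([tx_gt, ty_gt])) @ R_gt

# UWB range between the two robots at each timestep (noise-free here).
ranges = np.linalg.norm(traj_b_world - traj_a, axis=1)

def residuals(params):
    """Predicted-minus-measured range for a candidate (theta, tx, ty)."""
    theta, tx, ty = params
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    b_in_a = traj_b_local @ R.T + np.array([tx, ty])
    return np.linalg.norm(b_in_a - traj_a, axis=1) - ranges

# Initial guess chosen near the truth; range-only alignment has mirror
# ambiguities, so a poor initialization may converge to a reflected solution.
sol = least_squares(residuals, x0=np.array([0.2, 0.8, -0.3]))
print(sol.x)  # recovered (theta, tx, ty), close to (0.3, 1.0, -0.5)
```

In a real multi-robot system this optimization would run over a sliding window of odometry poses and ranges from all robot pairs, but the residual structure — predicted inter-robot distance minus UWB measurement — is the same.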