Enhanced monocular SLAM via complementary sensor fusion in visibility varying environment

Abstract
For autonomous navigation of vehicles and robots, three elements are essential: sensing, mapping, and driving policy. These elements are ultimately realized through robot perception and action technologies, and simultaneous localization and mapping (SLAM) is a prerequisite perception capability for autonomous robot navigation. Various sensors are used for SLAM, with cameras and range sensors such as LiDAR being the most common. However, monocular vision-based methods can hardly estimate the scale of the real environment accurately, and LiDAR-based methods can only operate reliably in environments with distinctive geometric features. Moreover, both sensors degrade significantly in adverse weather such as snow and rain, or in dense fog such as fire scenes. In this thesis, we propose visual SLAM using multimodal sensors. The proposed methodology tightly couples LiDAR with an imaging sensor that provides visual observation of the scene, thereby addressing both the scale ambiguity of monocular vision and the geometric ambiguity of LiDAR. For all-day visual SLAM that is invariant to lighting conditions, the methodology is extended to thermal-infrared cameras. We also propose a SLAM method that combines a thermal-infrared camera and radar to overcome dense fog environments in which LiDAR cannot operate.

First, we propose a multimodal sensor-based visual SLAM methodology using a single camera and a 3D LiDAR. Conventional monocular visual SLAM methodologies introduce environmental assumptions as constraints to resolve the scale problem, because the scale of the actual environment is underdetermined. However, these constraints must be modified whenever the platform or environment changes. Likewise, 3D LiDAR-based methodologies struggle in environments with poor geometric structure, such as long corridors or wide roads, and finding correspondences is not straightforward for the sparse range measurements of LiDAR. In this study, we propose a technique that performs visual SLAM by tracking the range measurements provided by LiDAR on the image plane.

Second, we extend the camera-LiDAR visual SLAM methodology to thermal-infrared cameras and propose a visual SLAM methodology that is robust to changing light conditions. Generally, a visible-light camera fails to provide rich information at night due to an insufficient amount of light. Since thermal-infrared cameras detect infrared wavelengths, they can be used regardless of day or night; still, distortion and noise arise from diffraction, and image bias is caused by temporal differences. This study presents a visual SLAM methodology that exploits the advantages of the thermal-infrared camera to remain robust against lighting changes regardless of day or night.

Finally, a SLAM methodology using radar and a thermal-infrared camera is presented to overcome indoor dense-fog environments that LiDAR cannot penetrate. The radar is used to extract ego-motion for robot navigation, while the thermal-infrared camera detects structural lines, groups of orthogonal line segments, in low-texture indoor environments. The absolute orientation determined by these structural lines is then used for SLAM. The measurements provided by the thermal-infrared camera and the radar are incorporated as factors in a graph SLAM formulation.
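The two sketches below are illustrative only; they are not the thesis implementation, and all function and variable names (project_lidar_to_image, T_cam_lidar, fuse_yaw, and so on) are assumptions introduced here.

A minimal sketch of the idea behind the first contribution, tracking LiDAR range measurements on the image plane to resolve monocular scale: project the 3D LiDAR points through an assumed known camera-LiDAR extrinsic and pinhole intrinsics, then attach a metric depth to a tracked image feature from the nearest projected point.

```python
import numpy as np

def project_lidar_to_image(points_lidar, T_cam_lidar, K, image_size):
    """Project 3D LiDAR points onto the camera image plane.

    points_lidar : (N, 3) points in the LiDAR frame
    T_cam_lidar  : (4, 4) extrinsic transform, LiDAR frame -> camera frame
    K            : (3, 3) pinhole intrinsic matrix
    image_size   : (width, height) in pixels
    Returns (M, 2) pixel coordinates and (M,) metric depths for points
    that land inside the image with positive depth.
    """
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]          # points in the camera frame
    pts_cam = pts_cam[pts_cam[:, 2] > 0.1]              # drop points behind the camera
    uv_h = (K @ pts_cam.T).T                            # perspective projection
    uv = uv_h[:, :2] / uv_h[:, 2:3]
    w, h = image_size
    inside = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    return uv[inside], pts_cam[inside, 2]

def depth_for_feature(feature_uv, lidar_uv, lidar_depth, radius_px=3.0):
    """Give a tracked image feature a metric depth from the nearest
    projected LiDAR point, if one lies within a small pixel radius."""
    if len(lidar_uv) == 0:
        return None
    d = np.linalg.norm(lidar_uv - np.asarray(feature_uv), axis=1)
    i = int(np.argmin(d))
    return float(lidar_depth[i]) if d[i] < radius_px else None   # None: no range support
```

A toy, yaw-only illustration of the third contribution's graph formulation: radar ego-motion supplies relative rotation constraints between consecutive poses, and structural lines from the thermal-infrared camera supply occasional absolute orientation constraints; weighted linear least squares fuses the two. The thesis operates on full poses in a factor graph; this 1-DoF version only shows how absolute orientation anchors drift-prone relative measurements.

```python
import numpy as np

def fuse_yaw(rel_yaw, abs_yaw, w_rel=1.0, w_abs=10.0):
    """Fuse relative yaw increments (radar odometry) with absolute yaw
    observations (structural lines) by weighted linear least squares.

    rel_yaw : list of (i, j, dtheta) meaning theta_j - theta_i = dtheta
    abs_yaw : list of (i, theta)     meaning theta_i = theta
    Returns the estimated yaw for every pose index.
    """
    nodes = [max(i, j) for i, j, _ in rel_yaw] + [i for i, _ in abs_yaw]
    n = max(nodes) + 1
    rows, rhs = [], []
    for i, j, dtheta in rel_yaw:                 # relative constraint from radar
        row = np.zeros(n); row[i], row[j] = -w_rel, w_rel
        rows.append(row); rhs.append(w_rel * dtheta)
    for i, theta in abs_yaw:                     # absolute constraint from structural lines
        row = np.zeros(n); row[i] = w_abs
        rows.append(row); rhs.append(w_abs * theta)
    est, *_ = np.linalg.lstsq(np.asarray(rows), np.asarray(rhs), rcond=None)
    return est

# Example: drifting radar increments corrected by two absolute yaw fixes.
yaw = fuse_yaw(rel_yaw=[(0, 1, 0.10), (1, 2, 0.12)],
               abs_yaw=[(0, 0.0), (2, 0.20)])
```

The design point both sketches share is complementarity: one sensor supplies what the other is missing (metric range for a scale-free camera, absolute orientation for drift-prone radar odometry), which is the fusion principle described in the abstract.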
Advisors
Kim, Ayoung (김아영)
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
2020
Identifier
325007
Language
eng
Description

Thesis (Ph.D.) - Korea Advanced Institute of Science and Technology : Department of Civil and Environmental Engineering, 2020.2, viii, 79 p.

Keywords

Visual SLAM; Localization; LiDAR; Thermal-infrared camera; SLAM

URI
http://hdl.handle.net/10203/283617
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=908375&flag=dissertation
Appears in Collection
CE-Theses_Ph.D. (Doctoral Dissertations)
Files in This Item
There are no files associated with this item.
