Vision and LIDAR-based autonomous navigation system for indoor and outdoor flight of unmanned aerial vehicles

The Global Positioning System (GPS) is widely used to aid the navigation of aerial vehicles, but GPS signals are unavailable indoors. Indoor environments are also often cluttered with obstacles, so collisions can be dangerous. Alternative navigation methods therefore need to be developed for unmanned aerial vehicles (UAVs) flying in GPS-denied environments: an autonomous navigation system is required that provides a six-degree-of-freedom (6-DOF) pose and a three-dimensional (3-D) map. In this dissertation, a vision and scanning Light Detection and Ranging (LIDAR)-based autonomous navigation system for indoor and outdoor flight of UAVs is proposed.

First, an integrated navigation sensor module comprising a camera, a LIDAR, and an inertial measurement unit (IMU) was developed to enable UAVs to fly both indoors and outdoors. To calibrate the camera and the gimbaled LIDAR, a new method that uses a simple visual marker is proposed. The camera and the gimbaled LIDAR work in a complementary manner: feature points extracted from the camera images are merged with the LIDAR range measurements for state estimation. The features are processed by an online Extended Kalman Filter-Simultaneous Localization and Mapping (EKF-SLAM) algorithm to estimate the navigational states of the vehicle. The sensor module and the EKF-SLAM algorithm were implemented and tested on an octo-rotor UAV platform. The results show that the navigation module can provide a real-time 3-D navigation solution without prior information about the surroundings.

Second, a real-time 3-D indoor navigation system and closed-loop control of a quad-rotor aerial vehicle equipped with an IMU and a low-cost LIDAR are presented. To estimate the pose of the vehicle, an octree-based grid map and Monte Carlo Localization (MCL) are adopted. The navigation results obtained with the MCL are evaluated by comparison with a motion capture system. Finally, the results are ...
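To illustrate the fusion scheme summarized above, the following is a minimal sketch (not the dissertation's implementation) of an EKF-SLAM step in which a camera bearing and a LIDAR range to a single landmark are fused into a joint pose-landmark state. It assumes a planar (2-D) simplification with a unicycle motion model; the function names, the 5-element state layout [x, y, yaw, lx, ly], and the numerical values in the example are hypothetical.

```python
import numpy as np

# Minimal 2-D EKF-SLAM sketch with one landmark.  The state vector is
# [x, y, yaw, lx, ly]: vehicle pose plus landmark position.  A camera
# bearing and a LIDAR range to the landmark are fused in one update step.

def wrap(a):
    """Wrap an angle to [-pi, pi)."""
    return (a + np.pi) % (2.0 * np.pi) - np.pi

def predict(x, P, v, w, dt, Q):
    """Propagate the pose with a unicycle motion model (IMU/odometry-driven)."""
    x = x.copy()
    th = x[2]
    x[0] += v * dt * np.cos(th)
    x[1] += v * dt * np.sin(th)
    x[2] = wrap(th + w * dt)
    F = np.eye(5)                      # Jacobian of the motion model
    F[0, 2] = -v * dt * np.sin(th)
    F[1, 2] =  v * dt * np.cos(th)
    return x, F @ P @ F.T + Q          # Q adds process noise to the pose block

def update(x, P, z, R):
    """Fuse a measurement z = [range, bearing] of the landmark at x[3:5]."""
    dx, dy = x[3] - x[0], x[4] - x[1]
    q = dx * dx + dy * dy
    r = np.sqrt(q)
    h = np.array([r, wrap(np.arctan2(dy, dx) - x[2])])   # predicted measurement
    H = np.array([                                       # measurement Jacobian
        [-dx / r, -dy / r,  0.0,  dx / r,  dy / r],
        [ dy / q, -dx / q, -1.0, -dy / q,  dx / q],
    ])
    y = z - h
    y[1] = wrap(y[1])                  # keep the bearing innovation bounded
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ y
    x[2] = wrap(x[2])
    P = (np.eye(5) - K @ H) @ P
    return x, P

# Example with made-up numbers: one prediction step and one fused update.
x = np.array([0.0, 0.0, 0.0, 2.0, 1.0])        # pose and rough landmark guess
P = np.diag([0.01, 0.01, 0.01, 1.0, 1.0])      # landmark poorly known at first
Q = np.diag([0.02, 0.02, 0.01, 0.0, 0.0])      # process noise on the pose only
R = np.diag([0.05 ** 2, np.deg2rad(2.0) ** 2])  # range / bearing measurement noise
x, P = predict(x, P, v=0.5, w=0.1, dt=0.1, Q=Q)
x, P = update(x, P, z=np.array([2.2, 0.45]), R=R)
print(np.round(x, 3))
```

The dissertation's system estimates a full 6-DOF pose with many landmarks maintained online (and, in the second part, MCL over an octree grid map), but the predict/update structure of the filter follows the same pattern as this simplified sketch.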
Advisors
Shim, Hyun-Chul (심현철)
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
2014
Identifier
591870/325007  / 020095180
Language
eng
Description
Thesis (Ph.D.) - Korea Advanced Institute of Science and Technology (KAIST), Department of Aerospace Engineering, 2014.8, [vii, 65 p.]

Keywords

Autonomous Navigation System; Unmanned Aerial Vehicles; Vision-LIDAR Sensor Fusion; EKF-SLAM; Monte Carlo Localization; Indoor 3-D Navigation

URI
http://hdl.handle.net/10203/196189
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=591870&flag=dissertation
Appears in Collection
EE-Theses_Ph.D. (Doctoral Theses)
Files in This Item
There are no files associated with this item.
