Perception, Guidance and Navigation for Indoor Autonomous Drone Racing using Deep Learning

Cited 95 times in Web of Science; cited 0 times in Scopus
In autonomous drone racing, a drone must fly through a series of gates quickly and without collision, so reliable gate detection using computer vision is essential. However, complications such as varying lighting conditions and overlapping gates cause traditional image processing algorithms based on the color and geometry of the gates to fail during actual racing. In this letter, we introduce a convolutional neural network that robustly estimates the center of a gate. Using the detection results, we apply a line-of-sight guidance algorithm. The proposed algorithm is implemented on low-cost, off-the-shelf hardware for validation, and all vision processing runs in real time on an onboard NVIDIA Jetson TX2 embedded computer. In a number of tests, the proposed framework exhibited fast and reliable detection and navigation performance in indoor environments.
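As a rough illustration of how a detected gate center can drive line-of-sight style steering, the sketch below maps a pixel location to proportional yaw and climb commands. It is not the authors' implementation; the camera intrinsics, gains, speed, and function names are all illustrative assumptions, and the gate-center pixel would come from the paper's CNN detector.

```python
# Minimal line-of-sight guidance sketch from a detected gate center.
# All names, intrinsics, and gains below are illustrative assumptions,
# not the method described in the paper.
import math

# Assumed pinhole camera intrinsics for a 640x480 frame (hypothetical values).
FX, FY = 400.0, 400.0        # focal lengths in pixels
CX, CY = 320.0, 240.0        # principal point

K_YAW = 1.0                  # assumed proportional gain on horizontal bearing
K_CLIMB = 0.5                # assumed proportional gain on vertical bearing
FORWARD_SPEED = 2.0          # assumed constant forward speed, m/s


def los_guidance(u: float, v: float):
    """Map a gate-center pixel (u, v) to (forward_speed, yaw_rate, climb_rate).

    The bearing angles to the gate center are recovered from the pinhole
    model; simple proportional commands then steer the line of sight onto
    the gate while holding a constant forward speed.
    """
    bearing_yaw = math.atan2(u - CX, FX)     # horizontal angle to gate, rad
    bearing_pitch = math.atan2(CY - v, FY)   # vertical angle to gate, rad

    yaw_rate = K_YAW * bearing_yaw           # turn toward the gate
    climb_rate = K_CLIMB * bearing_pitch     # climb/descend toward the gate
    return FORWARD_SPEED, yaw_rate, climb_rate


if __name__ == "__main__":
    # Example: gate detected slightly right of and above the image center.
    print(los_guidance(u=380.0, v=200.0))
```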
Publisher
Institute of Electrical and Electronics Engineers (IEEE)
Issue Date
2018-07
Language
English
Article Type
Article
Citation

IEEE Robotics and Automation Letters, vol. 3, no. 3, pp. 2539-2544

ISSN
2377-3766
DOI
10.1109/LRA.2018.2808368
URI
http://hdl.handle.net/10203/242172
Appears in Collection
EE-Journal Papers (Journal Papers)
Files in This Item
There are no files associated with this item.