A direct visual servoing-based framework for the 2016 IROS Autonomous Drone Racing Challenge

Cited 41 times in Web of Science; cited 0 times in Scopus.
This paper presents a framework for navigating obstacle-dense environments, as posed in the 2016 International Conference on Intelligent Robots and Systems (IROS) Autonomous Drone Racing Challenge. Our framework is based on direct visual servoing and leg-by-leg planning to navigate a complex environment filled with many similar frame-shaped obstacles to fly through. Our indoor navigation method relies on velocity measurements from an optical flow sensor, since position measurements from GPS or external cameras are not available. For precise navigation through a sequence of obstacles, a center point–matching method is used together with depth information from the onboard stereo camera. Guidance points are generated directly in three-dimensional space from the two-dimensional image data to avoid accumulating error from sensor drift. The proposed framework is implemented on a quadrotor-based aerial vehicle that carries an onboard vision-processing computer for self-contained operation. Using the proposed method, our drone finished in first place in the inaugural IROS Autonomous Drone Racing Challenge.
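The abstract describes generating 3D guidance points directly from 2D image data using stereo depth. The core of such a step is the standard pinhole back-projection of a detected gate-center pixel into the camera frame; a minimal sketch is shown below, assuming a pinhole camera model with hypothetical intrinsics (the paper's actual detection and servoing pipeline is not reproduced here).

```python
# Hedged sketch: back-project a detected 2D gate-center pixel into a 3D
# guidance point in the camera frame, given metric depth from a stereo
# camera. The intrinsics (fx, fy, cx, cy) are illustrative placeholders,
# not values from the paper.

def backproject(u, v, depth, fx, fy, cx, cy):
    """Map pixel (u, v) with metric depth (m) to a 3D camera-frame point."""
    x = (u - cx) * depth / fx  # lateral offset from optical axis
    y = (v - cy) * depth / fy  # vertical offset from optical axis
    return (x, y, depth)       # z is the measured depth itself

# Example: gate center detected at pixel (380, 250) with 2.0 m stereo depth,
# using placeholder intrinsics for a 640x480 image.
point = backproject(380, 250, 2.0, fx=500.0, fy=500.0, cx=320.0, cy=240.0)
print(point)  # → (0.24, 0.04, 2.0)
```

Because the 3D target is recomputed from each fresh image and depth measurement, the resulting guidance point does not inherit drift accumulated by the velocity-based odometry, which matches the motivation stated in the abstract.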
Publisher
WILEY-BLACKWELL
Issue Date
2018-01
Language
English
Article Type
Article
Citation

JOURNAL OF FIELD ROBOTICS, v.35, no.1, pp.146 - 166

ISSN
1556-4959
DOI
10.1002/rob.21743
URI
http://hdl.handle.net/10203/238752
Appears in Collection
EE-Journal Papers (Journal Papers)
Files in This Item
There are no files associated with this item.