Radar-based navigation for autonomous driving in harsh environment
극한 환경에서의 자율 주행을 위한 레이더 기반 항법

DC Field / Value / Language
dc.contributor.advisorRyu, Jee-Hwan-
dc.contributor.advisor유지환-
dc.contributor.authorPark, Yeong Sang-
dc.date.accessioned2023-06-21T19:32:51Z-
dc.date.available2023-06-21T19:32:51Z-
dc.date.issued2022-
dc.identifier.urihttp://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=996228&flag=dissertationen_US
dc.identifier.urihttp://hdl.handle.net/10203/307773-
dc.descriptionThesis (Ph.D.) - 한국과학기술원 : 건설및환경공학과, 2022.2, [vii, 67 p.]-
dc.description.abstractAchieving general motion estimation in harsh environments has been a key challenge in robotics. LiDAR and camera sensors, the sensors most widely used in robotics, provide high accuracy and rich information but are not robust in harsh environments. Unreliable measurements cannot be used, since they introduce significant errors into pose estimation. Radar sensors, in contrast, measure robustly in harsh environments and are widely used in defense, maritime, weather-observation, and automotive applications. In this thesis, we propose pose-estimation methods using radar sensors and demonstrate their superior performance through comparisons with existing sensors in harsh environments. The technical summary of this dissertation is as follows. First, we propose a localization and mapping algorithm that leverages a radar system in low-visibility environments. We aim to address disaster situations in which prior knowledge of a place is available from CAD or LiDAR maps, but incoming visibility is severely limited. In smoky environments, typical sensors (e.g., cameras and LiDARs) fail to perform reliably due to the large particles in the air. Radars have recently attracted attention for their robust perception in low-visibility environments-
dc.description.abstracthowever, the angular ambiguity and low resolution of radar measurements have prevented their direct application to the SLAM framework. In this thesis, we propose registering radar measurements against a previously built dense LiDAR map for localization, and applying radar-map refinement for mapping. Our proposed method overcomes the significant density discrepancy between LiDAR and radar with a density-independent point-registration algorithm. We validate the proposed method in an environment containing dense fog. Second, we estimate odometry from the output of a high-performance 2D scanning radar. Previous scanning-radar-based odometry methods are mostly feature-based-
dc.description.abstractthey detect and match salient features within a radar image. Differing from existing feature-based methods, this thesis proposes a direct radar odometry method, PhaRaO, which infers relative motion from a pair of radar scans via phase correlation. Specifically, we apply the Fourier-Mellin transform (FMT) to Cartesian and log-polar radar images to sequentially estimate rotation and translation. In doing so, we decouple the rotation and translation estimates in a coarse-to-fine manner to achieve real-time performance. The proposed method is evaluated using large-scale radar data obtained from various environments. The inferred trajectory yields a 2.34% (translation) and 2.93 degree (rotation) Relative Error (RE) on average over a 4 km path length for the odometry estimation. Finally, we devised a custom sensor system for 3D velocity estimation by compositing two orthogonal automotive radar sensors. By perceiving beyond the visible spectrum, the proposed radar system measures velocity under all visibility conditions, regardless of the structure of the environment. Furthermore, we introduce a novel radar instant-velocity factor for a pose-graph SLAM framework and solve for 3D ego-motion by integrating it with an IMU. The validation reveals that the proposed method can estimate general 3D motion both indoors and outdoors, regardless of the visibility and structure of the environment. Additionally, building on these results, a radar-LiDAR fusion SLAM operating in unstructured environments is also proposed. Performance is improved by performing LiDAR mapping with the 3D ego-motion obtained from the radar sensor as an initial value, and loop closing is used to correct the drift error. We show that the proposed method can generate maps and trajectories with minimal drift error even in large-scale unstructured environments.-
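The phase-correlation core of the direct odometry described above can be illustrated with a minimal numpy sketch. This is not the PhaRaO implementation (which applies the full FMT to real radar scans to recover rotation from log-polar images as well); it only shows how the normalized cross-power spectrum of two images yields their relative translation as a correlation peak, on a synthetic image with a known shift.

```python
import numpy as np

def phase_correlation(img_a, img_b):
    """Estimate the (row, col) shift taking img_b to img_a via phase correlation."""
    Fa = np.fft.fft2(img_a)
    Fb = np.fft.fft2(img_b)
    cross = Fa * np.conj(Fb)
    cross /= np.abs(cross) + 1e-12          # normalized cross-power spectrum
    corr = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap peaks past half the image size around to negative shifts
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

# Synthetic "radar image": random intensities, shifted circularly by (5, -3).
rng = np.random.default_rng(0)
img = rng.random((128, 128))
shifted = np.roll(img, shift=(5, -3), axis=(0, 1))
print(phase_correlation(shifted, img))  # → (5, -3)
```

In the coarse-to-fine scheme the abstract describes, rotation would first be estimated the same way on log-polar resampled magnitude spectra, the scan de-rotated, and then translation recovered as above.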
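The instantaneous ego-velocity idea behind the radar velocity factor can likewise be sketched in a few lines. Automotive radars report a Doppler (radial) speed per detection; for a static target in direction u, that speed is -u·v_ego, so stacking many detections gives a linear least-squares problem in the 3D velocity. The function and data below are a hypothetical standalone sketch, not the thesis' pose-graph factor, and assume noise-free unit direction vectors.

```python
import numpy as np

def ego_velocity(directions, radial_speeds):
    """Least-squares 3D ego-velocity from Doppler returns of static targets.

    directions:    (N, 3) unit vectors from the sensor to each detection
    radial_speeds: (N,)   measured radial velocities (positive = receding)
    For a static target, v_r = -u . v_ego, so we solve  -U v = r.
    """
    U = np.asarray(directions, dtype=float)
    r = np.asarray(radial_speeds, dtype=float)
    v, *_ = np.linalg.lstsq(-U, r, rcond=None)
    return v

# Hypothetical detections with enough angular diversity (as two orthogonal
# radars would provide) around a known ego-velocity.
rng = np.random.default_rng(1)
v_true = np.array([1.5, -0.4, 0.2])
dirs = rng.normal(size=(40, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
meas = -dirs @ v_true + rng.normal(scale=1e-3, size=40)  # small Doppler noise
v_est = ego_velocity(dirs, meas)
print(np.round(v_est, 2))
```

A single planar radar leaves the out-of-plane component unobservable, which is presumably why the thesis composites two orthogonal sensors before feeding the velocity into the pose graph alongside the IMU.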
dc.languageeng-
dc.publisher한국과학기술원-
dc.titleRadar-based navigation for autonomous driving in harsh environment-
dc.title.alternative극한 환경에서의 자율 주행을 위한 레이더 기반 항법-
dc.typeThesis(Ph.D)-
dc.identifier.CNRN325007-
dc.description.department한국과학기술원 :건설및환경공학과,-
dc.contributor.alternativeauthor박영상-
Appears in Collection
CE-Theses_Ph.D.(박사논문)
Files in This Item
There are no files associated with this item.
