Integrated RGB-D imaging for efficient 3D scene understanding

DC Field: Value

dc.contributor.advisor: 김민혁
dc.contributor.author: Meuleman, Andreas
dc.contributor.author: 므르만 아드레아
dc.date.accessioned: 2024-07-26T19:31:01Z
dc.date.available: 2024-07-26T19:31:01Z
dc.date.issued: 2023
dc.identifier.uri: http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=1047414&flag=dissertation (en_US)
dc.identifier.uri: http://hdl.handle.net/10203/320987
dc.description: Thesis (Ph.D.) - Korea Advanced Institute of Science and Technology (KAIST): School of Computing, 2023.8, [vi, 69 p.]
dc.description.abstract: The objective of RGB-D imaging is to capture a color image together with a depth or distance map. Capturing a scene's appearance and geometry has become an essential imaging modality for applications in robotics, autonomous driving, and virtual and augmented reality. However, accurate RGB-D imaging systems often rely on multiple sensors, several shots, and/or high-power active illumination, resulting in increased hardware and computational costs. This thesis presents imaging systems designed to recover the color and geometry of a scene under challenging constraints. The objective is to develop RGB-D imaging approaches that enable higher-quality geometry and appearance acquisition at lower hardware and/or computational cost. Specifically, we attempt to minimize the number of sensors, reduce the use of high-power active illumination, and/or demonstrate real-time performance. In this context, we develop three methods for RGB-D imaging in constrained scenarios. First, we present a monocular single-shot RGB-D imaging system based on uneven double refraction: using a polarizer and a birefringent medium, we achieve real-time RGB-D imaging from a single RGB sensor. Second, we introduce an integrated omnidirectional system that produces RGB-D panoramas from fisheye images in real time on an embedded board. Lastly, we propose a Time-of-Flight (ToF)/stereo fusion method using mobile phone sensors. We accommodate the unknown lens motion and the noisy, low-power ToF module of the battery-powered device through per-snapshot calibration and effective inclusion of the ToF samples into the stereo correlation volume (an illustrative sketch of this fusion idea follows the record below). Overall, this thesis contributes to advancements in RGB-D imaging, offering practical solutions for efficient 3D scene understanding under various constraints.
dc.language: eng
dc.publisher: 한국과학기술원 (KAIST)
dc.subject: RGB-D imaging; Stereo; 360 imaging; Camera calibration; Sensor fusion; Time of flight
dc.title: Integrated RGB-D imaging for efficient 3D scene understanding
dc.title.alternative: 효율적인 3차원 장면 이해를 위한 통합 RGB-D 이미징 (Korean title)
dc.type: Thesis (Ph.D.)
dc.identifier.CNRN: 325007
dc.description.department: 한국과학기술원 (KAIST), School of Computing
dc.contributor.alternativeauthor: Kim, Min Hyuk
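
The third contribution described in the abstract above hinges on folding sparse, noisy ToF measurements into a stereo correlation (cost) volume. Purely as a rough illustration of that general idea, and not the method from the thesis itself, here is a minimal NumPy sketch: it builds a toy absolute-difference cost volume for a rectified image pair, then biases it toward the disparities implied by ToF depth samples. All names (stereo_cost_volume, fuse_tof_into_cost), the Gaussian-shaped penalty, and the parameters (focal, baseline, sigma, weight) are illustrative assumptions.

import numpy as np

def stereo_cost_volume(left, right, max_disp):
    # Toy absolute-difference matching cost for a rectified grayscale pair:
    # cost[d, y, x] = |left[y, x] - right[y, x - d]|, with out-of-range
    # positions kept at the worst cost (1.0 for images scaled to [0, 1]).
    h, w = left.shape
    cost = np.full((max_disp, h, w), 1.0, dtype=np.float32)
    for d in range(max_disp):
        cost[d, :, d:] = np.abs(left[:, d:] - right[:, :w - d])
    return cost

def fuse_tof_into_cost(cost, tof_depth, tof_valid, focal, baseline,
                       sigma=1.0, weight=0.5):
    # Bias the stereo cost volume toward disparities implied by sparse ToF depth.
    # For a rectified pair, depth Z maps to disparity d = focal * baseline / Z.
    # A Gaussian-shaped penalty (one illustrative choice among many) raises the
    # cost of candidate disparities far from the ToF-derived disparity, but only
    # at pixels where a ToF sample is valid.
    max_disp = cost.shape[0]
    candidates = np.arange(max_disp, dtype=np.float32)[:, None, None]
    tof_disp = np.where(tof_valid, focal * baseline / np.maximum(tof_depth, 1e-6), 0.0)
    penalty = 1.0 - np.exp(-((candidates - tof_disp[None]) ** 2) / (2.0 * sigma ** 2))
    return cost + weight * tof_valid[None].astype(np.float32) * penalty

# Hypothetical usage with synthetic stand-ins for real sensor data.
rng = np.random.default_rng(0)
h, w, max_disp = 48, 64, 16
left = rng.random((h, w), dtype=np.float32)
right = np.roll(left, -4, axis=1)                      # fake 4-pixel disparity
tof_depth = np.full((h, w), 100.0, dtype=np.float32)   # fake metric depth samples
tof_valid = rng.random((h, w)) < 0.05                  # ~5% sparse ToF coverage
cost = stereo_cost_volume(left, right, max_disp)
fused = fuse_tof_into_cost(cost, tof_depth, tof_valid, focal=800.0, baseline=0.5)
disparity = np.argmin(fused, axis=0)                   # winner-take-all disparity map

Softly biasing the cost volume, rather than overwriting it at ToF pixels, is one common way to keep noisy ToF samples from dominating where stereo evidence is strong; the thesis' actual calibration and fusion steps are more involved than this sketch.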
Appears in Collection: CS-Theses_Ph.D. (박사논문, doctoral dissertations)
Files in This Item: There are no files associated with this item.
