Fast Omnidirectional Depth Densification

Cited 0 times in Web of Science · Cited 2 times in Scopus
DC Field                             Value                                                                  Language
dc.contributor.author                Jang, Hyeonjoong                                                       ko
dc.contributor.author                Jeon, Daniel                                                           ko
dc.contributor.author                Ha, Hyunho                                                             ko
dc.contributor.author                Kim, Min Hyuk                                                          ko
dc.date.accessioned                  2019-12-13T09:30:29Z                                                   -
dc.date.available                    2019-12-13T09:30:29Z                                                   -
dc.date.created                      2019-11-11                                                             -
dc.date.issued                       2019-10-08                                                             -
dc.identifier.citation               14th International Symposium on Visual Computing (ISVC), pp. 683-694   -
dc.identifier.issn                   0302-9743                                                              -
dc.identifier.uri                    http://hdl.handle.net/10203/269271                                     -
dc.description.abstract              Omnidirectional cameras are commonly equipped with fish-eye lenses to capture 360-degree visual information, and severe spherical projective distortion occurs when a 360-degree image is stored as a two-dimensional image array. As a consequence, traditional depth estimation methods are not directly applicable to omnidirectional cameras. Dense depth estimation for omnidirectional imaging has so far been achieved by applying several offline processes, such as patch matching, optical flow, and convolutional propagation filtering, at the cost of heavy additional computation; no dense depth estimation method suitable for real-time applications is available yet. In response, we propose an efficient depth densification method designed for omnidirectional imaging to achieve 360-degree dense depth video with an omnidirectional camera. First, we compute sparse depth estimates using a conventional simultaneous localization and mapping (SLAM) method, and then use these estimates as input to our depth densification. We propose a novel spherical pull-push densification that builds a joint spherical pyramid for color and depth on multi-level icosahedron subdivision surfaces, allowing us to propagate the sparse depth efficiently and continuously over the full 360-degree field of view in an edge-aware manner. The results demonstrate that our real-time densification method is comparable to state-of-the-art offline methods in terms of per-pixel depth accuracy. Combining our depth densification with a conventional SLAM method allows us to capture real-time 360-degree RGB-D video with a single omnidirectional camera.   -
dc.language                          English                                                                -
dc.publisher                         Springer                                                               -
dc.title                             Fast Omnidirectional Depth Densification                              -
dc.type                              Conference                                                             -
dc.identifier.wosid                  000582481300053                                                        -
dc.identifier.scopusid               2-s2.0-85076169317                                                     -
dc.type.rims                         CONF                                                                   -
dc.citation.beginningpage            683                                                                    -
dc.citation.endingpage               694                                                                    -
dc.citation.publicationname          14th International Symposium on Visual Computing (ISVC)                -
dc.identifier.conferencecountry      US                                                                     -
dc.identifier.conferencelocation     Lake Tahoe, Nevada                                                     -
dc.identifier.doi                    10.1007/978-3-030-33720-9_53                                           -
dc.contributor.localauthor           Kim, Min Hyuk                                                          -
Appears in Collection
CS-Conference Papers (학술회의논문)
Files in This Item
There are no files associated with this item.
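The abstract describes a two-stage pipeline: sparse depth estimates from a conventional SLAM front end, then densification by a spherical pull-push pass over a joint color-and-depth pyramid built on multi-level icosahedron subdivision surfaces. The sketch below illustrates only the generic pull-push idea on a flat image grid with NumPy; the spherical icosahedral pyramid, the joint color guidance that makes the propagation edge-aware, and the SLAM front end are not reproduced, and all function and parameter names are illustrative rather than the authors' implementation.

# Minimal, hedged sketch of pull-push depth densification on a planar grid.
# The paper operates on a joint color/depth pyramid over icosahedron
# subdivision surfaces; that spherical structure and the edge-aware color
# weighting are intentionally omitted here.
import numpy as np

def pull(depth, weight):
    """Downsample one pyramid level by weighted 2x2 averaging."""
    h, w = depth.shape
    d = (depth * weight).reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))
    wsum = weight.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))
    coarse_depth = np.where(wsum > 0, d / np.maximum(wsum, 1e-8), 0.0)
    coarse_weight = np.minimum(wsum, 1.0)  # saturate so sparse samples count once
    return coarse_depth, coarse_weight

def push(fine_depth, fine_weight, coarse_depth):
    """Fill holes at the fine level from the coarser level.
    Nearest-neighbor upsampling is a simplification of smoother schemes."""
    up = np.kron(coarse_depth, np.ones((2, 2)))
    alpha = fine_weight  # confidence in the fine-level estimate
    return alpha * fine_depth + (1.0 - alpha) * up

def densify(sparse_depth, levels=6):
    """sparse_depth: HxW array, 0 where no SLAM landmark projects."""
    weight = (sparse_depth > 0).astype(np.float64)
    depths, weights = [sparse_depth.astype(np.float64)], [weight]
    for _ in range(levels):                # pull phase: build coarser levels
        d, w = pull(depths[-1], weights[-1])
        depths.append(d)
        weights.append(w)
    dense = depths[-1]
    for lvl in range(levels - 1, -1, -1):  # push phase: propagate back down
        dense = push(depths[lvl], weights[lvl], dense)
    return dense

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gt = np.tile(np.linspace(1.0, 5.0, 256), (256, 1))  # synthetic smooth depth
    mask = rng.random(gt.shape) < 0.02                   # ~2% sparse samples
    print(np.abs(densify(np.where(mask, gt, 0.0)) - gt).mean())

On the sphere, the same pull and push passes would run over the parent-child relations of the icosahedron subdivision hierarchy instead of 2x2 pixel blocks, which is how the method described in the abstract sidesteps the projective distortion of storing a 360-degree image in a flat two-dimensional array.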
