DC Field | Value | Language |
---|---|---|
dc.contributor.author | Khan, Muhammad Umar Karim | ko |
dc.contributor.author | Khan, Asim | ko |
dc.contributor.author | Kyung, Chong-Min | ko |
dc.date.accessioned | 2023-06-14T11:00:34Z | - |
dc.date.available | 2023-06-14T11:00:34Z | - |
dc.date.created | 2023-06-08 | - |
dc.date.issued | 2016-10 | - |
dc.identifier.citation | 2016 IEEE Asia Pacific Conference on Circuits and Systems, APCCAS 2016, pp.440 - 443 | - |
dc.identifier.uri | http://hdl.handle.net/10203/307288 | - |
dc.description.abstract | Numerous depth extraction schemes cannot extract depth in textureless regions, thus generating sparse depth maps. In this paper, we propose using perception cues to improve the sparse depth map. We consider the local neighborhood as well as the global surface properties of objects. We use this information to complement depth extraction schemes. The method is not scene- or class-specific. Quantitative evaluation shows that the proposed method performs better than previous depth refinement methods. The error, in terms of the standard deviation of depth, has been reduced by 60%. The computational overhead of the proposed method is also very low, making it a suitable candidate for depth refinement. | - |
dc.language | English | - |
dc.publisher | Institute of Electrical and Electronics Engineers Inc. | - |
dc.title | Depth refinement on sparse-depth images using visual perception cues | - |
dc.type | Conference | - |
dc.identifier.wosid | 000392651200115 | - |
dc.identifier.scopusid | 2-s2.0-85011103184 | - |
dc.type.rims | CONF | - |
dc.citation.beginningpage | 440 | - |
dc.citation.endingpage | 443 | - |
dc.citation.publicationname | 2016 IEEE Asia Pacific Conference on Circuits and Systems, APCCAS 2016 | - |
dc.identifier.conferencecountry | KO | - |
dc.identifier.conferencelocation | Jeju Island | - |
dc.identifier.doi | 10.1109/APCCAS.2016.7803997 | - |
dc.contributor.localauthor | Kyung, Chong-Min | - |