DC Field | Value | Language |
---|---|---|
dc.contributor.author | Jung, Raehyuk | ko |
dc.contributor.author | Cho, Sungmin | ko |
dc.contributor.author | Kwon, Junseok | ko |
dc.date.accessioned | 2021-10-21T08:50:20Z | - |
dc.date.available | 2021-10-21T08:50:20Z | - |
dc.date.created | 2021-10-19 | - |
dc.date.issued | 2020-09 | - |
dc.identifier.citation | 2020 IEEE International Conference on Image Processing, ICIP 2020, pp.1058 - 1062 | - |
dc.identifier.issn | 1522-4880 | - |
dc.identifier.uri | http://hdl.handle.net/10203/288300 | - |
dc.description.abstract | We present a novel method for the upright adjustment of 360° images. Our network consists of two modules: a convolutional neural network (CNN) and a graph convolutional network (GCN). The input 360° image is processed with the CNN for visual feature extraction, and the extracted feature map is converted into a graph that provides a spherical representation of the input. We also introduce a novel loss function to address the issue of discrete probability distributions defined on the surface of a sphere. Experimental results demonstrate that our method outperforms fully connected-based methods. | - |
dc.language | English | - |
dc.publisher | IEEE | - |
dc.title | UPRIGHT ADJUSTMENT WITH GRAPH CONVOLUTIONAL NETWORKS | - |
dc.type | Conference | - |
dc.identifier.wosid | 000646178501032 | - |
dc.identifier.scopusid | 2-s2.0-85098633453 | - |
dc.type.rims | CONF | - |
dc.citation.beginningpage | 1058 | - |
dc.citation.endingpage | 1062 | - |
dc.citation.publicationname | 2020 IEEE International Conference on Image Processing, ICIP 2020 | - |
dc.identifier.conferencecountry | AR | - |
dc.identifier.conferencelocation | Virtual | - |
dc.identifier.doi | 10.1109/ICIP40778.2020.9190715 | - |
dc.contributor.localauthor | Jung, Raehyuk | - |
dc.contributor.nonIdAuthor | Cho, Sungmin | - |
dc.contributor.nonIdAuthor | Kwon, Junseok | - |
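The abstract describes a pipeline in which CNN features from a 360° image are mapped onto a graph over the sphere, propagated with a GCN, and turned into a discrete probability distribution on the sphere's surface. The sketch below illustrates that general idea only; the node sampling (Fibonacci lattice), k-nearest-neighbour adjacency, random stand-in features, and layer sizes are all assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def fibonacci_sphere(n):
    """Roughly uniform sample of n points on the unit sphere (Fibonacci lattice)."""
    i = np.arange(n)
    phi = np.pi * (3.0 - np.sqrt(5.0)) * i        # golden-angle longitude increment
    z = 1.0 - 2.0 * (i + 0.5) / n                 # uniform spacing in z
    r = np.sqrt(1.0 - z * z)
    return np.stack([r * np.cos(phi), r * np.sin(phi), z], axis=1)

def knn_adjacency(pts, k=6):
    """Symmetric k-nearest-neighbour adjacency with self-loops."""
    d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
    n = len(pts)
    A = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d[i])[1:k + 1]          # skip self (distance 0)
        A[i, nbrs] = 1.0
    return np.maximum(A, A.T) + np.eye(n)         # symmetrise, add self-loops

def gcn_layer(A, H, W, relu=True):
    """One GCN propagation step: D^{-1/2} A D^{-1/2} H W, optionally with ReLU."""
    deg = A.sum(axis=1)
    A_hat = A / np.sqrt(np.outer(deg, deg))
    out = A_hat @ H @ W
    return np.maximum(out, 0.0) if relu else out

rng = np.random.default_rng(0)
n_nodes, feat_dim = 64, 16
pts = fibonacci_sphere(n_nodes)                   # graph nodes on the sphere
A = knn_adjacency(pts)
H = rng.standard_normal((n_nodes, feat_dim))      # stand-in for CNN feature map samples
W1 = rng.standard_normal((feat_dim, feat_dim)) * 0.1
W2 = rng.standard_normal((feat_dim, 1)) * 0.1
logits = gcn_layer(A, gcn_layer(A, H, W1), W2, relu=False)[:, 0]
probs = np.exp(logits - logits.max())
probs /= probs.sum()                              # discrete distribution over sphere nodes
```

In this toy version each sphere node ends up with a probability, i.e. a discrete distribution on the sphere of the kind the abstract's loss function targets; in the actual method the node features would come from the CNN and the weights would be learned.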