Weakly Supervised Semantic Segmentation Using Color Adjacency Loss

A large amount of training data is essential for deep learning-based computer vision tasks. In semantic segmentation, however, annotating pixel-wise labels for large-scale image data is laborious and time-consuming. To address this problem, we propose a training framework for a CNN-based segmentation network that uses sparse labels. We propagate the sparse labels so that the network approaches the performance obtained by training on dense labels. For effective label propagation, we exploit the observation that adjacent pixels sharing similar colors are likely to belong to the same class. Based on this insight, labels are propagated by our adjacency loss, which depends on the color similarity between adjacent pixels. We evaluate on the PASCAL VOC 2012 dataset using scribble annotations as sparse labels. The proposed algorithm achieves superior performance compared to the previous method on the weakly supervised semantic segmentation task.
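The abstract does not spell out the exact form of the adjacency loss, so the PyTorch sketch below only illustrates the general idea it describes: prediction differences between adjacent pixels are penalized in proportion to a color-similarity weight. The function name color_adjacency_loss, the Gaussian kernel with bandwidth sigma, and the squared-difference penalty are assumptions made for illustration, not the authors' implementation.

    import torch
    import torch.nn.functional as F

    def color_adjacency_loss(logits, image, sigma=0.1):
        # logits: (B, C, H, W) raw segmentation scores from the network
        # image:  (B, 3, H, W) RGB values scaled to [0, 1], same spatial size as logits
        # sigma:  bandwidth of the color-similarity kernel (assumed value)
        prob = F.softmax(logits, dim=1)
        h, w = prob.shape[2], prob.shape[3]
        loss = 0.0
        # Compare each pixel with its right neighbour and its bottom neighbour.
        for dy, dx in [(0, 1), (1, 0)]:
            p = prob[:, :, dy:, dx:]
            q = prob[:, :, :h - dy, :w - dx]
            c1 = image[:, :, dy:, dx:]
            c2 = image[:, :, :h - dy, :w - dx]
            # Gaussian kernel on the color difference: similar colors give a weight near 1.
            weight = torch.exp(-((c1 - c2) ** 2).sum(dim=1) / (2 * sigma ** 2))
            # Penalize prediction disagreement between color-similar neighbours.
            loss = loss + (weight * ((p - q) ** 2).sum(dim=1)).mean()
        return loss / 2

In a scribble-supervised setting, such a term would typically be added to a partial cross-entropy loss computed only on the annotated scribble pixels, e.g. total_loss = scribble_ce + lambda_adj * color_adjacency_loss(logits, image), where lambda_adj is a balancing weight.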
Publisher
IEEE
Issue Date
2018-12-12
Language
English
Citation
33rd IEEE International Conference on Visual Communications and Image Processing (IEEE VCIP)
DOI
10.1109/VCIP.2018.8698643
URI
http://hdl.handle.net/10203/247696
Appears in Collection
EE-Conference Papers (Conference Papers)