In this letter, we propose a multi-spectral unsupervised domain adaptation framework for thermal image semantic segmentation. The proposed framework addresses the data scarcity problem in the thermal domain and boosts segmentation performance by leveraging existing large-scale RGB datasets and the segmentation knowledge of an RGB image segmentation network. We further enhance the generalization capability of our thermal segmentation network through pixel-level domain adaptation that bridges the day and night thermal image domains. With our framework, a thermal image segmentation network achieves high performance without any ground-truth labels by exploiting successive multi-spectral knowledge transfers: RGB-to-RGB, RGB-to-Thermal, and Thermal-to-Thermal adaptation. Moreover, we provide a real-world RGB-Thermal semantic segmentation dataset with 950 manually annotated, Cityscapes-style ground-truth labels covering 19 classes. Experimental results on real-world datasets demonstrate the effectiveness and robustness of the proposed framework both quantitatively and qualitatively.