In this thesis, we introduce a data-driven approach that uses RGB guidance to refine two types of images: depth maps and hyperspectral images. Depth maps captured by consumer-level depth cameras such as the Kinect are usually degraded by noise, missing values, and quantization. Hyperspectral images usually lack spatial resolution due to hardware limitations of their imaging sensors. In contrast, modern imaging sensors can capture an RGB image at a resolution several times higher than that of a hyperspectral image.
For refining degraded RAW depth maps that are coupled with an RGB image, our approach takes advantage of a training set of high-quality depth data and transfers its information to the RAW depth map through multi-scale dictionary learning.
Utilizing a sparse representation, our method learns a dictionary of geometric primitives that captures the correlation among high-quality mesh data, RAW depth maps, and RGB images.
The dictionary is learned and applied in a manner that accounts for various practical issues that arise in dictionary-based depth refinement. Compared to previous approaches that only utilize the correlation between RAW depth maps and RGB images, our method produces improved depth maps without over-smoothing. Since our approach is data-driven, the refinement can be targeted to a specific class of objects by employing a corresponding training set. In our experiments, we show that this leads to additional improvements in recovering depth maps of human faces.
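The coupled-dictionary transfer idea can be illustrated with a small sketch. All dictionaries below are random placeholders (in the thesis they are learned jointly from training data), and the minimal orthogonal matching pursuit routine stands in for whichever sparse coder is actually used: a patch is sparse-coded over atoms from the degraded-depth and RGB domains, and the same code is then applied to the corresponding high-quality depth atoms.

```python
import numpy as np

def omp(D, y, k):
    """Minimal orthogonal matching pursuit: k-sparse code of y over dictionary D."""
    residual, idx, coef = y.copy(), [], np.zeros(0)
    for _ in range(k):
        if np.linalg.norm(residual) < 1e-10:
            break                                    # patch already well explained
        idx.append(int(np.argmax(np.abs(D.T @ residual))))
        coef, *_ = np.linalg.lstsq(D[:, idx], y, rcond=None)
        residual = y - D[:, idx] @ coef
    x = np.zeros(D.shape[1])
    x[idx] = coef
    return x

rng = np.random.default_rng(0)
n_atoms, patch = 64, 25
# Hypothetical coupled dictionaries; the thesis learns these jointly.
D_raw = rng.standard_normal((patch, n_atoms))   # degraded (RAW) depth atoms
D_rgb = rng.standard_normal((patch, n_atoms))   # RGB-guidance atoms
D_hq  = rng.standard_normal((patch, n_atoms))   # high-quality depth atoms

# Code the observation (RAW depth patch stacked with RGB patch) over the
# corresponding stacked atoms.
D_obs = np.vstack([D_raw, D_rgb])
D_obs /= np.linalg.norm(D_obs, axis=0)          # unit-norm atoms
y = 2.0 * D_obs[:, 3]                           # synthetic patch built from atom 3
x = omp(D_obs, y, k=3)

# Transfer: the same sparse code, applied to the high-quality atoms,
# yields the refined depth patch.
refined_patch = D_hq @ x
```

Because the code is computed from the observed domains but evaluated on the high-quality atoms, detail present in the training set can be transferred to the degraded input.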
For hyperspectral image enhancement and super-resolution, we propose a two-stage algorithm consisting of a spatial upsampling stage and a spectrum substitution stage.
The spatial upsampling stage is guided by a high-resolution RGB image of the same scene, and the spectrum substitution stage uses sparse coding to locally refine the upsampled hyperspectral image through dictionary substitution. Experiments show that our algorithm is highly effective and outperforms state-of-the-art matrix-factorization-based approaches.
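A toy sketch of the two-stage pipeline follows, with placeholder data and a random spectral dictionary. Nearest-neighbour replication stands in for the RGB-guided upsampling, and replacing each pixel spectrum with its best-matching scaled atom is a 1-sparse stand-in for the sparse-coding substitution of the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)
H, W, B, s = 4, 4, 8, 2                  # low-res size, spectral bands, upscale factor
hs_lr = rng.random((H, W, B))            # toy low-resolution hyperspectral cube

# Stage 1 -- spatial upsampling. Nearest-neighbour replication is a
# placeholder for the RGB-guided upsampling used in the thesis.
hs_up = np.kron(hs_lr, np.ones((s, s, 1)))

# Stage 2 -- spectrum substitution. Each pixel spectrum is replaced by its
# best-matching (scaled) atom from a spectral dictionary; a real system
# would use a learned dictionary and a multi-atom sparse code.
D = rng.random((B, 16))
D /= np.linalg.norm(D, axis=0)           # unit-norm spectral atoms
flat = hs_up.reshape(-1, B)              # one row per high-res pixel
corr = flat @ D                          # correlation of each spectrum with each atom
best = corr.argmax(axis=1)               # best-matching atom per pixel
scale = corr[np.arange(flat.shape[0]), best]
hs_refined = (scale[:, None] * D.T[best]).reshape(H * s, W * s, B)
```

Since the scale is the least-squares coefficient for a unit-norm atom, the substituted spectrum is the closest point on that atom's span, so the per-pixel residual can never exceed the original spectrum's norm.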