Image inpainting has recently advanced significantly through deep neural networks and has been adopted in many real-world applications, such as removing obstacles, restoring damaged objects, and retouching photos. However, deep-learning-based image inpainting networks suffer from high parameter complexity and long inference times. While there have been some efforts to compress image-to-image translation GANs, compressing image inpainting networks has rarely been explored. In this paper, our goal is to create a small, efficient model by compressing the generator of an inpainting model without sacrificing image quality. We first propose novel channel pruning and knowledge distillation techniques specialized for image inpainting models that exploit mask information. We select channels sensitive to mask regions and train the pruned inpainting model with three knowledge distillation methods that transfer information from a large inpainting model. We compress the representative inpainting model GLCIC, reducing its model size and operations by more than 10x with only a small loss in quality. Extensive experimental results demonstrate the effectiveness of our proposed methods on image inpainting tasks. To the best of our knowledge, our work is the first attempt to compress an image inpainting model.