Spectral-domain convolution engines can effectively reduce the computational complexity of convolution operations. In these engines, however, element-wise multiplications of the spectral representations dominate the multiply-and-accumulate (MAC) operations. In light of this, we propose bin-specific quantization (BSQ), which judiciously allocates a varying bit width to each spectral bin in overlap-save convolution. This allows efficient computation of the Hadamard product, since the magnitudes of the high-frequency components in image features are significantly smaller than those of their low-frequency counterparts. Using statistics from the spectral representations of feature maps, we also delineate methods for properly allocating bit precision to these spectral bins. When BSQ is applied, the average bit precision of the arithmetic operators in spectral-domain convolvers, without the requirement of network re-training, is lowered by 24% (AlexNet), 20% (VGG-16), and 22% (ResNet-18), with no significant reduction (< 1%) in classification accuracy on the ImageNet dataset.
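The idea of deriving per-bin bit widths from spectral magnitude statistics can be sketched as follows. This is a hypothetical illustration, not the paper's exact allocation method: it assumes fixed-point words sharing one global LSB step, so a bin whose peak magnitude sits well below the global peak simply needs fewer integer bits. The function names (`bin_bit_allocation`, `quantize_spectrum`), the low-pass test signal, and the specific bit-allocation rule are all illustrative assumptions.

```python
import numpy as np

def bin_bit_allocation(tiles, base_bits=16, min_bits=4):
    """Estimate per-bin bit widths from spectral magnitude statistics.

    tiles: (N, H, W) batch of spatial feature-map tiles.
    Assumption: all bins share the LSB step of a base_bits word sized to
    the global peak, so a bin only needs enough bits to cover its own
    per-bin peak magnitude (fewer for small high-frequency bins).
    """
    spec = np.fft.fft2(tiles)                      # spectral representations
    mag = np.abs(spec).max(axis=0)                 # per-bin peak magnitude
    step = mag.max() / (2.0 ** (base_bits - 1) - 1)  # shared LSB step
    needed = np.ceil(np.log2(mag / step + 1.0)) + 1  # magnitude bits + sign
    bits = np.clip(needed, min_bits, base_bits).astype(int)
    return bits, step

def quantize_spectrum(spec, bits, step):
    """Quantize real/imag parts of each bin with its allocated bit width."""
    hi = (2.0 ** (bits - 1) - 1) * step            # per-bin clipping level
    q = lambda x: np.clip(np.round(x / step) * step, -hi, hi)
    return q(spec.real) + 1j * q(spec.imag)

# Demo on synthetic tiles with image-like spectral decay (low-passed noise).
rng = np.random.default_rng(0)
noise = rng.standard_normal((8, 16, 16))
f = np.fft.fftfreq(16)
lp = 1.0 / (1.0 + 50.0 * (f[None, :] ** 2 + f[:, None] ** 2))
tiles = np.fft.ifft2(np.fft.fft2(noise) * lp).real

bits, step = bin_bit_allocation(tiles)
qspec = quantize_spectrum(np.fft.fft2(tiles[0]), bits, step)
recon = np.fft.ifft2(qspec).real                   # near-lossless round trip
```

Because the low-pass tiles concentrate energy in low-frequency bins, the high-frequency bins receive far fewer bits, lowering the average operator precision while the reconstructed tile stays close to the original.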