The Visual Attention Engine (VAE), an 80×60 digital Cellular Neural Network (CNN), rapidly extracts global image features that serve as attentional cues to streamline detailed object recognition. Its 120 processing elements (PEs), time-shared among the cells, deliver a peak performance of 24 GOPS, and 2D-shift-register-based data transfers sustain 93% PE utilization. Integrated within an object recognition SoC, the 4.5 mm² VAE runs at 200 MHz and improves the object recognition frame rate by 83% while consuming only 84 mW.
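The per-cell computation of a digital CNN like the VAE follows the standard discrete-time cellular neural network update, in which each cell's next state depends on its 3×3 neighborhood through a feedback template A and a control template B. A minimal software sketch of one such iteration is shown below; the template values and bias are illustrative assumptions, not the chip's actual coefficients, and `cnn_step` is a hypothetical helper name.

```python
import numpy as np

def cnn_step(x, u, A, B, z):
    """One discrete-time CNN update over the whole grid.

    x : (H, W) cell states      u : (H, W) external inputs
    A : (3, 3) feedback template, applied to neighbor outputs
    B : (3, 3) control template, applied to neighbor inputs
    z : scalar bias added to every cell
    """
    y = np.clip(x, -1.0, 1.0)          # piecewise-linear cell output
    yp = np.pad(y, 1)                  # zero-padded (fixed) boundary
    up = np.pad(u, 1)
    h, w = x.shape
    new_x = np.full_like(x, z)
    for di in range(3):                # accumulate 3x3 neighborhood
        for dj in range(3):
            new_x += A[di, dj] * yp[di:di + h, dj:dj + w]
            new_x += B[di, dj] * up[di:di + u.shape[0], dj:dj + u.shape[1]]
    return new_x

# 80x60 grid with illustrative edge-detection-like templates
A = np.zeros((3, 3)); A[1, 1] = 2.0
B = -np.ones((3, 3)); B[1, 1] = 8.0
x = np.zeros((60, 80))
u = np.random.rand(60, 80)
x = cnn_step(x, u, A, B, z=-0.5)
```

In hardware, the 120 PEs would evaluate batches of cells per cycle rather than looping over the grid in software; the 2D shift registers mentioned above keep each PE supplied with its neighborhood data so the array stays busy.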