Google Camera App on the Pixel 2 / 2 XL Doesn’t Leverage the Pixel Visual Core


If you thought that the hidden image-processing chip called the Pixel Visual Core is at least partly to thank for the great imaging chops of the Pixel 2 and the Pixel 2 XL, think again. According to Google’s VP of product management, Brian Rakowski, Google’s own camera app doesn’t actually use the Pixel Visual Core to process images at all.

Rakowski had actually revealed this in an interview with FoneArena last November, when he was asked how the Pixel Visual Core helps with image processing on the second-generation Pixel smartphones. This is what he had to say:

“Turns out we do pretty sophisticated processing, optimising and tuning in the camera app itself to get the maximum performance possible. We do ZSL [zero shutter lag] and fast buffering to get fast HDR capture. So we don’t take advantage of the Pixel Visual Core, we don’t need to take advantage of it.”

The interview, which discusses the then-newly-launched Pixel 2 devices in great detail, had slipped under the radar until now. It resurfaced yesterday after Ars Technica’s Ron Amadeo tweeted that Google had reiterated to him that the Pixel 2 devices don’t actually use the additional chip for the Camera app.

Developer Evan Munee (@emunee) then dug out the old Rakowski interview, pointing out that Google had already disclosed this last year.

In case you’re interested in seeing the Pixel Visual Core in action, here’s an example:

The same picture taken without (left) and with HDR+ on the Pixel Visual Core (right).

Either way, Instagram, WhatsApp and Snapchat are said to be the first high-profile apps to have started using the chip’s processing prowess.
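
For developers curious what “using the chip” actually involves: Google has said that third-party apps get HDR+ on the Pixel Visual Core simply by capturing photos through the standard Android Camera API, with the processing happening below the API. As a rough illustration, here is a minimal Kotlin sketch of such a still capture; the helper name (captureStill) and the assumption of an already-configured CameraDevice, CameraCaptureSession and output Surface are ours, not taken from any of the apps above.

import android.hardware.camera2.CameraCaptureSession
import android.hardware.camera2.CameraDevice
import android.hardware.camera2.CaptureRequest
import android.view.Surface

// Hypothetical helper, not code from any of the apps named above.
// On the Pixel 2 / 2 XL (Android 8.1+), Google says HDR+ processing on the
// Pixel Visual Core is applied for third-party apps that capture through the
// standard Android Camera API, so the request itself is entirely ordinary.
fun captureStill(
    device: CameraDevice,
    session: CameraCaptureSession,
    jpegSurface: Surface,
    callback: CameraCaptureSession.CaptureCallback
) {
    val request = device.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE)
        .apply {
            addTarget(jpegSurface)                  // e.g. a JPEG Surface from an ImageReader
            set(CaptureRequest.CONTROL_AE_MODE,     // plain auto-exposure; nothing here names
                CaptureRequest.CONTROL_AE_MODE_ON)  // the Pixel Visual Core or HDR+ explicitly
        }
        .build()
    session.capture(request, callback, null)        // null handler = current thread's looper
}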
