Apple launched three new iPhones this year and all three are powered by the company’s latest A12 Bionic chipset. Built on a 7nm process, the A12 Bionic chipset features a next-gen neural engine that unlocks quite a few new features in the camera app.

Apple spoke at length about the all new Smart HDR feature that the company claims is capable of delivering “photos with high dynamic range and great image detail, advanced bokeh quality in Portrait mode photos and dynamic depth of field that is user adjustable in the Photos app.”

Smart HDR

That description might not make things immediately clear, so if you’re still wondering what exactly Smart HDR is, you’re in the right place.

A Little Help From Neural Networks

Simply put, Smart HDR lets users take better pictures with little to no loss of detail from overexposure or shadows – you won’t get pictures that look black in the darker areas or blinding white in the well-lit ones. To make that happen, the new iPhones instantaneously capture several images of the subject at various exposures. That much is standard HDR, which has been used for years. Smart HDR goes further, using machine learning to capture additional frames and then combine the best parts of all of them into the final image.
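To make the general idea concrete, here’s a minimal exposure-fusion sketch in Python with NumPy. This is a toy illustration of multi-exposure merging in general – not Apple’s actual Smart HDR pipeline – and the `fuse_exposures` function and its Gaussian weighting are my own simplification:

```python
import numpy as np

def fuse_exposures(frames):
    """Blend a stack of differently exposed frames (float arrays in [0, 1])
    into one image, weighting each pixel by how well exposed it is."""
    stack = np.stack(frames).astype(np.float64)      # shape: (n, h, w)
    # Pixels near mid-gray (0.5) count as "well exposed"; crushed blacks
    # and blown highlights get tiny weights.
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * 0.2 ** 2))
    weights /= weights.sum(axis=0, keepdims=True)    # normalize per pixel
    return (weights * stack).sum(axis=0)

# Toy example: one "underexposed" and one "overexposed" frame
dark = np.array([[0.05, 0.40]])
bright = np.array([[0.55, 0.95]])
fused = fuse_exposures([dark, bright])
```

In the fused result, each pixel leans toward whichever frame exposed it best: the first pixel comes mostly from the bright frame, the second mostly from the dark one.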


So, It’s Like Google HDR+?

If that sounds familiar, it’s because the Google Pixel has a similar feature, though it works in a slightly different way. Instead of combining the best parts of multiple images at different exposures, Google’s implementation merges several underexposed photos to deliver the best image. Google also uses a dedicated Pixel Visual Core co-processor alongside its machine learning algorithms to make changes to the final image in real time. You can see the difference with HDR+ off (left) and on (right) below. It’s huge, and Smart HDR promises to deliver similar results.

Image: Google
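Google’s underexposed-burst approach can be sketched in a few lines of NumPy. This is a toy illustration of the principle – averaging short exposures cancels random sensor noise, then a digital gain restores brightness – not Google’s actual HDR+ pipeline, and `merge_underexposed_burst` is a name I made up:

```python
import numpy as np

def merge_underexposed_burst(frames, gain=4.0):
    """Average a burst of deliberately underexposed frames (floats in [0, 1]).
    Averaging n frames shrinks random noise by roughly 1/sqrt(n); a digital
    gain then restores brightness without the blown highlights a single
    long exposure would produce."""
    mean = np.mean(np.stack(frames), axis=0)
    return np.clip(mean * gain, 0.0, 1.0)

# Toy burst: the true scene value is 0.1, each frame adds sensor noise
rng = np.random.default_rng(0)
burst = [0.1 + rng.normal(0, 0.02, size=(2, 2)) for _ in range(8)]
merged = merge_underexposed_burst(burst)
```

After merging, every pixel sits close to the noise-free brightened value of 0.4, even though each individual frame was dim and noisy.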

Beating the Pixel 2?

Since the two implementations are quite different, it will be interesting to see how the new iPhones stack up against the Pixel devices. Even though the Google Pixel series has been a dominant force in smartphone cameras ever since its launch, there is a decent chance that Apple takes the top spot this year with its new Smart HDR feature.

Depth Control

Besides Smart HDR, the new iPhones also include something the Pixel devices lack. The 2018 iPhones have Depth Control, which lets users manually adjust a photo’s depth of field after capturing it, which can result in more natural-looking images. We’ve previously seen something similar in Android devices like the Samsung Galaxy S9/S9+ and the recently launched Vivo V11 Pro.
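In principle, post-capture depth-of-field adjustment works by pairing the photo with a per-pixel depth map (which Portrait mode records) and blurring pixels in proportion to their distance from the chosen focus plane. The sketch below is a toy illustration of that idea, not Apple’s implementation – the function names, the simple box blur, and the `aperture` slider are all my own:

```python
import numpy as np

def box_blur(img, radius=2):
    """Crude box blur: average each pixel's (2r+1)x(2r+1) neighborhood,
    clamping at the edges."""
    padded = np.pad(img, radius, mode="edge")
    h, w = img.shape
    k = 2 * radius + 1
    acc = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            acc += padded[dy:dy + h, dx:dx + w]
    return acc / (k * k)

def apply_depth_control(img, depth, focus_depth, aperture):
    """Blend sharp and blurred copies per pixel: the farther a pixel's depth
    is from the chosen focus plane, the more blur it gets. The 'aperture'
    value plays the role of the user-adjustable slider."""
    blurred = box_blur(img)
    blur_amount = np.clip(np.abs(depth - focus_depth) * aperture, 0.0, 1.0)
    return (1 - blur_amount) * img + blur_amount * blurred

# Toy scene: a single bright pixel, with the top row far from the focus plane
img = np.zeros((4, 4))
img[1, 1] = 1.0
depth = np.full((4, 4), 1.0)
depth[0, :] = 3.0
out = apply_depth_control(img, depth, focus_depth=1.0, aperture=0.5)
```

Pixels sitting on the focus plane come through untouched, while the out-of-focus top row picks up the blurred values – and re-running with a different `focus_depth` or `aperture` changes the result, which is what makes the effect adjustable after the shot.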