The new Galaxy A7 (2018), which was announced earlier today, holds the distinction of being the first Samsung smartphone to feature triple rear cameras. It is quite surprising for Samsung to launch a mid-range smartphone with a feature that has so far been exclusive to devices such as Huawei's flagship P20 Pro or the Oppo R17 Pro.

But are the Galaxy A7’s triple rear cameras merely a marketing ploy to revive the sales of Samsung’s Galaxy A-series devices? Well, we can’t be sure. But if you are wondering how a triple rear camera can improve the imaging output, then let’s get straight to the hardware and see how it works.

Primary Camera

The Galaxy A7's imaging hardware is built around a 24MP RGB sensor with an f/1.7 aperture, which sits in the middle of the vertical stack of lenses. Above it is a 5MP depth sensor with an f/2.2 aperture, while an 8MP wide-angle lens with an f/2.4 aperture and a 120-degree field of view sits at the bottom. For comparison, Huawei's P20 Pro packs a 40MP RGB sensor, an 8MP telephoto lens and a 20MP monochrome sensor.

Coming to the practical applications of these sensors, the Galaxy A7's 24MP RGB sensor acts as the main camera and is also capable of pixel binning (combining four pixels into one), trading resolution for light sensitivity to produce cleaner images, particularly in low-light conditions.
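The idea behind pixel binning is simple to illustrate. Here is a minimal sketch in Python with NumPy, assuming a plain single-channel sensor readout; Samsung's actual pipeline works on the raw Bayer data and is far more sophisticated.

```python
import numpy as np

def bin_pixels_2x2(raw):
    """Combine each 2x2 block of sensor pixels into one output pixel.

    Summing four photosites quadruples the signal gathered per output
    pixel, which is why binning helps in low light; the trade-off is
    that resolution is halved in each dimension (so 24MP becomes 6MP).
    """
    h, w = raw.shape
    assert h % 2 == 0 and w % 2 == 0, "sensor dimensions must be even"
    return raw.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))
```

The reshape groups the array into 2x2 tiles, and summing over the two tile axes collapses each tile into a single, brighter pixel.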

AI Scene Optimizer

The main camera gets the Scene Optimizer feature which was first seen in the Galaxy Note 9. This allows the camera to use AI algorithms to automatically recognise scenes and tweak camera settings such as the white balance, exposure, contrast and brightness values to match the scene. Currently, the feature can identify 19 scenes including food, landscapes, street view, night scene, animals and beach among others.

Scene Optimizer really makes a difference when it comes to overall image quality as we have seen in our review of the feature in the Note 9.

Secondary Camera

The 5MP (f/2.2) sensor sits at the top of the stack and is used to capture depth information. The camera estimates the distance to different objects in its view so that it can separate the foreground from the background and create a convincing depth-of-field effect for bokeh shots. The SoC handles the depth calculation, and the blur is applied to the background so the subject stands out. Samsung could also let users adjust the blur intensity, as it does on its flagships.
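The mechanics can be sketched in a few lines of Python with NumPy. This is an illustration of the general technique, not Samsung's implementation: the depth map here is handed in as an array, the blur is a naive box filter, and real pipelines vary the blur radius with depth and feather the mask edges.

```python
import numpy as np

def box_blur(img, k=2):
    """Naive box blur: average each pixel with its (2k+1)^2 neighbours."""
    pad = np.pad(img, k, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(2 * k + 1):
        for dx in range(2 * k + 1):
            out += pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (2 * k + 1) ** 2

def synthetic_bokeh(img, depth, focus_depth, tolerance=0.5):
    """Keep pixels near the focus plane sharp; blur everything else."""
    in_focus = np.abs(depth - focus_depth) <= tolerance
    return np.where(in_focus, img, box_blur(img))
```

The depth map acts as a mask: pixels whose estimated distance is close to the focus plane are copied through unchanged, and everything else is replaced with its blurred counterpart.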

You could argue that a monochrome sensor is better for depth sensing because it captures more light, but high-resolution secondary monochrome sensors are more expensive to procure, which is why they are usually reserved for high-end devices.

Ultra-Wide Camera

Finally, there is an 8MP (f/2.4) ultra-wide sensor with a 120-degree field of view, which roughly matches the field of view of human vision. So, on paper, the Galaxy A7's ultra-wide lens will let you capture nearly everything in your view in a single shot.

The wide-angle lens comes into play when you want to capture a group shot or a landscape and need the frame to be as wide as possible to fit the whole scene. The f/2.4 ultra-wide lens has become a mainstay of wide-angle photography on upper mid-range devices with multi-camera setups, and should serve Galaxy A7 users just fine.
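How much more does 120 degrees actually fit in the frame? A quick back-of-the-envelope calculation: the horizontal extent a lens covers at a given distance is 2 x distance x tan(FOV/2). The 77-degree figure below is an assumed typical main-camera field of view for comparison, not an official Galaxy A7 spec.

```python
import math

def scene_width(fov_degrees, distance_m):
    """Horizontal extent (in metres) captured at the given distance
    by a lens with the given horizontal field of view."""
    return 2 * distance_m * math.tan(math.radians(fov_degrees) / 2)

# Two metres from the subject, the 120-degree ultra-wide covers
# roughly 6.9 m of the scene, versus about 3.2 m for an assumed
# 77-degree main camera -- more than twice the width in frame.
```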

So there you have it. Going by everything Samsung has revealed so far, the Galaxy A7 might actually have a really capable camera. Of course, there's plenty of testing to be done, and official camera samples aren't available yet, as Samsung has only announced the specifications at this point. Stay tuned to find out more about this exciting new Samsung Galaxy A-series phone.