Google may be known as the king of computational photography, but Apple is now looking to play catch-up with its new “Deep Fusion” photography system. At the iPhone 11 lineup launch last month, the company announced that Deep Fusion would boost camera performance, but the feature wasn’t available when the phones went on sale.

Well, you won’t have to wait long for Deep Fusion to find its way to your iPhone 11, as it is already available for testing via the latest iOS 13 public beta. You won’t find an option to enable Deep Fusion in the iPhone 11 or 11 Pro camera app, though; it’s a hidden feature that works in the background.

Deep Fusion is different from Smart HDR and Night Mode, which are designed to handle extremely bright and low-light scenes respectively. It’s an “AI-driven image processing feature” (powered by the Neural Engine in the A13 Bionic) that promises more detailed shots in certain conditions.

“Deep Fusion uses advanced machine learning to do pixel-by-pixel processing of photos, optimizing for texture, details, and noise in every part of the photo,” says Apple.

If you don’t already know how Deep Fusion works, it’s a step above Smart HDR. The iPhone 11 captures three frames at a fast shutter speed before you’ve even pressed the shutter button. When you hit the shutter, the camera app captures three more frames and a long-exposure shot for additional detail.

Then, these three frames and the long-exposure shot are merged to create a “synthetic long” shot, which is combined with the most detailed of the initial frames. The two resulting shots are merged, de-noised, and run through a four-step detail-processing pass to create the final image.
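The pipeline described above can be sketched in code. This is purely illustrative: Apple hasn’t published Deep Fusion’s actual algorithms, so the merge, the sharpness metric, and the denoising step below are simple placeholder choices, not Apple’s implementation.

```python
import numpy as np

def sharpness(frame):
    # Rough detail score: variance of a Laplacian-like second difference.
    # A hypothetical stand-in for however Apple ranks the initial frames.
    lap = (np.roll(frame, 1, 0) + np.roll(frame, -1, 0)
           + np.roll(frame, 1, 1) + np.roll(frame, -1, 1) - 4 * frame)
    return lap.var()

def deep_fusion_sketch(pre_frames, post_frames, long_exposure):
    """Toy version of the Deep Fusion merge, on grayscale arrays in [0, 1]."""
    # 1. Merge the post-shutter frames with the long exposure into a
    #    "synthetic long" shot (a plain average here).
    synthetic_long = np.stack(post_frames + [long_exposure]).mean(axis=0)

    # 2. Pick the most detailed of the pre-shutter frames as the reference.
    reference = max(pre_frames, key=sharpness)

    # 3. Merge the two shots, then apply a light blur as a
    #    placeholder for the de-noise / detail-processing steps.
    fused = 0.5 * synthetic_long + 0.5 * reference
    denoised = (fused + np.roll(fused, 1, 0) + np.roll(fused, -1, 0)) / 3.0
    return np.clip(denoised, 0.0, 1.0)

# Example: three pre-shutter frames, three post-shutter frames,
# and one long exposure, each an 8x8 synthetic grayscale image.
frames = [np.random.default_rng(i).random((8, 8)) for i in range(7)]
result = deep_fusion_sketch(frames[:3], frames[3:6], frames[6])
```

The real system operates per pixel across frequency bands; this sketch only mirrors the overall shape of the process: capture, synthetic long, reference selection, merge, denoise.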

This software-driven camera feature should help iPhone 11 users take more detailed pictures, but is Deep Fusion enough to take on the king of smartphone photography? We’ll know for sure on October 15, when the Google Pixel 4 goes official.