How Apple’s Deep Fusion camera tech makes photography on iPhone 11 better than before

New Delhi | Updated: October 3, 2019 6:41:36 PM

In the Apple event in September, Phil Schiller described Deep Fusion as “computational photography mad science”. (Source: Apple website)

Deep Fusion, Apple’s new image processing technology, is now available on the iPhone 11, iPhone 11 Pro, and iPhone 11 Pro Max through the latest iOS 13 developer and public beta releases. The Deep Fusion feature comes as part of the iOS 13.2 developer and public beta upgrades and works only on iOS devices running the A13 Bionic processor; only the newly launched iPhones have this chip.

Apple’s Deep Fusion feature works differently from Smart HDR. Deep Fusion leverages the Neural Engine on the A13 Bionic chipset and works at the pixel level, using complex machine learning algorithms to enhance images.

In the Apple event in September, Phil Schiller described Deep Fusion as “computational photography mad science.”

How to enable Deep Fusion on your iPhone 11

Apple wants its users to rely on this new technology without thinking too much about it. That’s probably why there’s no button to turn it on or off, nor any indication that you’re even using the mode.

As of now, the default camera mode on an iPhone 11, iPhone 11 Pro or iPhone 11 Pro Max is Smart HDR. Even when you click just one photo, Smart HDR captures a series of images before and after your shot and then blends them together to improve dynamic range and detail.
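
Apple has not published how Smart HDR combines those frames, but the general idea of merging bracketed exposures can be sketched in a few lines. The Swift snippet below is a rough illustration under simplified assumptions: frames are just grids of brightness values, and the blendFrames function and its weighting rule are made up for the example rather than taken from Apple.

```swift
// A minimal sketch of exposure blending in the spirit of Smart HDR.
// Frames are reduced to 2D grids of brightness values (0.0 to 1.0);
// the weighting rule is an assumption for illustration only.

typealias Frame = [[Double]]

/// Blend several bracketed frames, weighting each pixel towards the
/// frames in which it is best exposed (values near 0.5 count most).
func blendFrames(_ frames: [Frame]) -> Frame {
    guard let first = frames.first, let firstRow = first.first else { return [] }
    let height = first.count
    let width = firstRow.count
    var result = Array(repeating: Array(repeating: 0.0, count: width), count: height)

    for y in 0..<height {
        for x in 0..<width {
            var weightedSum = 0.0
            var totalWeight = 0.0
            for frame in frames {
                let value = frame[y][x]
                // Well-exposed pixels (away from pure black or white) count more.
                let weight = 1.0 - abs(value - 0.5) * 2.0 + 0.01
                weightedSum += value * weight
                totalWeight += weight
            }
            result[y][x] = weightedSum / totalWeight
        }
    }
    return result
}

// Example: blend an underexposed, a normal and an overexposed frame.
let under: Frame  = [[0.05, 0.10], [0.08, 0.12]]
let normal: Frame = [[0.40, 0.55], [0.50, 0.60]]
let over: Frame   = [[0.90, 0.95], [0.85, 0.98]]
print(blendFrames([under, normal, over]))
```

Running this blends the three frames into values that sit between the extremes, which is, at a vastly larger scale and with far more sophistication, the effect Smart HDR is after.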

If a photo is clicked in a dark environment, the iPhone camera automatically switches into Night Mode to improve brightness and reduce image noise.

With Deep Fusion, anytime you take a photo indoors (or in other medium-to-low light conditions), the camera will automatically switch into the mode to lower image noise and optimise detail. With most photos being shot in medium-to-low light situations like indoors, Deep Fusion will have an enormous impact on the quality of the photos you click.
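
Apple does not expose this switching logic, and there is no public API for it, but the behaviour can be pictured as a simple choice based on scene brightness. In the sketch below, the CaptureMode enum, the selectMode function and the lux thresholds are all illustrative assumptions, not Apple’s actual values.

```swift
// A hypothetical sketch of how a camera pipeline might pick a processing
// mode from measured scene brightness. The lux thresholds and the
// selectMode function are illustrative assumptions, not Apple's values.

enum CaptureMode {
    case smartHDR   // bright scenes
    case deepFusion // medium-to-low light, e.g. indoors
    case nightMode  // dark scenes
}

/// Choose a mode from an estimated scene illuminance in lux.
func selectMode(sceneLux: Double) -> CaptureMode {
    switch sceneLux {
    case ..<10.0:  return .nightMode   // very dark: brighten and denoise
    case ..<600.0: return .deepFusion  // typical indoor light: recover detail
    default:       return .smartHDR    // bright light: manage dynamic range
    }
}

print(selectMode(sceneLux: 5))      // nightMode
print(selectMode(sceneLux: 150))    // deepFusion
print(selectMode(sceneLux: 10_000)) // smartHDR
```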

How Deep Fusion technology works

Both Smart HDR and Deep Fusion capture multiple images, and the iPhone then picks a reference photo that is meant to stop motion blur as much as possible. It then combines three standard exposures and one long exposure into a single “synthetic long” photo. This is where Deep Fusion comes into play: it breaks down the reference image and the synthetic long photo into multiple regions, identifying skies, walls, textures and fine details (like hair).
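
The snippet below is a simplified sketch of those first two steps: choosing the sharpest frame as the reference and averaging the standard shots with the long exposure into a synthetic long frame. The CapturedFrame type, the sharpness scores and the plain averaging are stand-ins invented for the example, not Apple’s actual implementation.

```swift
// A simplified, assumed sketch of the steps described above: pick the
// sharpest frame as the reference, then average three standard exposures
// with the long exposure into a "synthetic long" frame. The CapturedFrame
// type and the plain averaging are stand-ins for Apple's real pipeline.

struct CapturedFrame {
    let pixels: [Double]   // brightness values, 0.0 to 1.0
    let sharpness: Double  // higher means less motion blur
}

/// The sharpest frame becomes the reference, which is what the article
/// means by a photo "meant to stop motion blur". Assumes a non-empty burst.
func pickReference(_ frames: [CapturedFrame]) -> CapturedFrame {
    frames.max(by: { $0.sharpness < $1.sharpness })!
}

/// Average the standard exposures with the long exposure to build a
/// single low-noise "synthetic long" frame.
func syntheticLong(standard: [CapturedFrame], long: CapturedFrame) -> [Double] {
    let all = standard.map { $0.pixels } + [long.pixels]
    let count = Double(all.count)
    return (0..<long.pixels.count).map { i in
        all.reduce(0.0) { $0 + $1[i] } / count
    }
}

let standardShots = [
    CapturedFrame(pixels: [0.42, 0.51, 0.47], sharpness: 0.8),
    CapturedFrame(pixels: [0.40, 0.53, 0.45], sharpness: 0.9),
    CapturedFrame(pixels: [0.43, 0.50, 0.48], sharpness: 0.7),
]
let longExposure = CapturedFrame(pixels: [0.45, 0.52, 0.46], sharpness: 0.4)

print("Reference sharpness:", pickReference(standardShots).sharpness)
print("Synthetic long:", syntheticLong(standard: standardShots, long: longExposure))
```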

After this quick breakdown, the software does a pixel-by-pixel analysis of the two photos, which works out to 24 million pixels in total. The results of that analysis determine which pixels to use and optimise in building the final image. All of this information is captured and processed in the background once the iPhone’s A13 processor has a chance.
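
That per-pixel decision can be pictured as a weighted blend between the detailed reference frame and the low-noise synthetic long frame. In the sketch below, the fusePixels function and the detail weights are hypothetical; in the real pipeline, a machine learning model on the Neural Engine makes these choices in ways Apple has not disclosed.

```swift
// A hypothetical illustration of the pixel-by-pixel step: each output pixel
// is a weighted blend of the detailed reference frame and the low-noise
// synthetic long frame. The fusePixels function and the detail weights are
// assumptions for the example, not Apple's disclosed method.

/// Blend two equally sized pixel arrays. A detail weight near 1.0 favours
/// the reference (hair, textures); near 0.0 favours the synthetic long
/// (skies, flat walls).
func fusePixels(reference: [Double], syntheticLong: [Double], detailWeight: [Double]) -> [Double] {
    precondition(reference.count == syntheticLong.count &&
                 reference.count == detailWeight.count)
    return (0..<reference.count).map { i in
        detailWeight[i] * reference[i] + (1.0 - detailWeight[i]) * syntheticLong[i]
    }
}

// Example: the first pixel sits in a textured region, the last in a flat sky.
let referencePixels = [0.62, 0.48, 0.31]
let syntheticLongPixels = [0.58, 0.47, 0.33]
let detailWeights = [0.9, 0.5, 0.1]
print(fusePixels(reference: referencePixels,
                 syntheticLong: syntheticLongPixels,
                 detailWeight: detailWeights))
```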

Looks complex? Well, Apple says the entire process takes only a second or so, which means you don’t need to wait before clicking your next shot.
