Apple Shares iPhone 11 Pics Taken with ‘Deep Fusion’ in iOS 13 Beta

Apple detailed its Deep Fusion camera technology in the iPhone 11 and iPhone 11 Pro at its September event, but said it wasn’t ready at launch.

Now, the company has shared new photo samples taken with Deep Fusion, providing them in advance to media outlets such as The Verge and TechCrunch.

Deep Fusion is coming to a developer beta of iOS 13 today (it’s supposed to be available shortly). The new feature will only be available on devices with an A13 chip: the iPhone 11, iPhone 11 Pro and iPhone 11 Pro Max. Sorry, you’ll need to buy a new iPhone to play with Deep Fusion.

Apple previously said, “Deep Fusion, coming later this fall, is a new image processing system enabled by the Neural Engine of A13 Bionic. Deep Fusion uses advanced machine learning to do pixel-by-pixel processing of photos, optimizing for texture, details and noise in every part of the photo.”

According to TechCrunch, here’s the workflow for Deep Fusion:

The camera shoots a ‘short’ frame at a negative EV value (basically a slightly darker image than you’d like) and pulls sharpness from this frame. It then shoots three regular EV0 photos and a ‘long’ EV+ frame, registers alignment and blends those together.

This produces two 12MP photos (24MP of data) which are combined into one 12MP result photo. The combination is done using four separate neural networks which take into account the noise characteristics of Apple’s camera sensors as well as the subject matter in the image.
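Deep Fusion itself runs automatically and has no public API, but the capture side of that workflow resembles ordinary exposure bracketing, which third-party apps can already do with AVFoundation. Here’s a minimal Swift sketch of grabbing a similar short/EV0/long bracket; the EV bias values are illustrative, and `photoOutput` is assumed to be an AVCapturePhotoOutput already attached to a running capture session:

```swift
import AVFoundation

// Conceptual sketch only: Deep Fusion is automatic and has no public API.
// This captures a short (EV-), normal (EV0) and long (EV+) bracket,
// loosely mirroring the frames TechCrunch describes.
func captureDeepFusionStyleBracket(with photoOutput: AVCapturePhotoOutput,
                                   delegate: AVCapturePhotoCaptureDelegate) {
    // EV biases for the frames: one short, three normal, one long.
    // (Real devices also require biases within the supported exposure range.)
    let biases: [Float] = [-2.0, 0.0, 0.0, 0.0, 2.0]

    // Respect the hardware limit on bracket size.
    let frameCount = min(biases.count, photoOutput.maxBracketedCapturePhotoCount)

    let bracketedSettings = biases.prefix(frameCount).map {
        AVCaptureAutoExposureBracketedStillImageSettings
            .autoExposureSettings(exposureTargetBias: $0)
    }

    let settings = AVCapturePhotoBracketSettings(
        rawPixelFormatType: 0,  // processed output only, no RAW
        processedFormat: [AVVideoCodecKey: AVVideoCodecType.hevc],
        bracketedSettings: bracketedSettings)

    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```

The difference, of course, is what happens after capture: a bracketing app gets the frames back and has to merge them itself, while Deep Fusion does the merge invisibly on the A13’s Neural Engine.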

From there, machine learning takes over, analyzing the photo pixel by pixel (24 million pixels in all) and churning out a highly detailed image built from nine frames. The whole process happens automatically and takes about one second, longer than a Smart HDR photo, according to Apple.
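Apple hasn’t published how its networks weight the frames, but the basic idea of per-pixel selection can be illustrated with a toy blend: favour the sharp but darker ‘short’ frame where there is local detail, and the cleaner blended exposure where the scene is flat. This is purely an illustration of the concept, not Apple’s algorithm:

```swift
// Toy illustration of per-pixel fusion; not Apple's actual networks.
// Each output pixel leans toward the sharp "short" frame where local
// detail is high (edges, texture) and toward the brighter blended frame
// where the image is flat (skies, skin) — the trade-off Deep Fusion
// is optimizing for. Frames are flat grayscale buffers for simplicity.
func fuse(short: [Float], long: [Float], width: Int, height: Int) -> [Float] {
    var out = [Float](repeating: 0, count: short.count)
    for y in 0..<height {
        for x in 0..<width {
            let i = y * width + x
            // Crude detail measure: horizontal gradient in the short frame.
            let right = x + 1 < width ? short[i + 1] : short[i]
            let detail = min(abs(right - short[i]) * 8, 1)  // 0 = flat, 1 = edge
            out[i] = detail * short[i] + (1 - detail) * long[i]
        }
    }
    return out
}
```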

Check out some of the sample photos below shared by Apple:

Deep Fusion sample 1

Deep Fusion sample 2

Google is expected to announce its new Pixel 4 smartphone soon, and its camera will most likely compete with the iPhone 11 and iPhone 11 Pro. This appears to be Apple’s move to stay one step ahead of whatever may emerge from Google’s Pixel 4.

You can jump to the 1:22 mark to hear Apple executive Phil Schiller explain Deep Fusion, which he described as “computational photography mad science!”:

YouTube video
