Apple’s Portrait Mode on the iPhone 7 Plus is the Camera’s Best Feature


With the iPhone 7, Apple is taking mobile photography to the next level. The 4.7-inch iPhone 7 received some nice camera upgrades, but the 5.5-inch iPhone 7 Plus is truly a mobile photographer’s dream.

The iPhone 7 Plus is packed with two 12-megapixel cameras, one with a wide-angle lens and the other with a telephoto lens. Apple’s dual camera system is capable of taking absolutely stunning photos with brilliant color accuracy and crisp detail.

With the iPhone 7 Plus sporting two camera sensors, Apple’s flagship iPhone is capable of all sorts of photography tricks, including a new “Portrait” mode. The feature blurs the background of a photo and causes the main subject in the photo to really stand out.

Apple is able to achieve this because its dual camera system can measure varying degrees of depth within a photo. Apple’s software then does its magic, applying an artificial blur to the background while keeping the subject in sharp focus.
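
To make that idea concrete, here is a rough Python sketch (my own illustration, not Apple’s pipeline) of classic block matching: for each patch of the left image, find the horizontal shift that best lines it up in the right image. The function name and parameters are hypothetical.

```python
import numpy as np

def coarse_disparity(left, right, block=8, max_shift=16):
    """Toy block matching on two grayscale arrays of equal shape.
    For each block in the left image, find the horizontal shift that best
    matches the right image; a larger shift means the patch is closer."""
    left = left.astype(np.float32)
    right = right.astype(np.float32)
    h, w = left.shape
    disp = np.zeros((h // block, w // block), dtype=np.float32)
    for by in range(h // block):
        for bx in range(w // block):
            y, x = by * block, bx * block
            patch = left[y:y + block, x:x + block]
            best_shift, best_cost = 0, np.inf
            for s in range(min(max_shift, x) + 1):
                candidate = right[y:y + block, x - s:x - s + block]
                cost = np.abs(patch - candidate).sum()  # sum of absolute differences
                if cost < best_cost:
                    best_cost, best_shift = cost, s
            disp[by, bx] = best_shift
    return disp
```

The bigger the shift a patch needs to line up, the closer it is to the camera; that per-patch depth estimate is the kind of signal a background blur can key off.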

The blog Prolost has called the Portrait mode feature “legit.” All of the effects are achieved with techniques from computational photography.

“I’m pretty sure that with only two cameras, you can’t build a useful light field. But can you do computational photography? That’s a trick question, as the iPhone, and many other mobile phone cameras, are already doing computational photography. Already the iPhone will automatically perform an HDR merge of two exposures, for example. But even when the iPhone snaps a single, non-HDR exposure, the amount of post-processing it does is considerable.”
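
The HDR merge mentioned above is itself a simple example of computational photography: combine a short and a long exposure so that highlights come from one frame and shadows from the other. The snippet below uses an assumed, illustrative weighting scheme (a bell curve around mid-grey), not a description of how iOS actually merges exposures.

```python
import numpy as np

def merge_exposures(dark, bright):
    """Blend a short (dark) and a long (bright) exposure, both float arrays
    scaled to [0, 1]. Pixels near mid-grey get the most weight, so blown
    highlights come mostly from the dark frame and noisy shadows from the
    bright one."""
    def weight(img):
        # Bell-shaped weight centred on 0.5, i.e. "well exposed"
        return np.exp(-((img - 0.5) ** 2) / (2 * 0.2 ** 2))

    w_dark, w_bright = weight(dark), weight(bright)
    return (w_dark * dark + w_bright * bright) / (w_dark + w_bright)
```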


The feature is available in the new iOS 10.1 beta for anyone interested in checking it out. Here is what the blog says about the feature:

“Apple calls their initial two-lens computational photo offering Portrait Mode, and the most recent developer beta of iOS 10.1 includes a beta version of it. Under the right circumstances, this mode enables a so-called “Depth Effect,” where both cameras fire simultaneously, and a depth map is built based on the subtle stereo disparity between the captured images. This nine-level depth map is used to drive a variable-radius blur. The result is a photo with simulated shallow depth of field.

This process can never be perfect, but can it be good enough?

Oh hell yes it can.”
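
The core idea in that description is easy to prototype: quantize a per-pixel depth map into nine bands and composite the photo from progressively blurrier copies of itself. The sketch below assumes a grayscale image and a depth map that grows with distance; the band count, blur radii, and hard-edged compositing are illustrative guesses, not how Apple’s Depth Effect is implemented.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def depth_effect(image, depth, levels=9, max_sigma=8.0):
    """Quantize a depth map into `levels` bands and composite the image from
    progressively blurrier copies of itself. Assumes a grayscale float image
    and a depth map of the same shape that increases with distance, so band 0
    (the subject) stays sharp and the farthest band gets the strongest blur."""
    img = image.astype(np.float32)
    d = (depth - depth.min()) / (depth.max() - depth.min() + 1e-6)
    bands = np.minimum((d * levels).astype(int), levels - 1)

    out = np.zeros_like(img)
    for lvl in range(levels):
        sigma = max_sigma * lvl / (levels - 1)  # level 0 -> no blur
        layer = img if sigma == 0 else gaussian_filter(img, sigma)
        out[bands == lvl] = layer[bands == lvl]
    return out
```

A real implementation would also have to soften the boundaries between bands and handle depth errors around hair and edges, which is where much of the difficulty lies.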

Be sure to read the full post about Portrait mode on the iPhone 7 Plus running iOS 10.1 beta. If you are running the beta and you have an iPhone 7 Plus, what do you think of the Portrait feature? Let us know in the comments below.


  • Quattro

    It’s amazing how the guy in the source post is trying so desperately to justify a poor-quality “effect”. He’s just so wrong in his assertions that, despite his background, he comes across as a totally unqualified hack.

  • Kris

    A good read, thanks for posting the link, Nick

  • Tim

    Apple is overselling what is essentially a software effect rather than some kind of groundbreaking leap in phone photography. Other phones, with one or two cameras, have been doing this for some time. Without looking, I’d wager there’s already something in the App Store that can do this. I’m not impressed and glad I didn’t upgrade for the new camera system. If it starts to do video on the fly, I’ll take another look.

  • Verne Arase

    I don’t think so.

    Up until now, I don’t think any phone has had the oomph to compute nine different depth fields and apply increasing blur to each depth field.

    The others in the past just applied one blur to the entire background (which made it less convincing).

    In fact, even with modern processors I’m not sure others will have the ability to duplicate the feat, at least without Apple silicon to run it on.