Apple Engineers Reveal iPhone 12 Camera Design Process

In a new interview with PetaPixel, two Apple engineers have revealed the thinking behind the Cupertino company’s iPhone camera design philosophy.

Francesca Sweet, product line manager for the iPhone, and Jon McCormack, vice president of camera software engineering, emphasized that they work across the entire design in order to make taking photos simpler.

“As photographers, we tend to have to think a lot about things like ISO, subject motion, et cetera,” Jon McCormack said. “And Apple wants to take that away to allow people to stay in the moment, take a great photo, and get back to what they’re doing.”

“It’s not as meaningful to us anymore to talk about one particular speed and feed of an image, or camera system,” he continued. “We think about what the goal is, and the goal is not to have a bigger sensor that we can brag about.”

“The goal is to ask how we can take more beautiful photos in more conditions that people are in,” he said. “It was this thinking that brought about Deep Fusion, Night Mode, and temporal image signal processing.”

Apple’s goal is to automatically “replicate as much as we can… what the photographer will [typically] do in post.” With machine learning, for example, Apple’s cameras break down an image into individual elements that can then be processed separately.

“The background, foreground, eyes, lips, hair, skin, clothing, skies,” lists McCormack. “We process all these independently like you would in [Adobe] Lightroom with a bunch of local adjustments. We adjust everything from exposure, contrast, and saturation, and combine them all together.”
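
The interview doesn’t describe Apple’s internal pipeline, but the “segment, adjust locally, recombine” idea McCormack sketches can be roughly illustrated with Apple’s public Vision and Core Image APIs. The sketch below is an assumption-laden approximation, not Apple’s method: it splits a photo into just two regions (person and background) rather than the many regions McCormack lists, and the adjustment values are hypothetical.

```swift
import CoreImage
import Vision

// Illustrative sketch only: approximates the "process regions independently,
// then combine" workflow using public APIs (person segmentation requires
// iOS 15+ / macOS 12+). Adjustment values are hypothetical.
func adjustSubjectAndBackground(of image: CIImage) throws -> CIImage {
    // 1. Segment the image. A single person/background split stands in for
    //    the many regions (skin, sky, hair, ...) mentioned in the interview.
    let request = VNGeneratePersonSegmentationRequest()
    request.qualityLevel = .accurate
    let handler = VNImageRequestHandler(ciImage: image, options: [:])
    try handler.perform([request])

    guard let maskBuffer = request.results?.first?.pixelBuffer else {
        return image // no subject found; leave the photo untouched
    }

    // Scale the (typically lower-resolution) mask up to the image size.
    var mask = CIImage(cvPixelBuffer: maskBuffer)
    let scaleX = image.extent.width / mask.extent.width
    let scaleY = image.extent.height / mask.extent.height
    mask = mask.transformed(by: CGAffineTransform(scaleX: scaleX, y: scaleY))

    // 2. Process each region independently, like local adjustments in Lightroom.
    let subject = image.applyingFilter("CIColorControls", parameters: [
        kCIInputSaturationKey: 1.05,
        kCIInputContrastKey: 1.02,
    ])
    let background = image.applyingFilter("CIExposureAdjust", parameters: [
        kCIInputEVKey: -0.3, // pull the background down slightly
    ])

    // 3. Combine the independently processed regions using the mask.
    return subject.applyingFilter("CIBlendWithMask", parameters: [
        kCIInputBackgroundImageKey: background,
        kCIInputMaskImageKey: mask,
    ])
}
```

In a real pipeline this kind of work happens per region and per frame, fused with data from multiple exposures; the sketch only shows the basic structure of segmenting, adjusting, and blending.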

Sweet also mentions the benefits of having a larger camera sensor. “The new wide camera [of the iPhone 12 Pro Max] [and] improved image fusion algorithms make for lower noise and better detail.”

“With the Pro Max we can extend that even further because the bigger sensor allows us to capture more light in less time, which makes for better motion freezing at night,” she continued.
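
The reasoning here is a simple proportionality, not something spelled out in the interview: if a sensor gathers light roughly in proportion to its area, a larger sensor can collect the same amount of light in a shorter exposure, and shorter exposures freeze motion better. A back-of-the-envelope sketch, with a hypothetical area gain:

```swift
// Rough arithmetic only; the 1.5x area gain below is a made-up example,
// not a figure from Apple or the interview.
func equivalentExposureTime(baselineSeconds: Double, sensorAreaGain: Double) -> Double {
    // Same total light collected in proportionally less time.
    baselineSeconds / sensorAreaGain
}

let newTime = equivalentExposureTime(baselineSeconds: 1.0 / 30.0, sensorAreaGain: 1.5)
// newTime ≈ 0.022 s (about 1/45 s), so moving subjects blur less at night.
```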

The engineers believe that Apple’s control over every element, from lens to software, is vital.

“We don’t tend to think of a single axis like ‘if we go and do this kind of thing to hardware’ then a magical thing will happen,” said McCormack. “Since we design everything from the lens to the GPU and CPU, we actually get to have many more places that we can do innovation.”
