Apple Execs Talk iPhone 13 Camera System Changes in New Podcast [VIDEO]

In the latest episode of The Stalman Podcast, Tyler Stalman interviews Apple executives about the iPhone 13 camera system.

Published on Monday, the “iPhone 13: Talking to the Camera Engineers” episode of The Stalman Podcast features a trio of Apple representatives: Kaiann Drance, VP of Worldwide iPhone Product Marketing; Jon McCormack, VP of Camera Software Engineering; and Graham Townsend, VP of Camera Hardware Engineering.

The episode is wide-ranging, touching on all of the new cameras across the iPhone 13 lineup. The conversation dives not only into the hardware of the new cameras but also into how the software gets the most out of those new components.

YouTube video

The Apple executives discuss how, because Apple owns the entire stack from the photons hitting the sensor to the finished photo, the company can take full advantage of the hardware, firmware, and software that run its cameras. It’s especially interesting to hear them explain how they took what they learned from Portrait mode and, with the power of the A15 Bionic chip, built Cinematic mode.
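Apple doesn’t expose Cinematic mode’s internals, but AVFoundation’s depth-delivery API is the closest public analogue to the per-pixel depth data that Portrait-style effects are built on. Here is a minimal Swift sketch, assuming a depth-capable rear camera module; the class name and structure are illustrative, not Apple’s implementation:

```swift
import AVFoundation

// A minimal sketch (not Apple's internal pipeline): the public depth-delivery
// API supplies the kind of per-pixel depth map that Portrait-style
// background blur is built on.
final class DepthCaptureController: NSObject, AVCapturePhotoCaptureDelegate {
    let session = AVCaptureSession()
    let photoOutput = AVCapturePhotoOutput()

    func configure() {
        session.beginConfiguration()
        // Dual/triple rear camera systems (or the TrueDepth front camera)
        // can supply depth data alongside the photo.
        guard let device = AVCaptureDevice.default(.builtInDualWideCamera,
                                                   for: .video,
                                                   position: .back),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else {
            session.commitConfiguration()
            return
        }
        session.addInput(input)
        if session.canAddOutput(photoOutput) {
            session.addOutput(photoOutput)
            // Ask the pipeline to deliver an AVDepthData map with each photo.
            photoOutput.isDepthDataDeliveryEnabled =
                photoOutput.isDepthDataDeliverySupported
        }
        session.commitConfiguration()
    }

    func capture() {
        let settings = AVCapturePhotoSettings()
        settings.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliveryEnabled
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        // photo.depthData holds a per-pixel disparity/depth map that an app
        // could use to blur the background, Portrait-style.
        if let depth = photo.depthData {
            print("Got depth map:", depth.depthDataMap)
        }
    }
}
```

On hardware without a depth-capable camera, isDepthDataDeliverySupported returns false and the sketch falls back to a regular photo capture.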

“Since we own the entire stack, from photons to JPEG if you will, we can choose the optimal place in the pipeline to deliver specific benefits,” Townsend adds. For example, the sensor-shift stabilization is responsive enough to steady even a single second of video, providing the raw, accurate image data that the software team can then build on.

According to McCormack, machine learning has progressed considerably, especially given the amount of processing power the A15 now provides. “This really speaks to the amount of processing power in the iPhone, and in fact we’ve got so much processing power now that we’re able to take these same computational photography techniques and introduce them in the video world to bring computational videography.”

“Really, we are now applying all of the same machine learning magic we learned in stills to video.” McCormack says the iPhone now “segments each frame in real-time, and we process the sky and the skin and foliage individually, and this takes our already industry-leading video and makes it even better by giving us better clarity and more detail in different parts of the image.”
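Apple’s sky/skin/foliage segmentation isn’t exposed as a public API, but Vision’s person-segmentation request gives a feel for the per-frame masking McCormack describes, just with a single category (people) instead of several. A minimal Swift sketch, assuming a video frame arrives as a CVPixelBuffer (the function name is hypothetical):

```swift
import Vision
import CoreVideo

// A rough illustration of per-frame segmentation, not Apple's camera
// pipeline: Vision's public person-segmentation request produces a matte
// for one category; the engineers describe doing the same for sky, skin,
// and foliage inside the capture pipeline itself.
func segmentationMask(for frame: CVPixelBuffer) throws -> CVPixelBuffer? {
    let request = VNGeneratePersonSegmentationRequest()
    request.qualityLevel = .balanced          // speed/quality trade-off per frame
    request.outputPixelFormat = kCVPixelFormatType_OneComponent8

    let handler = VNImageRequestHandler(cvPixelBuffer: frame, options: [:])
    try handler.perform([request])

    // The result is an 8-bit matte: ~255 where a person was detected,
    // ~0 elsewhere. Each masked region could then be tone-mapped,
    // sharpened, or denoised with its own parameters.
    return request.results?.first?.pixelBuffer
}
```

Running a request like this on every frame of a 30 fps stream is exactly the kind of workload McCormack credits the A15’s processing headroom with making feasible.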

Check out the entire interview in the video above.
