Apple Explains How iPhone 13’s Cinematic Mode Was Developed

In an exclusive interview with TechCrunch, Apple VP Kaiann Drance and Human Interface Team designer Johnnie Manzari have shared the details of how the new Cinematic mode in iPhone 13 was developed.


“We knew that bringing a high-quality depth of field to video would be magnitudes more challenging [than Portrait Mode],” said Drance. She added that unlike photos, “video is designed to move as the person filming, including hand shake.”

Drance went on to explain that even higher-quality depth data was required so Cinematic Mode could work across subjects — people, pets, and objects alike. “We needed that depth data continuously to keep up with every frame. Rendering these autofocus changes in real time is a heavy computational workload.”

Drance said that before development began, Apple’s design team spent time researching cinematography techniques to understand realistic focus transitions and optical characteristics.

“When you look at the design process,” said Manzari, “we begin with a deep reverence and respect for image and filmmaking through history. We’re fascinated with questions like: what principles of image and filmmaking are timeless? What craft has endured culturally and why?”

You can read the lengthy interview in its entirety at the source.