Netflix Engineers Hack Face ID, Leverage Eye Tracking to Control iOS App [VIDEO]

Netflix recently held its annual Hack Day, where employees showed off their latest ‘hacks’ “that ranged from making improvements to the product, to enhancing our internal tools, to just having some fun.”

[YouTube video]

The coolest hack leveraged Apple's Face ID technology on the iPhone to enable eye-tracking control and navigation of the Netflix iOS app. Based on where the user was looking, a yellow cursor appeared on the display, allowing hands-free navigation. A facial gesture, sticking out your tongue, was also implemented to dismiss a screen.
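Netflix hasn't published the Eye Nav source, but the eye-driven cursor can be approximated with ARKit's face tracking, which runs on the same TrueDepth camera that powers Face ID. The sketch below is a minimal, assumption-laden take: class and property names like EyeCursorViewController and cursorView are illustrative, and it simply projects ARFaceAnchor's lookAtPoint onto the screen to move a yellow dot.

```swift
import ARKit
import UIKit

// Hypothetical sketch, not Netflix's actual Eye Nav code: moves a yellow cursor
// to roughly where the user is looking, using ARKit face tracking.
final class EyeCursorViewController: UIViewController, ARSessionDelegate {
    private let session = ARSession()
    private let cursorView = UIView(frame: CGRect(x: 0, y: 0, width: 24, height: 24))

    override func viewDidLoad() {
        super.viewDidLoad()
        cursorView.backgroundColor = .yellow
        cursorView.layer.cornerRadius = 12
        view.addSubview(cursorView)

        session.delegate = self
        // Face tracking needs the TrueDepth camera, i.e. the hardware behind Face ID.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let face = frame.anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }

        // lookAtPoint is where the eyes converge, expressed in the face anchor's
        // coordinate space. Move it into world space, then project it to screen points.
        let worldPoint = face.transform * simd_float4(face.lookAtPoint, 1)
        let screenPoint = frame.camera.projectPoint(simd_make_float3(worldPoint),
                                                    orientation: .portrait,
                                                    viewportSize: view.bounds.size)
        DispatchQueue.main.async {
            self.cursorView.center = screenPoint
        }
    }
}
```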


This ingenious hack, called Eye Nav, was created by Netflix employees and Senior iOS UI Engineers John Fox and Ben Hands, along with Senior Software Engineer Steve Henderson:

Apple’s ARKit is a lot of fun to play with, and has enabled much-loved features like Animoji. We care a lot about Accessibility, so we’re eager to try a hack that would allow people to navigate the iOS app just by moving their eyes. The same technology that enables Face ID is great for accurately tracking eye position and facial expression. We used eye tracking to move the pointer around the screen, and measured the time spent on the same area to trigger the equivalent of a tap. We then used a facial gesture (tongue sticking out) to dismiss a screen. We’re hopeful that this kind of technology will become a part of mainstream Accessibility APIs in the future.
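The quote describes two interactions beyond cursor movement: dwelling on the same spot to trigger a tap, and sticking out the tongue to dismiss a screen. A rough sketch of that logic is below; the class name, thresholds, and callbacks are guesses for illustration, and ARKit's .tongueOut blend shape coefficient (0 to 1) stands in for the facial gesture. It assumes the cursor position from the earlier sketch is fed in every frame.

```swift
import ARKit
import UIKit

// Illustrative dwell-to-tap and tongue-gesture helper, not Netflix's implementation.
final class DwellGestureRecognizer {
    private var dwellStart: Date?
    private var dwellOrigin: CGPoint = .zero
    private let dwellRadius: CGFloat = 30          // how far the cursor may drift (points)
    private let dwellDuration: TimeInterval = 1.0  // hover time before firing a "tap"
    private let tongueThreshold: Float = 0.8       // blend shape coefficients run 0...1

    var onTap: ((CGPoint) -> Void)?
    var onDismiss: (() -> Void)?

    func update(cursor: CGPoint, blendShapes: [ARFaceAnchor.BlendShapeLocation: NSNumber]) {
        // Sticking the tongue out dismisses the current screen.
        if let tongue = blendShapes[.tongueOut], tongue.floatValue > tongueThreshold {
            dwellStart = nil
            onDismiss?()
            return
        }

        // Hovering in roughly the same spot for long enough counts as a tap.
        if let start = dwellStart,
           hypot(cursor.x - dwellOrigin.x, cursor.y - dwellOrigin.y) < dwellRadius {
            if Date().timeIntervalSince(start) >= dwellDuration {
                dwellStart = nil
                onTap?(cursor)
            }
        } else {
            dwellStart = Date()
            dwellOrigin = cursor
        }
    }
}
```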

Check out the video below. It would be a big win for Accessibility if Apple ever allows this kind of control in the future:

[YouTube video]
