Epic Games Uses iPhone to Demo New Motion-Capture Animation Tech

During its State of Unreal presentation, Epic Games demonstrated its latest machine-learning tech, which generated impressively realistic facial animation within minutes from a video shot on an iPhone.

The speed and fidelity of the result drew a huge round of applause from the developers gathered at the Yerba Buena Center for the Arts.

According to Ars Technica, Epic’s upcoming MetaHuman facial animation tool could revolutionize motion capture and similarly time-intensive workflows.

Getting such high-end results from this kind of basic input “has literally changed how [testers] work or the kind of work they can take on,” said Epic VP Vladimir Mastilovic.

Even realistic tongue movements are extrapolated from the captured audio, using an “audio-to-tongue” algorithm that “is what it sounds like,” Mastilovic added.

Epic also showed how all those facial tics could be applied to any model built on the same MetaHuman standard.

Watch the demo below, in which Melina Juergens’ performance is transformed into a stunningly accurate 3D animation in just minutes.

YouTube video

And here’s a performance-focused trailer that hasn’t been polished or edited in any way; it took MetaHuman Animator just minutes to process, start to finish.

YouTube video