Being Transported by Amazon Studios Virtual Productions at Stage 15

Amazon Studios Virtual Productions (ASVP) is widely touted as the company’s cutting-edge foothold in Hollywood filmmaking. Positioned to help reshape the production of the latest films and television series, ASVP offers camera-to-cloud technological workflows.
I took a trip to the Amazon MGM Studios lot in Culver City, California. Here, a large LED wall (often known as The Volume) sits within Stage 15. Composed of 3,000 LED panels and 100 motion-capture cameras, this 34,000-square-foot production space enables filmmakers to create backdrops and visuals that simulate on-site shooting locations. The production stage is flexible and malleable enough to fit the needs of any filmmaker and production crew. Rather than shooting on location and contending with weather, shifts in natural lighting, and set rebuilds, ASVP offers full control and manipulation in real time.
Virtual Production has become more widely adopted since its advent in 2019, when Disney’s The Mandalorian became the first major production to use a Volume wall. Stage 15’s LED enclosure is 80 feet in diameter and 26 feet tall. Originally built in 1940, the stage has been home to productions such as RoboCop, Airplane!, The Three Amigos, and Armageddon. Amazon re-opened Stage 15 for virtual production in 2022, and it has since been used for Prime Video originals such as Candy Cane Lane.

I’ve seen the Volume used across many of today’s biggest films and shows. From The Lord of the Rings: The Rings of Power to The Batman and Thor: Love and Thunder, I’ve seen the technology used to varying effect. When a production is well orchestrated from top to bottom, Virtual Production can be seamless and realistic. When the lighting or backdrop assets are even slightly off, it’s noticeable in a theatre or at home. Coming to Stage 15 gave me real insight into how the technology works and what separates a great Virtual Production from a poor one.
Backdrops are generated using Unreal Engine 5. Through a deep partnership with Epic Games, the publisher of Fortnite, ASVP can create interactive and reactive sets for any production. The Volume wall can be adjusted as performers work. By design, actors can see their environments and can act and interact with their sets in real time. This offers inherent benefits over green and blue screens, where actors must take cues from the director and imagine their surroundings. In an on-stage demo, I was given a small flashlight prop with sensors built into it. In front of me was a giant cave backdrop rendered onto the Volume wall. As if I were an actor on set, I could move the flashlight, and the array of cameras across the stage tracked the movement, casting a virtual light onto the Volume wall in real time. The reactive technology stood as a great example of how immersive technology can give way to better performances from actors.
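The flashlight demo boils down to a tracking-to-render loop: the cameras solve the prop’s position and orientation each frame, and the engine projects a virtual beam onto the wall where the prop is pointing. As a rough conceptual sketch (not ASVP’s actual pipeline; the function and vector layout are my own assumptions), the core geometry is a ray–plane intersection:

```python
def dot(a, b):
    """Dot product of two 3D vectors represented as tuples."""
    return sum(x * y for x, y in zip(a, b))

def beam_hit_on_wall(prop_pos, prop_dir, wall_point, wall_normal):
    """Return the point where the tracked prop's beam ray meets the
    wall plane, or None if the beam is parallel to the wall or points
    away from it. All arguments are 3D tuples; prop_dir and
    wall_normal are assumed to be unit vectors."""
    denom = dot(prop_dir, wall_normal)
    if abs(denom) < 1e-9:
        return None  # beam runs parallel to the wall plane
    # Solve prop_pos + t * prop_dir lying on the plane for t
    t = dot(tuple(w - p for w, p in zip(wall_point, prop_pos)), wall_normal) / denom
    if t <= 0:
        return None  # the wall is behind the prop
    return tuple(p + t * d for p, d in zip(prop_pos, prop_dir))

# Prop at the origin pointing straight at a wall 10 units away
hit = beam_hit_on_wall((0, 0, 0), (0, 0, 1), (0, 0, 10), (0, 0, -1))
```

In a real system this runs every frame, with the hit point driving the placement of a virtual light in the engine rather than being returned as a tuple.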
“We have a great relationship and a partnership with Epic,” Chris Del Conte, Global Head of VFX for Amazon Studios, says. “We’re able to say, ‘In the next version, could you do this?’ We have a director asking to do snap zooms and stuff like that. One of the projects we’re talking about was about the sizing of things or even Honey, I Shrunk the Kids. I can do that now and rescale the entire background, just make everybody miniature on a set. So even things that would take a long time and a long way to build practical sets, you can do a lot easier on a Volume wall.”

Every shot taken on Stage 15 is sent to Amazon Web Services (AWS) S3 storage. Integral to the camera-to-cloud workflow, AWS is the backbone of many of Prime Video’s projects. Dailies from Stage 15 can be remotely accessed by creative teams worldwide, enabling a workflow that isn’t disrupted by lag time, location, or access. Editorial teams, VFX artists, and post-production partners can view what’s shot and continue work or leave notes with very little delay.
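The article doesn’t detail how footage is organized in S3, but a camera-to-cloud setup typically keys each clip by production, shoot date, and camera so remote teams can browse dailies predictably. A minimal sketch, with a key scheme that is entirely my own invention:

```python
from datetime import date

def dailies_key(production, shoot_date, camera, clip, ext="mov"):
    """Build a hypothetical S3 object key for one clip of dailies,
    e.g. 'candy-cane-lane/dailies/2023-03-14/A/A001C003.mov'.
    The path layout here is an illustrative assumption, not ASVP's."""
    slug = production.lower().replace(" ", "-")
    return f"{slug}/dailies/{shoot_date.isoformat()}/{camera.upper()}/{clip}.{ext}"

# Uploading would then use boto3's documented S3 API, e.g.:
# boto3.client("s3").upload_file(local_path, bucket, dailies_key(...))
```

Once keys are deterministic like this, editorial tools can pull one day’s footage with a single prefix listing rather than searching the whole bucket.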
“It allows us to create efficiencies in the production schedule,” Katrina King, Global Strategy Leader, told me. “If you think about it like this: how many times have we changed that backdrop today, right? To be able to do that in the real world, we would have had 20 or 30 trucks driving around LA from one scene to another, disrupting traffic, disrupting local communities, and dumping tons of carbon into the atmosphere. Using virtual production, we can move from scene to scene to scene. And not only does that create those efficiencies as far as driving around town is concerned, it also creates efficiencies for the production schedule. So we can compress the production schedule while producing more, higher-quality content.”
AWS also builds in support for generative AI. For teams working with Virtual Production on AWS, virtual backdrops can theoretically be generated from text prompts, assisting the art department. Film crews can also leverage AWS and generative AI to visualize where cameras and lights may need to go before arriving on set. Generative AI is expected to be introduced progressively as more AWS-powered productions integrate it into their media supply chains. “What we’re starting to see with generative AI is that we’re trying to promote use cases that use it responsibly,” King says. “That allows artists to focus on creating, eliminating those mundane technical details from the role and freeing artists up to focus on what goes on the screen at the end of the day, rather than on how they get there.”

One thing I had never considered about Virtual Production was how actors and crew might be affected by motion sickness on the stage. Within moments of walking inside the Stage 15 Volume wall enclosure, I began feeling a little light-headed, akin to the early side effects of using a VR headset. As I walked around the set and took in the static virtual backdrops, the view didn’t shift the way a real environment would, and the disconnect was disorienting. I was not alone. “It happens a lot with the talent,” Del Conte said. “Especially when we get into driving shots, when you’re moving that wall and have footage rolling. Eddie Murphy on Candy Cane Lane: we shot him driving around. All the driving scenes were on that wall. The first few times, he got motion sickness. It’s kind of like sitting on a train when the train next to you moves. We kept telling Eddie, look at the dashboard.”
Standing within the four walls of Stage 15, I gained a greater appreciation of how crews can leverage Virtual Production and the Volume for deeper immersion. However, it’s still relatively early days for this budding technology, and directors and visionaries have yet to fully realize its potential. “It means we’re going to start shooting in different ways,” Del Conte states. “In the Francis Ford Coppola movie [Megalopolis], he shot a lot on the Volume. I’m dying to see what he did. I mean, he’s been an innovative director for his entire career, so I’m curious to see how he used the Volume wall. It’s probably a different way than other people are doing it. So every artist out there who wants to is taking this technology, embracing it, and pushing it.”
Amazon Studios is revving up for a big summer season as The Lord of the Rings: The Rings of Power Season 2 begins streaming on Prime Video on August 29, 2024.
[Interviews have been edited for clarity and length]
