The metaverse has been heavily hyped – but it could enable new ways to produce screen content
Film production has been hit hard by the pandemic, with releases delayed and productions halted or canceled. One day we might even get to see Mission: Impossible 7.
But, like your typical screen hero, the metaverse just might come to the rescue. Let us explain.
What is the metaverse, and why is it important?
Mark Zuckerberg, CEO of Meta (formerly Facebook), presents the metaverse as the future of human interaction, where the alignment of virtual and augmented realities (VR and AR) allows us to work, rest and play in a Second Life-style virtual world, accessed via a screen or superimposed (via special glasses) on the real world.
But how does this help us create our favorite screen content during a pandemic? Or the next global emergency?
What we can do now
Traditional production relies on the cast and crew being in the same place at the same time. The past two years have shown that there is a strong need to be able to either shoot movies where the cast/crew are in separate locations, or where the production space is partially or entirely in a virtual space (like the Lion King remake).
What we can do now, even with a nascent metaverse, is significant. Current tools of the trade include deepfakes, which use machine-learning techniques to seamlessly stitch anyone in the world into a video or photo, and game engines (such as Unreal Engine) that create places and avatars.
Disney’s studio The Volume, home of The Mandalorian, uses the latest of this technology to brilliant effect. In The Mandalorian, high-definition digital screens line the walls and ceiling, providing the perfect background, perspective and lighting, using a mix of real and fully computer-generated imagery.
Working with the caveat that money is no object, here’s how these technologies can currently be deployed to tackle the two most pressing production issues in a post-COVID world.
Problem 1: the director in one place, the cast and crew in another
If this were the Lion King remake, director Jon Favreau could simply access the virtual environment remotely using a VR headset from his home media room. For other productions, the director can interact with an actor via AR glasses the actor puts on between takes, making it seem as though the director is in the room.
In this way, the function of the media room evolves, becoming a home communication center with a range of cameras and screens. It’s already happening and it’s something big tech is looking to accelerate. Products such as Microsoft Mesh for Teams are rapidly being rolled out, where mixed reality enables three-dimensional holographic interaction for meetings and collaboration.
Problem 2: the director, star and co-star are all in different places
Starting today, we can:
(a) film each actor separately with different crews in front of a green screen, then match the backgrounds (but the actors will have no interaction).
(b) use AR glasses so the actors can see each other, then digitally remove the glasses in post-production – much as Justice League digitally removed Henry Cavill’s mustache.
(c) use two human stand-ins and deepfake technology to replace their faces. This is useful if the actors need to touch each other.
However, all of these approaches share the same drawback: the actor.
Until we can perfect both the realism of the human and the realism of the performance (just look at the brilliant but not-quite-convincing Luke Skywalker, rendered without Mark Hamill, in The Book of Boba Fett), the metaverse will not have fully realized its potential as a true alternative environment for screen production.
The latest iteration of young Luke Skywalker was generated from a combination of a physical stand-in (not Mark Hamill) and deepfake technology. It looked physically perfect – until “Luke” started talking, which is why most of the dialogue was delivered off-camera. There was also a strong sense of the uncanny valley, a term originally coined for the negative emotional response to robots that seem “almost” human.
Forward to the future
The day of the perfect human avatar could be coming very soon. It was foretold by novelist and futurist Michael Crichton – not in Westworld or Jurassic Park, but in his obscure 1981 film Looker. The story centers on technology that scans and animates actors, allowing them to retire and simply manage their image rights.
In this proposed near future, COVID is not a concern, nor is the death of an actor during production. All movies can be made like The Lion King, in a virtual environment.
Actors will remotely control their avatars from their media rooms – or maybe not. In the future, Mark Hamill may have two prices: one where he turns up in person, and another where only his digital twin is used – one that can procedurally generate his performance by analyzing all of his films to determine which acting choices to imitate.
Just because we can, should we?
History shows that new technologies rarely replace old ones wholesale, and old technologies never completely die. Think vinyl. What is more likely is a kind of reverse snobbery. Many shows will make full use of the metaverse, allowing them to keep running despite real-world calamities.
Perhaps a whole new hybrid genre will emerge: films that might once have been animations can now be photorealistic – call them “live animations”.
But in a future where most of us eat lab-grown meat, only the best restaurants will still use real animals. The same likely goes for screen production: the ultimate prestige picture will be made the old-fashioned way, with real actors really acting opposite each other in real environments – pandemics and the metaverse be damned.