Last month, the FuseBox joined forces with BRiGHT BLACK Productions for a FuseBox original Talking Tech event. Attendees took a dive into the ever-expanding and exciting world of virtual production. BRiGHT BLACK Productions is a playful media production company specialising in virtual reality, AR, music and immersive theatre. The company’s creators and directors are FuseBox residents Simon Wilkinson and Myra Appanah. The creative duo hosted the FuseBox event, where they explained what virtual production is, why it’s used, the essential components that make it possible and some of the challenges it presents, plus industry insights. In case you missed it, we’ve pulled together some of the best bits from this incredible event. 

What is Virtual Production?

For decades, cinematography has seen first-hand how creativity and technology fuse together. Through this partnership, virtual production was born, creating a space where physical and digital worlds could meet. 

Today, directors and creators are able to produce exciting and innovative virtual environments and characters. Though virtual production has been around for years, a pivotal moment for its development was the creation of James Cameron’s Avatar, the science-fiction blockbuster released in 2009. 

A decade later, the technology has only grown more advanced. The Mandalorian, a relatively recent TV series set in the Star Wars franchise, is a prime example of virtual production today.  


    Real-time, virtual production tools from "The Mandalorian". Image sourced here.

“You'll notice the Mandalorian has a very nice shiny outfit, not like high gloss chrome where you might see your face in it because that would be a disaster. This stainless steel is shiny [using virtual production], so you get nice reflections and nice light. That's a large part of the reason why the Mandalorian was such a good fit for virtual production.”

To create such detailed and believable settings and characters, The Mandalorian uses a curved LED wall, 70 feet long by 20 feet high, creating a huge space for a performer to work in. This is the physical set, and the LED wall itself acts as a light source, producing real-time reflections. Scenes are rendered by a real-time engine, which supplies the background and the scenography, completing the illusion. 

  Simon Wilkinson at the FuseBox, presenting Talking Tech on Virtual Production

So, why use VP?

Production is expensive, there’s no doubt about it. However, through virtual production, the BRiGHT BLACK creators were able to ‘hack’ a process that normally costs around £50,000 per day in equipment and expertise into a show that costs substantially less. 

As well as being significantly more cost-effective than on-location filming, virtual production also provides more room for creativity in terms of lighting, backdrops, real-time changes and effects. The flexibility virtual production provides can also be more time-efficient and sustainable. 

“Rather than flying 50 people to a desert somewhere on another continent, you can go into a studio in London or nearby and you can hire a room with a bunch of people in it, who are very skilled.” 

Another advantage of virtual production is that what goes through the lens is the final pixel, rather than a green screen. When a production shoots against a green screen, there is a lot of room for error: if someone is wearing the wrong colour, parts of bodies can be lost in the composite, and stray reflections can appear. The footage also has to be sent away to have the background and effects added, creating a huge lag in the production timeline. ‘Final pixel’ means achieving final image quality live, in-camera, without the need for major visual-effects adjustments afterwards. 

“When you’re looking through the camera, you’re seeing real-time reflections and whatever lighting you want to shoot in”. 

Virtual production essentials

  1. Real-time engine 

Unity, originally built on the powerful real-time rendering technology of the gaming industry, is one of the most commonly used real-time engines. 

“The whole challenge of that virtual art department is to render real time assets in the best quality as possible for the wall to run.”

A real-time engine enables your project to be ‘live’ as soon as you begin production. This way, you can experiment and make changes to the lighting, characters, and camera positions as you work. Unity and some other platforms allow the creator to harness real-time 3D content. 

  2. Camera tracking 

Tracking devices capture the real camera’s data: its position, its zoom, how focused it is and how wide its aperture is. All of that data goes to the real-time engine so it can be matched by a virtual camera.

“When the camera moves, so does the virtual camera and that way, when it focuses in and out, it does exactly the same on the virtual cameras and the real camera and therefore the background always matches.”
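For the technically curious, the idea can be sketched in a few lines of Python. This is a minimal illustration of mirroring tracked parameters onto an engine-side camera; the field names are invented for the example and don’t come from any particular tracking system or engine:

```python
from dataclasses import dataclass

@dataclass
class CameraState:
    """Parameters a tracking device might report for the real camera.
    All field names here are illustrative, not from a real tracking API."""
    position: tuple   # (x, y, z) in metres
    zoom_mm: float    # focal length
    focus_m: float    # focus distance

class VirtualCamera:
    """Engine-side camera that mirrors the tracked real camera."""
    def __init__(self):
        self.state = None

    def apply_tracking(self, tracked: CameraState):
        # Copy the real camera's parameters each frame, so the
        # background's perspective and focus always match the lens.
        self.state = CameraState(tracked.position, tracked.zoom_mm,
                                 tracked.focus_m)

# One frame of tracking data arrives from the real camera...
real = CameraState(position=(1.2, 1.6, -3.0), zoom_mm=35.0, focus_m=2.5)
virtual = VirtualCamera()
virtual.apply_tracking(real)
# ...and the virtual camera now matches it exactly.
print(virtual.state == real)  # True
```

In a real pipeline this update happens continuously, every frame, which is what keeps the rendered background locked to the camera’s movement.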

  3. Processing power 

When using Unity and/or other platforms such as Unreal to create a terrain in a virtual space, everything needs to be able to switch on immediately. That is what makes beautiful spaces possible, but it also means processing power is essential, because the platforms and tools are being pushed to their limits. 

“You want your framerate to be 25 frames per second, that’s the minimum for the camera”.

There are a number of ways to manage the frame rate on a wall. One is to render the central portion of the screen, called the frustum (the area the camera actually sees), at full resolution, whilst everything outside of that is rendered at a lower resolution. 
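A quick back-of-the-envelope calculation shows why this trick matters. The wall and frustum sizes below are illustrative figures, not measurements from any real production:

```python
# Full LED wall resolution (illustrative figures only).
WALL_W, WALL_H = 7680, 2160        # total wall pixels
FRUSTUM_W, FRUSTUM_H = 1920, 1080  # portion the camera actually sees
OUTER_SCALE = 0.25                 # outside the frustum: quarter resolution
                                   # in each dimension

full_cost = WALL_W * WALL_H                   # everything at full res
frustum_cost = FRUSTUM_W * FRUSTUM_H          # in-camera region, full res
# Pixels outside the frustum, rendered at reduced resolution:
outer_cost = (full_cost - frustum_cost) * OUTER_SCALE ** 2
mixed_cost = frustum_cost + outer_cost

print(f"pixels saved: {100 * (1 - mixed_cost / full_cost):.0f}%")  # 82%
```

With these (made-up) numbers, rendering only the frustum at full quality cuts the per-frame pixel cost by over 80%, which is what makes hitting that 25 fps minimum feasible.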

  4. Avoid moiré

Moiré is a common interference pattern caused by a mismatch between a digital camera’s sensor and a fine, repetitive pattern in front of it. For example, if a camera is focused directly on to an LED screen, the grid of LED pixels can create this undesirable interference. 
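Under the hood, moiré is a form of aliasing: when a pattern repeats more often than the sensor’s pixels can resolve, a false, slower pattern appears instead. A simplified one-dimensional sketch in Python (treating a row of LEDs as a cosine wave, which is a modelling assumption, not how real sensors are analysed):

```python
import math

def sample_pattern(pattern_freq, sensor_pixels):
    """Sample a repeating brightness pattern with a row of sensor pixels."""
    return [math.cos(2 * math.pi * pattern_freq * (i / sensor_pixels))
            for i in range(sensor_pixels)]

# An LED grid repeating 90 times across the frame, seen by only 100
# sensor pixels: 100 pixels can resolve at most 50 cycles (the Nyquist
# limit), so the sensor instead records a false pattern repeating just
# |100 - 90| = 10 times. That phantom low-frequency pattern is moiré.
aliased = sample_pattern(pattern_freq=90, sensor_pixels=100)
phantom = sample_pattern(pattern_freq=10, sensor_pixels=100)

# The two sets of samples are indistinguishable to the sensor.
print(max(abs(a - b) for a, b in zip(aliased, phantom)) < 1e-9)  # True
```

This is why keeping the LED wall slightly out of the camera’s focal plane helps: defocus blurs away the fine repetition before the sensor ever samples it.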

  5. Genlock graphics cards

Genlock is the process of synchronising the pixel scanning of one or more displays to an external source, avoiding additional interference and complications. It enables you to lock multiple graphics cards together: for example, when a camera is shooting a wall at 25 frames per second, you want everything to run in step to avoid any visual disturbances. 
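To see why a shared sync source matters, consider two displays that are both nominally 25 fps but not genlocked. A tiny difference in their clocks, invented here purely for illustration, adds up surprisingly quickly:

```python
# Two refresh clocks, both "25 fps", but the second runs a tiny
# fraction fast (a made-up 0.004% error, for illustration only).
FPS = 25.0
frame = 1.0 / FPS              # 40 ms per frame
drifted_frame = 1.0 / 25.001   # the slightly-fast display's frame time

def start_time(n, period):
    """When frame n starts scanning out on a display with this period."""
    return n * period

# After 10 minutes of footage (15,000 frames at 25 fps), the two
# displays' frame starts have slid more than half a frame apart,
# which shows up on camera as tearing or flicker.
n = 15_000
drift = abs(start_time(n, frame) - start_time(n, drifted_frame))
print(f"drift after 10 min: {drift * 1000:.1f} ms")  # ~24 ms
```

Genlock removes this drift by driving every display and camera from one external reference, so frame n starts at the same instant everywhere.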

Technical as it may be, who else feels like they’ve learned a bit more about virtual production? At the FuseBox, it’s always exciting to expand our knowledge, and this event highlighted just how much there still is to learn about this technology. A huge thanks to Simon and Myra for hosting and sharing their expertise and experience. 

If you’re interested in attending some of our future events, including more Talking Tech events, find out more and book your space here.
