"When the lights come up, you think: 'Shit, Im actually inside a studio, not in the woods.' It's mad."
This was the wondrous, wide-eyed feedback from one of our creatives after their first experience using a volume screen on a virtual production stage. Once you've done it a few times, you get used to the experience, but the first time can scramble your brain a little.
Our latest foray into virtual production was for an ad we created towards the end of last year for National Energy Action. We needed it done fast and, because it was for a charity, cost-effectively. Two boxes that virtual production ticks. And two of the many reasons it has already revolutionised the way we work as an agency and is set to revolutionise this industry.
For clarity: we used Unreal Engine, a real-time 3D engine, together with virtual production to create a forest scene. A scene so realistic that, honestly, once the lights went down, you really did think you were in a forest. This hopefully explains the above reaction from our new team member.
Just in case you're not fully aware, virtual production is a seamless mix of live action and digital/virtual sets that allows you to shoot multiple environments and variants at pace from within a single studio space. On a giant LED screen (not green) that can essentially become anything you want.
It can turn a two-week, multiple-location shoot into a two-day studio shoot, while also decarbonising the production process. What's more, every virtual set, background and asset is reusable. Again and again. And with a real-time 3D engine such as Unreal powering everything, all of these assets can be created virtually.
It's no wonder screens are popping up everywhere, and the film and TV community are getting on board. According to a report by global consultancy Altman Solon, there are now 40 virtual production stages in the UK. In addition, 40% of respondents in the film and TV industry said they planned to use it in the next 18 months.
But the take-up seems slower in advertising. No comparable research exists for our industry, but I would guess the number is much lower. It really shouldn't be. We have found again and again that virtual production can be an answer to so many production issues. And here is why.
Realising the vision
Because virtual production spans the whole filmmaking process, you can plan out scenes and shots with total precision, according to the exact creative vision of the director. However, these plans don't need to be set in stone. The flexible technology of virtual production allows for on-the-fly iteration and experimentation without posing a significant risk to either the production or the budget, allowing complete creative freedom.
Seeing the finished product in real time
By combining LED volumes, camera tracking technology and real-time 3D render engines such as Unreal, directors and actors are able to see what they're creating by playing back in real time and adjusting as they go. This means creators can finally shake off "fix it in post" and get it right on the day – cutting costs and time.
On that note...
By using virtual production techniques, you can budget with greater clarity. In the past, unforeseen elements such as bad weather, an unavailable location or a green-screen effect failing to work as intended could push a project over its planned budget.
This clarity is due to a number of reasons:
- The ability to adjust content during filming rather than in post-production.
- No need for expensive reshoots.
- No need for post-production rendering to achieve accurate lighting.
- A perfect replica of outdoor landscapes, meaning you're in control of the schedule, instead of natural light or weather.
Sustainability is going to be high on this year's agenda, so Unreal Engine and virtual production are going to be key to bringing your footprint down. There are no physical sets to store, a reduced need for travel to locations for shoots and the ability to reuse virtual sets and environments.
Building virtual worlds...
Virtual production allows natural settings to be replicated in the studio. It can also work in tandem with extended reality, or XR, technology to create a 360-degree environment that fully replicates the natural one. LED screens can be virtually extended by interfacing with the camera, giving scope for entire worlds to be created from a single stage environment.
...that encourage natural performances
When actors perform against an LED screen that's displaying an entire virtual world, they are able to respond to their surroundings as though they're real, because they are immersed in that reality. There are other advantages too: natural sight lines to objects, scenery and events within the scene allow performers to react instinctively, rather than to what they've been told will be happening on set.
Complete design freedom
Virtual production within an LED volume means there's no need to compromise on design for elements such as costumes, props and hairstyles, which can cause issues with green screens. This gives filmmakers and their teams complete freedom over how things appear on film, with no need to budget for post-production fixes for problems caused by loose hair or green spill.
Glossary of keywords
As any parent of teenagers will tell you, new developments mean new language to learn and understand. So, to help you out, here are some key technical terms and their definitions.
Frustum: the region of a virtual world that appears as a viewport to the camera. On an LED volume, the inner frustum moves in sync with the camera, while the outer frustum is unseen by the camera and keeps the remainder of the environment static to provide consistent, realistic lighting. Typically, there is a buffer zone outside the inner frustum to accommodate latency between camera movement and real-time rendering.
Frustum culling: the process of removing objects, or reducing rendering quality, for areas that lie outside the inner frustum, since they are not directly visible to the camera.
Global illumination: a method of virtual lighting, which achieves greater photorealism by simulating the indirect, bounced properties of physical light: a crossover between virtual and physical cinematography.
IES profile: a file format defined by the Illuminating Engineering Society that describes a light's distribution from a light source using real-world measured data.
Map: refers to a set environment within a real-time engine.
Universal scene description (USD): an open-source scene interchange and assembly format, created by Pixar and widely adopted in the visual effects industry.
Baked lighting: an asset with highlights and shadows baked into its surface texture, which does not directly respond to lighting changes. It is useful for increasing real-time render performance. See also interactive lighting.
Final pixel: the goal of achieving final image quality live, in-camera, without the need for additional major visual effects work.
Decimation: the reduction of geometry and texture to optimise an asset's real-time performance; a key difference between assets created for real-time versus post-production animation.
Camera calibration: the process of aligning a real-world camera to its virtual counterpart, essential for integration between live-action and virtual elements.
Stephen Barnes is founding partner and executive creative director at Collective