WHY THIS MATTERS IN BRIEF
The content you see on your screens is just dots of light, and if those dots are created by game engines or AIs rather than by filming real people, will you know the difference – or even care?
Over the years I’ve been discussing how one day we’ll be able to create entire movies using nothing more than Artificial Intelligence (AI) and creative machines that can imagine and then generate their own synthetic content, from art, books, games, and music, to imagery, movies and videos, and even digital humans and virtual influencers. I’ve also discussed how dramatic improvements in game rendering engines, like the Unreal Engine that I’ll discuss in a moment, will change how we create content forever, with real world examples of where it’s already being used, including Disney’s remake of The Lion King and The Mandalorian.
Last month, Epic Games took the covers off an early iteration of its latest amazing game engine, Unreal Engine 5 (UE5), and pretty much broke the parts of the internet that are interested in gaming. Running on a developer version of the PlayStation 5 hardware, the results uploaded to Vimeo looked fabulous enough to begin with, but when you realised that this was realtime gameplay footage, and not just for “demo purposes only”, they were genuinely jaw-dropping.
The visual quality is pretty much unmatched by anything apart from current high-end, pre-rendered VFX work, and showcases incredible levels of detail alongside genuinely photorealistic lighting. And, even better, news of UE5 also lit up the parts of the internet that deal with broadcast graphics, film effects, virtual studios and pretty much everywhere else concerned with top quality visuals.
While other graphics engines, such as Unity, are of course available, Epic has pulled off the neat trick of combining a leading feature set with ease of use and a business model that encourages UE’s use in other software. For instance, a game developer that uses Unreal for commercial purposes doesn’t pay any royalties until the product’s gross revenue passes the $1m mark, which has helped drive its widespread integration into a whole host of technologies, and for non-games developers it is 100 percent royalty free.
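To make those terms concrete, here’s a minimal sketch of the royalty model as described above, assuming Epic’s published standard rate of 5 percent on revenue beyond the threshold; the function is mine, not Epic’s, so check the current EULA rather than this code for anything that matters.

```python
# Sketch of Unreal's royalty terms as described above. Assumes Epic's
# standard 5% rate above a $1m lifetime gross revenue threshold; illustrative
# only, not legal or licensing guidance.

ROYALTY_RATE = 0.05
THRESHOLD_USD = 1_000_000

def unreal_royalty(gross_revenue_usd: float, is_game: bool = True) -> float:
    """Royalty owed on a product's lifetime gross revenue."""
    if not is_game or gross_revenue_usd <= THRESHOLD_USD:
        return 0.0  # non-games uses are royalty free; games pay nothing below $1m
    return (gross_revenue_usd - THRESHOLD_USD) * ROYALTY_RATE

print(unreal_royalty(750_000))    # 0.0, still under the threshold
print(unreal_royalty(3_000_000))  # 100000.0, i.e. 5% of the $2m above $1m
```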
Its use throughout the graphics stack is analogous to the way that hooking up to the IT industry as a whole has accelerated development across the broadcast sector: it allows a comparatively small industry to piggyback on the development efforts of a much larger one, and the speed of change we’re seeing as a result is impressive.
“When the new Unreal Engine 5 comes out you won’t be able to tell what’s real and what’s not,” comments Phil Ventre, VP of Sports and Broadcast at Ncam Technologies. And that’s a key statement: after all, who needs to send teams to remote parts of the world to shoot scenes, or hire actors to play parts, if the synthetic “digital” equivalents become good enough?
“Games engine integration is democratising the way that companies use AR and VR and it’s not just going to be a technology for the Tier One broadcasters in the future,” he added.
While the new demo was partly created to show off some of the very clever new hardware in the forthcoming PS5, such as its literally game-changing M.2 solid-state drive, UE5 itself showcases new technologies that dramatically move the goalposts for realtime computer graphics work. Two in particular are worth mentioning: Nanite and Lumen.
Nanite is a new virtualized micropolygon geometry system that essentially lets artists create as much geometric detail as they want. Geometry is streamed and scaled in realtime, an important consideration when you’re planning an engine that will run on everything down to a smartphone.
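Epic hasn’t published Nanite’s internals, so treat the following as a toy sketch of the general idea rather than Epic’s method: store each asset as a hierarchy of progressively simplified clusters, then pick, per frame, the coarsest level whose simplification error would still be invisible on screen, streaming in finer data only when the camera gets close enough to need it. Every name here (Cluster, pick_lod and so on) is hypothetical.

```python
# Toy sketch of virtualized geometry as screen-space-error-driven LOD
# selection. Not Epic's implementation; all names are invented.

from dataclasses import dataclass

@dataclass
class Cluster:
    lod: int                 # 0 = coarsest level in the hierarchy
    geometric_error: float   # world-space error of this simplification, metres
    distance: float          # distance from the camera, metres

def projected_error_px(c: Cluster, viewport_h_px: int, fov_tan: float) -> float:
    """Project the cluster's world-space simplification error onto the screen."""
    if c.distance <= 0.0:
        return float("inf")
    return (c.geometric_error / (c.distance * fov_tan)) * viewport_h_px

def pick_lod(clusters: list[Cluster], viewport_h_px: int = 2160,
             fov_tan: float = 0.5, target_error_px: float = 1.0) -> Cluster:
    """Pick the coarsest cluster whose on-screen error stays under ~1 pixel,
    so detail (and streaming bandwidth) scales with what is actually visible."""
    for c in sorted(clusters, key=lambda c: c.lod):  # coarse to fine
        if projected_error_px(c, viewport_h_px, fov_tan) <= target_error_px:
            return c
    return max(clusters, key=lambda c: c.lod)  # fall back to the finest level

# Example: the same asset at 50m needs only the level whose error vanishes at 4K.
lods = [Cluster(0, 0.08, 50.0), Cluster(1, 0.02, 50.0), Cluster(2, 0.005, 50.0)]
print(pick_lod(lods).lod)  # -> 2
```

The design point this illustrates is the one Epic is making: rendering cost scales with pixels on screen rather than with the polygon count of the source asset, which is what makes film-grade geometry plausible on a console or a smartphone.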
Then there’s Lumen. This is a fully dynamic global illumination system that immediately reacts to scene and light changes, and it’s part of the secret of making game graphics look so good running on console. It’s capable of rendering diffuse inter-reflection with infinite bounces and indirect specular reflections in what Epic calls “huge, detailed environments, at scales ranging from kilometres to millimetres.” It adapts too – blow a hole in a wall and the scene will change to accommodate the light coming through it.
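Lumen’s real pipeline is proprietary and far more sophisticated, but the “infinite bounces” claim has a classical reading worth unpacking: iterative diffuse light transport converges to a fixed point, with each pass adding one more bounce, so the result effectively sums an infinite bounce series, and because nothing is precomputed the solve can simply be re-run the moment the scene changes. Here’s a deliberately tiny radiosity-style sketch of that idea; the scene data and function names are invented for illustration.

```python
# Toy radiosity-style solver illustrating "infinite bounces" via iteration.
# Not Lumen's algorithm; scene data here is invented for illustration.

import numpy as np

def solve_diffuse_gi(emission, albedo, form_factors, iterations=64):
    """Iterate B = E + rho * (F @ B); each pass adds one more light bounce,
    and since the reflected fraction per pass is below 1 the series converges,
    which is the sense in which the bounce count is effectively infinite."""
    radiosity = emission.copy()
    for _ in range(iterations):
        radiosity = emission + albedo * (form_factors @ radiosity)
    return radiosity

# Three patches: a light source and two walls facing each other.
emission = np.array([10.0, 0.0, 0.0])      # only patch 0 emits
albedo = np.array([0.0, 0.7, 0.7])         # walls reflect 70% of incident light
form_factors = np.array([[0.0, 0.3, 0.3],  # how much each patch "sees" the others
                         [0.3, 0.0, 0.4],
                         [0.3, 0.4, 0.0]])

lit = solve_diffuse_gi(emission, albedo, form_factors)
emission[0] = 0.0                          # change the scene: kill the light
dark = solve_diffuse_gi(emission, albedo, form_factors)
print(lit, dark)                           # indirect light reacts immediately
```

Re-running the solve after editing the scene is the toy equivalent of Lumen reacting when you blow a hole in that wall: no baking step stands between the change and the lighting.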
“The ability to light and render hundreds of millions of polygons in real time is a quantum shift that will change the level of engagement filmmakers have with the images they create,” says Miles Perkins, business development manager at Epic Games. “These new technologies will allow creatives to see the totality of their vision without having to disassociate the various parts of their shots – reviewing animation separate from lighting, separate from environments and effects. Everything will be right there in front of them, fully directable. Filmmakers will be able to compose and light shots in real time, regardless of whether they are physical, virtual, or a combination of both.”
Perhaps one of the key points here is that Unreal Engine, currently on version 4.25, is already very good indeed.
“We are currently using Unreal Engine 4 heavily in both previs and virtual production,” says Hugh Macdonald, chief technology innovation officer at Nviz. “For previs, the real-time nature of UE4 means that we can get incredibly high quality pictures with minimal render time. We also use it for virtual production, giving better looking integration than we would historically have been able to.”
Nviz uses Unreal in two main tools, which are a good illustration of how it’s being used for previs work across the industry and where it could be going. A Virtual Camera System enables virtual scouting of the previs environment, and allows directors and cinematographers to have a hands-on experience with the camera, while a simulcam toolset is tightly integrated into Unreal and provides the production crew with an on-set preview of what a shot will look like once the VFX has been added in post.
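Conceptually, a simulcam preview is a straightforward per-frame loop: tracking data from the physical camera drives a matching virtual camera, the engine renders the CG scene from that pose, and the result is composited over the live feed for the crew’s monitors. The sketch below shows only that shape and nothing more; all of it is hypothetical stand-in code, not Nviz’s actual Unreal-based toolset.

```python
# Hypothetical stand-in for a simulcam preview loop: it only illustrates
# the per-frame flow described above.

import numpy as np

def render_cg(camera_pose: np.ndarray, size=(4, 4)) -> np.ndarray:
    """Stand-in for the engine render: returns an RGBA image whose alpha
    marks where CG elements exist (this dummy ignores the pose, which would
    in reality drive the virtual camera)."""
    rgba = np.zeros((*size, 4))
    rgba[1:3, 1:3] = [0.2, 0.8, 0.2, 1.0]  # a small green CG "object"
    return rgba

def composite(live: np.ndarray, cg: np.ndarray) -> np.ndarray:
    """Alpha-over composite: CG elements on top of the live camera feed."""
    alpha = cg[..., 3:4]
    return cg[..., :3] * alpha + live * (1.0 - alpha)

# Per-frame loop: tracked pose in, composited on-set preview out.
for frame in range(3):
    pose = np.eye(4)                     # would come from the camera tracker
    live_feed = np.full((4, 4, 3), 0.5)  # would come from the camera itself
    preview = composite(live_feed, render_cg(pose))
    assert preview.shape == (4, 4, 3)    # the image the crew would see
```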
“Unreal allows us to ensure that this is both flexible and high-quality,” says Macdonald. “Based on what we know so far about UE5, the major jumps are going to be around geometry detail, in that far higher resolution assets will be able to be used and streamed in. This will fit far better with a film VFX workflow, as the hope is that assets won’t need as much processing to make them engine-ready. The fully dynamic lighting that is Lumen will mean there will be less need for baking the lighting to get the same result. This will allow us to keep scenes fully dynamic, letting us adjust the lighting live during production if required, which is often something that is asked for on set, as the physical lighting is changed depending on the shot.”
As well as an increase in quality, which Ncam’s Ventre likens to the jump from UE3 to UE4, UE5 holds out the tantalising prospect of introducing both new ways of working and new ways of creatively exploring virtual spaces.
“Unreal Engine will be a big part of the future of cinematography,” says Sam Measure at CVP. “It’s bringing back the ability to get practical effects in camera, whether that be interactive lighting on an actor or dynamically changing the backgrounds in real time, even though they have been created in a virtual space. The ability to get instant feedback of how something is going to look is invaluable.”
This is going to feed into many more live spaces than on-set previs. Macdonald mentions the theatre and event sector, where video screens have become part of the interplay with lighting to create whole new live spectacles. Meanwhile, there is going to be a further jump in quality on virtual sets, making them indistinguishable enough from the real thing that only a live audience will tip the decision towards using a physical set. Even those audiences might see quite different shows, with blended elements from AR being used seamlessly in the final TX. In post-Covid times, you will be able to assemble three guests on a sofa for a chat show without any of them leaving their homes and no one being the slightest bit the wiser.
And then of course, there is the way it could accelerate development of live shoots against LED screens as pioneered by shows such as The Mandalorian.
“While Unreal has been used on LED screens while filming a number of times, the new updates will hopefully allow this to be pushed much further, and get a higher proportion of finished shots directly from the camera,” enthuses Macdonald. And perhaps what is most impressive of all is that the UE5 footage to date represents a very early stage. UE5 isn’t due for full release until late 2021, while the PS5 isn’t due until at least the end of this year. In other words, there is a lot of optimisation and performance still to be wrung out of all this.
“I believe we are just scratching the surface of what will be possible,” says Perkins. “Game engine technology will no longer be limited to individual film departments – all departments will be able to make contributions to the virtual sets just as they would on a physical set. In the future, the set will be a big creative sandbox, where the art department, set design, gaffers, cinematographers, visual effects, action designers, directors, will all contribute to the set both physically and virtually – bringing the best of their talents to bear in that moment when the film captures the image.”