Ramji writes on the ‘Unreal Unity’ of technology and art…
The highly acclaimed Unreal and Unity3D engines are among the most popular tools employed by augmented reality (AR), virtual reality (VR) and gaming professionals. But what exactly are these ‘engines’, and how is this technology revolutionising cinema? In this article, let us see what powers these new-age solutions and how they are changing filmmaking.
Imagine you are playing a computer game, which is essentially a set of sequences that appear unpredictably, and you, the player, react to or engage with them. All of this happens in something called ‘real-time’. In computer graphics terminology, something happening in real-time means it happens instantaneously. When you move around in a game or a VR environment, there is no way to predict which direction you will turn towards, and wherever you look within the game, there should be visuals or an environment corresponding to your position. This is done by real-time rendering: images that are produced instantly depending on your point of view. A great many mathematical calculations happen in milliseconds or microseconds, and the resulting images are shown to the user. These calculations, and all the other game dynamics, are handled by the game engine.
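The loop described above can be sketched in a few lines of Python. Everything here is a hypothetical toy, not engine code: `render_frame` stands in for a real renderer, the ring of eight landmarks stands in for a game world, and the 60 fps budget illustrates the deadline a real-time engine must meet every frame.

```python
import time

def render_frame(yaw_degrees):
    """Stand-in for the engine's renderer: decide what is visible
    for the current view direction (hypothetical toy world)."""
    # Pretend the world is a ring of 8 landmarks, 45 degrees apart.
    landmarks = ["gate", "tower", "forest", "lake",
                 "hill", "cave", "bridge", "ruins"]
    index = int((yaw_degrees % 360) / 45)
    return landmarks[index]

def game_loop(view_directions, target_fps=60):
    frame_budget = 1.0 / target_fps   # ~16.7 ms per frame at 60 fps
    frames = []
    for yaw in view_directions:       # the player turns unpredictably
        start = time.perf_counter()
        frames.append(render_frame(yaw))
        elapsed = time.perf_counter() - start
        # A real engine must finish within this budget to stay "real-time".
        assert elapsed < frame_budget
    return frames

# Wherever the player looks, a frame is produced for that exact view.
print(game_loop([0, 90, 200, 350]))   # -> ['gate', 'forest', 'hill', 'ruins']
```

The key point the sketch makes is that nothing is pre-computed: each frame is generated only after the view direction for that instant is known.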
Some of the most popular engines right now are Unity3D and Unreal, and it is interesting to see how they are evolving beyond the gaming industry. With realistic lighting and near-photorealistic human character generators, these engines are blurring the line between gaming and moviemaking.
For example, in the Disney+ series The Mandalorian, a novel idea called virtual production was used.
What is virtual production? It is a stage surrounded by a semi-circular LED screen on which the background or environment is shown. The actors stand in front of the screen and enact their roles while the camera records the scene along with the background. This is very much like the background projections used in older movies, but the novel idea here is that the projected backgrounds are dynamic: the perspective changes as the camera moves, which makes the scene look realistic. The camera also captures the ambient light from the background falling on the characters, and the actors can see where they are located in the scene. This goes a long way towards removing the need for blue/green screens and reducing long post-production hours.
This is how the real set and the virtual set (the LED wall) are placed on the production floor. The part separated by the white outline is the real set with real people, while the background is on the LED wall. The two blend seamlessly, creating one continuous set.
The production team for The Mandalorian used Unreal Engine to create the hyper-realistic backgrounds, and these backgrounds could be changed dynamically during filming. Using a virtual reality headset, the production team could alter the backgrounds as per the director’s vision. The real filming camera is linked to a virtual camera in Unreal Engine, and as the real camera moves or pans, the linked virtual camera mimics the movement, thereby shifting the perspective of the (virtual) background. All of this happens instantly, in “real-time”. This produces a very realistic shot, and the virtual sets can be changed or altered in a jiffy!
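A rough sketch of that camera link might look like the following Python. To be clear, `VirtualCamera`, `sync_cameras` and the pose format are all illustrative assumptions for this article, not Unreal’s actual API; on a real stage the poses would come from a hardware camera-tracking system.

```python
class VirtualCamera:
    """Hypothetical stand-in for the engine-side camera that renders
    the LED-wall background from a given pose."""
    def __init__(self):
        self.position = (0.0, 0.0, 0.0)   # x, y, z in metres
        self.rotation = (0.0, 0.0, 0.0)   # pitch, yaw, roll in degrees

    def set_pose(self, position, rotation):
        self.position = position
        self.rotation = rotation

    def render_background(self):
        # A real engine would re-render the wall from this exact pose.
        return f"background from pos={self.position}, rot={self.rotation}"

def sync_cameras(tracked_poses, virtual_cam):
    """Mirror every tracked pose of the physical camera onto the
    virtual camera, frame by frame."""
    frames = []
    for position, rotation in tracked_poses:
        virtual_cam.set_pose(position, rotation)   # mimic the real move
        frames.append(virtual_cam.render_background())
    return frames

# The physical camera dollies forward and pans 15 degrees; the
# background on the wall shifts perspective to match.
poses = [((0, 1.5, 0), (0, 0, 0)), ((0, 1.5, 2), (0, 15, 0))]
print(sync_cameras(poses, VirtualCamera()))
```

The design point is one-way synchronisation: the physical camera is the source of truth, and the virtual camera never moves on its own.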
That is not all: other dynamics, such as the time of day, are also made available to the filming team through web-based controls on an iPad, backed by REST APIs. This enables the production team to change the lighting, sky colour and time of day instantly, saving a lot of time and helping them improvise a shot or scene on the go.
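To illustrate the idea, here is a toy, in-process stand-in for such a REST endpoint, written in Python. The route names, settings and JSON shapes are invented for illustration and bear no relation to the actual production tools; on set, the iPad would be making real HTTP requests to a service like this.

```python
import json

# Hypothetical scene state the stage-control "API" manages.
scene = {"time_of_day": "12:00", "sky_colour": "#87CEEB", "light_intensity": 1.0}

def handle_request(method, path, body=None):
    """Tiny REST-style dispatcher: PUT updates a setting, GET reads it."""
    key = path.strip("/").split("/")[-1]   # e.g. "/scene/time_of_day"
    if key not in scene:
        return 404, json.dumps({"error": "unknown setting"})
    if method == "PUT":
        scene[key] = json.loads(body)["value"]
        return 200, json.dumps({key: scene[key]})
    if method == "GET":
        return 200, json.dumps({key: scene[key]})
    return 405, json.dumps({"error": "method not allowed"})

# The crew dials in a sunset from the iPad: one PUT per setting.
status, reply = handle_request("PUT", "/scene/time_of_day",
                               json.dumps({"value": "18:30"}))
print(status, reply)   # 200 {"time_of_day": "18:30"}
```

The appeal of a REST interface here is exactly what the article describes: any device with a browser can flip these settings mid-shoot, with no engine expertise required.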
Not one to be left behind, Unity3D is another popular engine in the fray of hyper-realistic, movie-quality renders. Unity recently released a teaser called Enemies, consisting entirely of computer-generated imagery, complete with a high-definition render pipeline (HDRP) for lighting, real-time hair dynamics, ray-traced reflections, ambient occlusion and global illumination. These terms warrant a separate article of their own; that is for another day. For now, take a look at the teaser:
In this case, the entire shot is computer generated, including the lady character. Unity3D has its own set of digital human models, and Unreal has its MetaHuman package; both offer hyper-realistic digital characters that can be used in real-time.
This is just the tip of the iceberg. The possibilities are endless: this amalgamation of two fields opens a lot of doors for improving filmmaking with real-time rendering technology, and the line between gaming and filming is being blurred by game-changing revolutions driven by Unreal and Unity3D!