How Unreal and Unity are changing filmmaking
By Sify Technologies (www.sifytechnologies.com/us)

Imagine you are playing a computer game, which is usually a set of sequences that appear at random, and you, the player, react or engage with them. All of this happens in something called 'real-time'. In computer graphics terminology, something happening in real-time means it happens instantaneously.
When you are moving in a game or a VR environment, there is no way to predict which direction you will turn towards, and wherever you look within the game, there should be visuals or an environment corresponding to your position. This is done by real-time rendering: images are produced instantly depending on your point of view. A lot of mathematical calculations happen in milliseconds or microseconds, and the resulting images are shown to the user. These calculations, and all other game dynamics, are handled by game engines. Two of the most popular engines right now are Unity3D and Unreal. It is interesting to see how these engines are evolving beyond the gaming industry: with realistic lighting and almost-realistic human character generators, they are blurring the line between gaming and moviemaking. For example, the Disney+ series The Mandalorian used a novel idea called virtual production.

What is virtual production? It is a stage surrounded by a semi-circular LED screen on which the background or environment is shown. The actors stand in front of the screen and enact their roles while the camera records the scene together with the background. This is very much like the background projections used in older movies, but the novel part is that the projected backgrounds are dynamic: the perspective changes as the camera moves, which makes the scene look realistic. It also lets the ambient light from the background fall on the characters, and the actors can see where they are located in the scene. This greatly helps in eliminating blue/green screens and reducing long post-production hours. On the production floor, the real set and the virtual set (the LED wall) sit side by side: the real set, with real people, blends seamlessly into the background shown on the LED wall, creating one continuous set.
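The view-dependent rendering described above can be sketched with a toy Python snippet. This is not engine code; the function names and the yaw-only "camera" are illustrative simplifications of what a real engine recomputes every frame:

```python
import math

def look_at_angle(camera_pos, target_pos):
    """Toy example: the yaw angle (in degrees) a camera must face to
    look at a target on a flat 2D plane. Real engines build a full 4x4
    view matrix each frame; this shows only the core idea that what
    gets rendered depends on the viewer's current position."""
    dx = target_pos[0] - camera_pos[0]
    dz = target_pos[1] - camera_pos[1]
    return math.degrees(math.atan2(dx, dz))

def render_frame(camera_pos, target_pos):
    """Stand-in for one iteration of a render loop: recompute the view
    for the current camera pose, then 'draw' (here, just report the yaw)."""
    yaw = look_at_angle(camera_pos, target_pos)
    return f"frame rendered with camera yaw {yaw:.1f} deg"
```

The point is simply that nothing is precomputed: each time the camera pose changes, the next frame is derived from the new pose, which is what makes the output "real-time" rather than pre-rendered footage.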
The production team for The Mandalorian used Unreal Engine to create hyper-realistic backgrounds that can be changed dynamically during filming. Using a virtual reality headset, the team can alter the backgrounds to match the director's vision. The real filming camera is linked to a virtual camera in Unreal Engine; as the real camera moves or pans, the linked virtual camera mimics the movement, shifting the perspective of the (virtual) background. All of this happens instantly, in "real-time". This produces a very realistic shot, and the virtual sets can be changed or altered in a jiffy! Other dynamics, like the time of day, are also made available to the filming team through web-based controls on an iPad using REST APIs. This lets the production team change the lighting, sky colour and time of day instantly, saving a lot of time and allowing the shot or scene to be improvised on the go.

Not one to be left behind, Unity3D is another popular engine in the fray of hyper-realistic, movie-quality rendering. Unity recently released a teaser called Enemies, made entirely of computer-generated imagery, complete with the High Definition Render Pipeline (HDRP) for lighting, real-time hair dynamics, ray-traced reflections, ambient occlusion, and global illumination. These terms themselves warrant a separate article; that's for another day. In the teaser, the entire shot is computer generated, including the lady character. Unity3D has its own set of digital human models, and Unreal has its MetaHuman package; both offer hyper-realistic digital characters that can be used in real-time. This is just the tip of the iceberg.
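The iPad-based REST controls mentioned above could look something like the following Python sketch. The host, endpoint path, and payload fields are hypothetical, invented for illustration; the article does not document the engine's actual remote-control endpoints:

```python
import json

def build_time_of_day_request(hour, sky_color="dusk"):
    """Compose the HTTP request an on-set control panel might send to
    the engine to change the virtual set's time of day and sky colour.
    The URL and payload shape here are assumptions for illustration,
    not a real engine API."""
    if not 0 <= hour < 24:
        raise ValueError("hour must be in [0, 24)")
    return {
        "method": "PUT",
        # Hypothetical host and endpoint:
        "url": "http://stage-engine.local/api/environment/time-of-day",
        "body": json.dumps({"hour": hour, "sky_color": sky_color}),
    }
```

The request itself could then be sent with any HTTP client; the design point is that the engine exposes scene parameters over plain HTTP, so a tablet on the production floor can drive the virtual set without touching the engine's editor.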
The possibilities are endless. This is a perfect amalgamation of two fields: real-time rendering technology opens many doors for improving filmmaking, and the line between gaming and filming is being blurred by game-changing technology revolutions driven by Unreal and Unity3D!

Ramji P S has about 20 years of experience in the eLearning space and specializes in 3D modelling, AR, VR and MR solutions. He is a huge fan of Maestro Ilaiyaraaja's music. He enjoys mobile photography, reading, and watching movies and web series.