VR Technology in Cinematography

Shaken by the arrival of new shooting and 3D rendering technologies, the film industry is changing. Growing computing power, along with the use of real-time 3D engines and virtual reality headsets, gives filmmakers unprecedented agility when shooting films that rely more and more on visual effects. Behind these innovations, software publishers and chip designers are looking to come out ahead … and everything is accelerating.

It has become the most successful live-action remake of a Disney classic. The Lion King, released on July 17, is a typical example of the films that concentrate the latest innovations. From preproduction to the movie theater, it benefited from real-time 3D previews of scenes from the very first sketches … and from a shoot made easier by virtual reality when the director chose his shots. It is a case that testifies to the upheaval under way in this industry.

ACCELERATION OF COMPUTING TIME

The rise of graphics processors (GPUs) is largely behind this breakthrough. Historically, computer-generated images could only be calculated in specialized centers called “render farms”, which relied mainly on general-purpose processors (CPUs). The process took hours, and visual effects professionals therefore faced long delays between each preview of the image they were working on, which stretched out their workflows. But as technological progress has increased computing power, they have gained new agility.

Advances in GPUs have made it possible to preview images directly on workstations, and Nvidia's latest graphics cards, based on the Turing architecture, can now display a rendering close to final quality in real time. Their secret is a hybrid computing approach that combines traditional real-time rendering techniques with ray tracing. This rendering method simulates rays of light to produce more accurate lighting and reflections in an image, and it demands very high computing power. It is used systematically for cinema scenes, but until now the power it required made it far too cumbersome for previews.
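To make the idea concrete, here is a minimal sketch of the core ray-tracing loop, assuming a toy scene with a single sphere and a single directional light: for each pixel a ray is cast from the camera, intersected with the geometry, and shaded from the surface normal. The scene, resolution and shading model are simplified assumptions for illustration, not how a production renderer or Nvidia's hardware implements the technique.

```python
import numpy as np

WIDTH, HEIGHT = 160, 120                     # tiny image, purely illustrative
SPHERE_CENTER = np.array([0.0, 0.0, -3.0])   # assumed scene: one sphere
SPHERE_RADIUS = 1.0
LIGHT_DIR = np.array([1.0, 1.0, 1.0])        # direction toward the light
LIGHT_DIR = LIGHT_DIR / np.linalg.norm(LIGHT_DIR)

def intersect_sphere(origin, direction):
    """Distance along the ray to the nearest sphere hit, or None on a miss."""
    oc = origin - SPHERE_CENTER
    b = 2.0 * np.dot(oc, direction)
    c = np.dot(oc, oc) - SPHERE_RADIUS ** 2
    disc = b * b - 4.0 * c                   # direction is normalized, so a = 1
    if disc < 0.0:
        return None
    t = (-b - np.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

def trace(origin, direction):
    """Shade one ray: simple diffuse term if it hits the sphere, else background."""
    t = intersect_sphere(origin, direction)
    if t is None:
        return 0.1                           # dim background
    hit_point = origin + t * direction
    normal = (hit_point - SPHERE_CENTER) / SPHERE_RADIUS
    return max(0.0, float(np.dot(normal, LIGHT_DIR)))

image = np.zeros((HEIGHT, WIDTH))
eye = np.array([0.0, 0.0, 0.0])
aspect = WIDTH / HEIGHT
for y in range(HEIGHT):
    for x in range(WIDTH):
        # Map the pixel to a point on a virtual image plane at z = -1.
        u = (2.0 * (x + 0.5) / WIDTH - 1.0) * aspect
        v = 1.0 - 2.0 * (y + 0.5) / HEIGHT
        direction = np.array([u, v, -1.0])
        direction = direction / np.linalg.norm(direction)
        image[y, x] = trace(eye, direction)

print("rendered", image.shape, "brightness range:", image.min(), "-", image.max())
```

Production renderers trace many such rays per pixel and follow them through several bounces to capture reflections and indirect lighting, which is exactly where the computing cost explodes.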

REAL-TIME RAY TRACING BECOMES POSSIBLE

With Turing, Nvidia has dedicated part of its chips to hardware acceleration of ray tracing. The raw result, however, is still not of sufficient quality. The company has therefore added deep learning techniques that improve the final rendering by inference, transparently for the user. “Deep learning enables denoising [removing image noise, editor’s note], upscaling [increasing image resolution, editor’s note] and anti-aliasing [smoothing the image, editor’s note]. This will soon be the case for colorimetry as well,” explains Nvidia.
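To illustrate what “improving the rendering by inference” means in practice, here is a minimal sketch, assuming a PyTorch environment, of a noisy frame being passed through a small convolutional network in evaluation mode. The network, its layer sizes and the frame dimensions are placeholder assumptions for illustration; they do not correspond to Nvidia's actual models or APIs.

```python
import torch
import torch.nn as nn

# Placeholder denoiser: a few convolutions mapping a noisy RGB frame to a
# cleaned-up RGB frame. A production denoiser would be trained offline on
# pairs of noisy (few rays per pixel) and converged (many rays per pixel)
# renders; here the weights are random, so only the inference flow is shown.
denoiser = nn.Sequential(
    nn.Conv2d(3, 32, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Conv2d(32, 32, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Conv2d(32, 3, kernel_size=3, padding=1),
)
denoiser.eval()  # inference only: no training happens in this sketch

# Stand-in for one noisy ray-traced frame (batch, channels, height, width).
noisy_frame = torch.rand(1, 3, 120, 160)

# Per-frame inference pass: runs after the raw ray-traced image is produced
# and before it is displayed, transparently to the artist.
with torch.no_grad():
    denoised_frame = denoiser(noisy_frame)

print(noisy_frame.shape, "->", denoised_frame.shape)
```

Upscaling and anti-aliasing follow the same pattern: a network trained offline is applied frame by frame at inference time, so the renderer can compute fewer rays or fewer pixels than the final image requires.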
