

insideHPC Guide to AI and HPC in Media & Entertainment – Part 2

In this white paper, sponsored by our friends over at Dell Technologies, we take a look at AI and HPC in Media & Entertainment, where an increasing demand for digital media content creates a need for faster rendering options.

The growing use of visual effects (VFX), computer-generated imagery (CGI), and animation in digital media content created for television, movies, and streaming services has sharply increased the amount of material that requires rendering, from individual workstations to data center render farms. Several factors are driving this demand: new digital resolution requirements, an appetite for new media content, and the new reality of employees working from home, which calls for hardware and networking solutions that address the poor or limited bandwidth of residential areas compared with in-office equipment.

This technology guide, insideHPC Guide to AI and HPC in Media & Entertainment, explores how the increasing demand for digital media content creates a need for faster rendering options.

The Rendering Process

Making a movie or TV show these days involves so much more than “Action!” and “Cut!”. In the highest-quality shows and movies, VFX and CGI requirements are steep, as audiences have matured and can tell the difference between good and bad special effects. The success or failure of a film can often be tied directly to whether these effects looked real or fake.

Creating these effects requires many different process layers for a studio:

  • Pre-production includes storyboards, animatics and design.
  • Production includes departments such as layout, modeling, texturing, animation, visual effects, and lighting.
  • Post-production processes include compositing, visual effects and motion graphics, as well as color correction.

In each of these areas, individual frames created via computer need to be rendered properly in order to achieve the desired effect. Rendering engine software generates visual output to a screen or a file. The engine accepts input variables from data sources that include lighting, camera position, and geometrical data such as 3D polygons. Rendering engines can run on CPUs with or without GPUs, and can also be run in parallel. Typically, they benefit from multiple cores, high clock frequencies, and large caches. Examples of rendering engines include 3Delight, Autodesk’s Arnold, Maxwell Render, and Pixar’s RenderMan.

Images, objects, or 3D scenes are created by either rasterization or ray tracing. With rasterization, the GPU draws a 3D scene using two-dimensional polygons in a triangle mesh pattern. The scene is then rasterized by converting the mesh into individual pixels on a 2D screen. A shader then processes the pixels, applying texture, color, and lighting effects to produce a completed rendered image (as in rasterization APIs such as OpenGL or DirectX). Ray tracing is a realistic lighting technique that lights objects in a frame by emulating the path of light from a virtual source, showing reflections and refractions of objects along the image plane, just as real light would interact with real-world objects.
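The ray tracing idea above can be illustrated with a toy example: cast one ray per pixel from a virtual camera, test it against a single sphere, and shade hits with simple diffuse lighting. This is a minimal sketch for intuition only; the scene, light direction, and function names are illustrative and bear no relation to any production renderer.

```python
# Toy ray tracer: one ray per pixel, one sphere, simple diffuse shading.
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return distance t along the ray to the sphere, or None on a miss."""
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                       # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)  # nearest intersection
    return t if t > 0 else None

def render(width, height):
    """Render a tiny grayscale image as a list of rows of 0-255 values."""
    light = [0.577, 0.577, 0.577]           # direction toward the light
    center, radius = [0.0, 0.0, -3.0], 1.0  # sphere in front of the camera
    image = []
    for y in range(height):
        row = []
        for x in range(width):
            # Map the pixel to a point on an image plane at z = -1.
            u = 2 * (x + 0.5) / width - 1
            v = 1 - 2 * (y + 0.5) / height
            d = [u, v, -1.0]                # ray from the camera at the origin
            t = ray_sphere_hit([0, 0, 0], d, center, radius)
            if t is None:
                row.append(0)               # background stays black
            else:
                hit = [t * di for di in d]
                # Surface normal, then Lambertian (diffuse) intensity.
                n = [(h - c) / radius for h, c in zip(hit, center)]
                lam = max(0.0, sum(ni * li for ni, li in zip(n, light)))
                row.append(int(255 * lam))
        image.append(row)
    return image
```

A real engine repeats this intersection test against millions of triangles per frame, with secondary rays for reflections and refractions, which is why the workload scales to render farms.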

Because rendering takes a long time, studios have created render farms, generally clusters of high-performance servers, networking, and data storage, to speed up the time it takes to render a frame. Still, even with render farms of hundreds or thousands of servers, the process can take a long time. When you then add rendering requirements for different processes (lighting, VFX, animation, etc.), or changes to the content during the production process (scene cuts, new scenes, directors looking for better explosions, etc.), the time continues to add up.

Keeping a render farm up and running requires lots of horsepower for studios to keep their production schedules on track. Many farms run 24×7, far more than typical enterprises, where servers run at about 25% capacity. Network bandwidth is also critical, as it enables workloads to be distributed, then pulled together for the final production process. Faster rendering supports more advanced techniques such as ray tracing, as well as more iterations in a shorter time frame. This gives teams the ability to polish their work and catch mistakes that might be missed with slower rendering processes.
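The distribution step described above, splitting a shot's frames across farm nodes and later pulling the pieces back together, can be sketched in a few lines. This is a simplified illustration, not the API of any actual farm manager; the node names and chunking policy are assumptions.

```python
# Sketch of a render farm dispatcher splitting a frame range across nodes.
def split_frames(first, last, nodes):
    """Assign frames [first, last] to nodes in near-equal contiguous chunks."""
    frames = list(range(first, last + 1))
    base, extra = divmod(len(frames), len(nodes))
    assignments, start = {}, 0
    for i, node in enumerate(nodes):
        count = base + (1 if i < extra else 0)  # earlier nodes absorb the remainder
        assignments[node] = frames[start:start + count]
        start += count
    return assignments
```

For example, `split_frames(1, 10, ["node-a", "node-b", "node-c"])` hands node-a frames 1-4 and the other two nodes three frames each; real dispatchers add queuing, retries on node failure, and per-frame priorities on top of this basic partitioning.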

Over the next few weeks we will explore how the increasing demand for digital media content creates a need for faster rendering options, and how solutions from Dell can help streamline the process:

  • Introduction
  • The Rendering Process
  • Serving and Streaming Requirements, Requirements for Recommendation Engines, Challenges for Content Creators, New Demands Require New Gear

Download the complete insideHPC Guide to AI and HPC in Media & Entertainment, courtesy of Dell.
