Ray Tracing Grows Across Industries – Despite Hardware Issues

Introduction to Ray Tracing

Ray tracing is a key technique creators can use to design and build immersive environments, films, art, or even real-world simulations. It’s used across industries like film, architecture, gaming, product design, and even advertising. Adoption has grown rapidly over the past few years, and ray tracing can be used to create things that would be entirely impossible in the real world.

But most people have little idea of the process behind creating these digital assets, even though ray tracing is the go-to rendering technique in most of these industries. The film industry has shared the most about creating ray traced visuals and models, so we compiled some videos showing what it takes to get to a ray traced film.

Rendering a Ray Traced Film

Films like these aren’t rendered on a typical computer like yours or mine. Instead, visual effects (VFX) companies use facilities called render farms: essentially data centers filled with thousands of CPU or GPU servers. Even with all that hardware, most studios still spend over a year rendering a film.
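To get a rough feel for why, here’s a back-of-envelope sketch in Python. The 24 fps frame rate and feature-length runtime are standard; the per-frame render time, farm size, and number of re-renders are illustrative assumptions, not figures from any particular studio or film.

```python
# Back-of-envelope estimate of rendering a feature film on a render farm.
# The per-frame hours, server count, and re-render factor are illustrative
# assumptions, not figures from any specific studio or film.

FRAMES_PER_SECOND = 24      # standard film frame rate
FILM_MINUTES = 90           # feature-length runtime
HOURS_PER_FRAME = 8         # assumed average render time per final frame
RENDER_ITERATIONS = 20      # assumed re-renders as shots are revised over production
SERVERS = 2_000             # assumed farm size (one frame per server at a time)

total_frames = FILM_MINUTES * 60 * FRAMES_PER_SECOND
server_hours = total_frames * HOURS_PER_FRAME * RENDER_ITERATIONS
wall_clock_days = server_hours / SERVERS / 24

print(f"Final frames: {total_frames:,}")
print(f"Total compute: {server_hours:,} server-hours")
print(f"Wall-clock time on {SERVERS:,} servers: {wall_clock_days:,.0f} days")
```

Even with these rough numbers, a single film works out to millions of server-hours and more than a year of wall-clock rendering.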

Access to more hardware alone isn’t enough to speed up rendering, because creatives tend to use the extra capacity to add features that wouldn’t have been feasible on older machines. Water and hair simulations greatly increase render time, as do volumetric elements like fire, smoke, and fog. Other pain points that increase render time include:

  • Amount and complexity of geometry
  • Quality of textures
  • Amount and scale of simulations

Below is an example of how creatives have used the extra compute power to take fur and foliage from stylized elements to photorealism.

Disney | Pixar Up (2009) Photo from Pixar
Disney The Lion King (2019) Photo from Technicolor

Power Issues

Not only does ray tracing take a ton of time, it also consumes a considerable amount of power – a small installation of 500 modern CPU servers can draw 550,000 watts while rendering. As electricity costs rise, so does the cost of running this power-hungry hardware. And most render farms mix newer servers with older, less efficient CPUs that are slower and draw even more power, so real-world consumption is often higher still.
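To put a rough price on that draw, here’s a quick calculation based on the 550,000-watt figure above; the electricity rate and utilization are illustrative assumptions, not measured values.

```python
# Rough yearly electricity cost for the 500-server example above (550,000 W).
# The price per kWh and utilization below are illustrative assumptions.

FARM_WATTS = 550_000        # 500 modern CPU servers, as cited above
PRICE_PER_KWH = 0.15        # assumed electricity rate in USD
UTILIZATION = 0.8           # assumed fraction of the year spent rendering

kwh_per_year = FARM_WATTS / 1000 * 24 * 365 * UTILIZATION
cost_per_year = kwh_per_year * PRICE_PER_KWH

print(f"Energy used: {kwh_per_year:,.0f} kWh per year")
print(f"Electricity cost: ${cost_per_year:,.0f} per year")
```

Under these assumptions, even this small farm burns through millions of kilowatt-hours and hundreds of thousands of dollars in electricity per year – before counting cooling.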

To help put power consumption in perspective, the chart below shows common household appliances compared to high-end GPUs:

Description         Watts
2-ton Central AC    3,000
Electric Dryer      2,000
Dishwasher          1,500
Washing Machine       800
High-End GPU          600
Refrigerator          500

Power Consumption of Household Appliances

Power consumption is an even bigger problem for films that use extensive simulations and ray tracing, like Avatar. We were already three years into developing a more energy-efficient way to ray trace – while also improving performance – when we heard about Avatar: The Way of Water outgrowing the power grid in Wellington, New Zealand. To finish rendering the film, the team had to spread the work across three AWS cloud locations around the globe because they couldn’t physically expand their data center’s infrastructure before the render deadline. Bolt technology would have enabled them to cut power consumption in half and finish renders twice as fast, significantly reducing costs and timelines.

Why Do We Keep Using Ray Tracing?

In spite of these issues, artists continue to use ray tracing to solve complex lighting and realism problems. Without ray tracing, an artist has to spend a lot of time faking lighting effects like shadows and reflections. This has led to the quick adoption of ray tracing in other industries that require a higher level of realism: product advertising, architecture, and of course gaming. As these (and other) industries adopt ray tracing, artists and consumers find it hard to go back to legacy rasterization.

Performance Issues

Currently, adding ray tracing features to a real-time application (games, real-time renderers, metaverse experiences) lowers performance by more than half. A game without ray tracing enabled can achieve 60+ fps; the same game drops below 30 fps when ray traced shadows and reflections are added.
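Another way to look at it: frame rate is really a per-frame time budget, which is the number real-time developers actually work against. A quick conversion of the figures above:

```python
# Convert frame rates into per-frame time budgets (milliseconds).
def frame_budget_ms(fps: float) -> float:
    return 1000.0 / fps

for label, fps in [("Without ray tracing", 60),
                   ("With ray traced shadows and reflections", 30)]:
    print(f"{label}: {fps} fps -> {frame_budget_ms(fps):.1f} ms per frame")
```

Every ray traced effect has to fit inside that budget alongside everything else the game is already doing each frame.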

We hear terms like “real-time ray tracing,” but the technology isn’t quite there yet. Architects and designers can walk through their 3D models in “real time” with their clients, but the visuals don’t match the fully rendered video or image in their presentations – see the comparison below.

Comparison: rendering without ray tracing vs. rendering with ray tracing

If there are all these issues with ray tracing, why don’t creators use something else? Before ray tracing, we had rasterization, another 3D rendering technique. Rasterization is faster than ray tracing largely because GPUs have accelerated it in dedicated hardware for decades, while ray tracing has traditionally run in software. The trade-off is that effects like physically based materials, reflections, steam, and fire – pretty much anything that interacts with light – must be faked with rasterization. Below is a comparison of rasterization and ray tracing on Brad, our film-grade mascot created to test hardware. (We’ll share a blog post later on the hardware issues we ran into while making a short film with him.)

Comparison on Brad: rendering without ray tracing vs. rendering with ray tracing
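To make the difference concrete, here is a minimal, purely illustrative sketch of the per-pixel work a ray tracer does: cast a ray from the camera into the scene, find what it hits, then cast a shadow ray toward the light to decide whether that point is lit. The toy scene, names, and shading are invented for this example – it is not Bolt’s technology or any production renderer.

```python
import math

# Toy scene: two spheres and one point light. Every value here is invented
# for illustration – this is a sketch of the idea, not a production renderer.
SPHERES = [((0.0, 0.0, -3.0), 1.0),    # large sphere
           ((0.9, 0.9, -2.0), 0.5)]    # small sphere that casts a shadow on it
LIGHT_POS = (5.0, 5.0, 0.0)

def normalize(v):
    length = math.sqrt(sum(x * x for x in v))
    return [x / length for x in v]

def ray_sphere(origin, direction, center, radius):
    """Distance along a normalized ray to the nearest sphere intersection, or None."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * x for d, x in zip(direction, oc))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None

def nearest_hit(origin, direction):
    """Closest object the ray hits, as (distance, sphere_center), or None."""
    hits = [(ray_sphere(origin, direction, c, r), c) for c, r in SPHERES]
    hits = [(t, c) for t, c in hits if t is not None]
    return min(hits) if hits else None

def shade_pixel(x, y, width=60, height=30):
    # Primary ray: pinhole camera at the origin looking down -z.
    direction = normalize([1.5 * ((x + 0.5) / width - 0.5),
                           1.5 * (0.5 - (y + 0.5) / height), -1.0])
    hit = nearest_hit((0.0, 0.0, 0.0), direction)
    if hit is None:
        return 0.0                                   # background
    t, center = hit
    point = [t * d for d in direction]
    normal = normalize([p - c for p, c in zip(point, center)])
    to_light = normalize([l - p for l, p in zip(LIGHT_POS, point)])
    # Shadow ray: does anything block the path toward the light?
    # (For this toy scene we skip checking whether the hit lies beyond the light.)
    shadow_origin = [p + 1e-3 * n for p, n in zip(point, normal)]
    if nearest_hit(shadow_origin, to_light) is not None:
        return 0.15                                  # in shadow
    diffuse = max(0.0, sum(n * l for n, l in zip(normal, to_light)))
    return 0.15 + 0.85 * diffuse                     # simple diffuse shading

# Tiny ASCII render, just to show the whole loop end to end.
for y in range(30):
    print("".join(" .:-=+*#%@"[min(9, int(shade_pixel(x, y) * 9.99))]
                  for x in range(60)))
```

A rasterizer, by contrast, would project the spheres onto the screen directly and would then have to approximate that shadow with a separate technique such as shadow mapping.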

Ideally, since ray tracing reduces the effort artists spend on realistic lighting, its performance should be at least comparable to rasterization’s. What we find instead is that many potential ray tracing users like the effects but end up working around the performance limitations with creative tricks and hacks – workarounds that make rasterization look like ray tracing and can take months to develop.

So why do all these issues exist? Why do filmmakers have to physically expand data centers, and why do creators have to invent workarounds? Why can’t they just use the ray tracing products marketed toward them?

The common answer we heard from users across industries is that they’re limited by their hardware. What they’re trying to create takes thousands of server CPUs or high-end GPUs that can cost more than $25,000 each. Those GPUs have scaling limitations that make it difficult to render large scenes, and even large studios with data centers housing tens of thousands of CPU cores still wait a long time for renders.

 

Bolt Solutions

Bolt’s market-leading ray tracing technology solves these problems, removing limitations on what creators can make. We realized that if you don’t have access to a render farm, it’s almost impossible to make these ray traced visuals. And if you do have access to one, it will take an immense amount of time, power, and money to render. Who knows, you might even outgrow your own power grid.

For more information on how our technology can help you design and build next-level experiences, products, and simulations, stay tuned!
