
Ray tracing today


Jon Peddie

I first learned about ray tracing from Turner Whitted in 1980 and have been fascinated by it ever since. About eight years later I was working with a company called Meiko, which was developing systems based on the Inmos Transputer. The transputer was an innovative and advanced 32-bit floating-point processor with four high-speed serial links, and it could produce 1.5 MFLOPS. Transputers could be connected in various configurations, such as a hypercube supercomputer. One year we took 16 transputers, put them together, and ran a ray tracing example on a 512 x 512 display. Capable of 24 MFLOPS, the system delivered one frame per second (most of the time), and we demonstrated it at Siggraph 1989 in Dallas. I was astonished at the time and couldn't stop talking about it.
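As a sanity check, the figures above can be reproduced with a few lines of arithmetic. This is a back-of-the-envelope sketch: the inputs are the numbers given in the text, and the flops-per-pixel figure is my own derived estimate, not a number from the demo.

```python
# Back-of-the-envelope check of the 1989 Meiko transputer demo.
# Inputs are the figures given in the text; flops-per-pixel is derived.

transputers = 16
mflops_each = 1.5
total_mflops = transputers * mflops_each      # 24 MFLOPS, as stated

pixels = 512 * 512                            # display resolution
fps = 1                                       # about one frame per second
flops_per_pixel = total_mflops * 1e6 / (pixels * fps)

print(total_mflops)            # 24.0
print(round(flops_per_pixel))  # roughly 92 floating-point ops per pixel
```

Roughly 90 floating-point operations per pixel per frame is a tight budget, which is why the demo only managed one frame per second most of the time.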

Fast-forward to 2015 at Nvidia's GTC, when the company demonstrated real-time (i.e., 30 FPS) ray tracing on an HD screen using a DG1 supercomputer. That computer produced 63 TFLOPS (over 2.6 million times the power of our transputer experiment), and I speculated, based on Moore's law, that we could have real-time ray tracing in a PC by 2023-2024.

Extrapolating the steady gains in FLOPS, limited real-time raytracing could be available to consumer PCs by 2024 or sooner — Jon Peddie 2015
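That extrapolation can be reconstructed with a simple doubling calculation. This is a sketch under my own assumptions (a roughly 4 TFLOPS high-end consumer GPU in 2015 and a two-year doubling period), not the actual method behind the prediction.

```python
import math

def years_to_reach(target_flops, current_flops, doubling_period=2.0):
    """Years of Moore's-law doubling needed to close the FLOPS gap."""
    doublings = math.log2(target_flops / current_flops)
    return doublings * doubling_period

# Assumed: the 63 TFLOPS supercomputer demo as the target, and a
# ~4 TFLOPS high-end consumer GPU as the 2015 starting point.
years = years_to_reach(63e12, 4e12)
print(round(2015 + years))  # about 2023, in line with the prediction
```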

At Siggraph 2018, Nvidia introduced its Turing architecture and demonstrated real-time ray tracing with 14 TFLOPS on a single graphics add-in board, surprising the world, myself, and even people inside Nvidia that they had gotten there so quickly.

For some, real-time ray tracing has been one of those things that is perpetually ten years away; people thought we'd never see it in our working lifetime. Peace on earth and freedom from famine were expected sooner.

However, real-time is a matter of definition and a function of several parameters, such as screen resolution, color depth, scene complexity, and, most important, resolving time: when does the image look good? All those variables can be manipulated to support a claim of real-time ray tracing. There is no standard definition.

Demonstrations have been made in the past using hybrid rendering with a restricted number of bounces to achieve real-time ray tracing, or something close to it, with a 25 FPS refresh rate (equivalent to European TV) defined as the threshold of real-time.
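To make those variables concrete, here is a rough rays-per-second budget calculation. It is illustrative only: the resolution, sample count, and bounce count are my own assumptions, not a standard, and real renderers cast varying numbers of secondary rays per bounce.

```python
def rays_per_second(width, height, samples_per_pixel, bounces, fps):
    """Rough ray budget: primary rays plus one secondary ray per bounce."""
    rays_per_frame = width * height * samples_per_pixel * (1 + bounces)
    return rays_per_frame * fps

# 1080p, one sample per pixel, two bounces, at the 25 FPS threshold
budget = rays_per_second(1920, 1080, 1, 2, 25)
print(f"{budget / 1e9:.3f} Grays/s")  # 0.156 Grays/s
```

Halving the resolution, the sample count, or the bounce count cuts the budget proportionally, which is exactly how a demo can be tuned until it qualifies as "real-time."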

Dedicated hardware also adds an important piece to the current puzzle, as evidenced by Nvidia’s RT cores in its latest Turing architecture.

I saw demonstrations of real-time ray tracing by SiliconArts in Korea in 2014. The company was using a custom ASIC it had developed. Unfortunately, it couldn’t find any customers or new funding, so it dropped the project and moved on to other things.

In 2012, I saw real-time ray tracing using hybrid techniques in a view window from Imagination Technologies. The company had developed, partially through an acquisition, a dedicated ray tracing engine that it was offering in the form of IP. So far, there have been no public announcements of companies using the technology, but Imagination Technologies is still developing and promoting that engine, and with good reason.

Ray tracing is not new, but it has been enabled as a mainstream tool for visualization by the evolution of processors and by software advances, including AI. It has become a fundamental component of real-time rendering and virtual production. One of the biggest recent breakthroughs is the availability of real-time ray tracing in the viewport for content creators.

I see ray tracing being used throughout a pipeline, starting with conceptualization, moving into design, then manufacturing, and finally marketing.

Since the design, the model, has been developed in 3D, renderings of it are accurate to the design. That offers two benefits to developers. First, it leads directly to virtual prototyping, a genuine try it (fly it) before you buy it. Virtual prototyping, or pre-viz, allows the developer or director to make tweaks and adjustments to realize their visualization, the thing they saw in their mind’s eye.

Second, increasingly, production is a non-linear affair. Marketing has always started with the green-lighting of a project, and if managed correctly, content created in the early stages of pre-viz may be used to develop marketing material earlier than usual. As a result, consumers are primed and ready to buy when the product becomes available. It’s estimated that 95% of automobile commercials are simulations, and the cars being shown haven’t been built at the time of viewing. The same is true of outtakes of movies and games and of other consumer products, and it is certainly true of all space ventures and projects.

The pipeline diagram

In our report on the ray tracing market, we’ve identified 76 suppliers of ray tracing software, not including private developments from studios like Weta, DreamWorks, and Disney.

We have segmented the market into four categories: Integrated (e.g., Autodesk’s RayTracer), Stand-alone (e.g., V-Ray), Plug-in (almost everyone), and Middleware (e.g., OptiX). As mentioned, we have identified 76 ray tracing programs across the integrated, stand-alone, and plug-in categories. Of that population, we have found 21 that are free, e.g., Blender’s Cycles and Eevee, POV-Ray, AMD ProRender, and others.

Ray tracing taxonomy

In a few cases a supplier may offer stand-alone, integrated, and plug-in versions, or a supplier that offers a stand-alone and plug-ins may have its program integrated into a modeling program. Just about every combination that can be imagined can be found.

As a result of all the interest, the number of suppliers, and the increased throughput in ray tracing, I foresee its adoption and usage increasing rapidly over the next five years.

Ray tracing usage as rendering solution over time

So, here’s the bottom line: ray tracing is the physically accurate, photo-realistic representation of objects, and it is used throughout the concept, design, prototyping (including virtual), manufacturing, and marketing phases of most products, and certainly all consumer products, all animations, new cars, buildings, and fashion. We are at an important transition point for the technology: it is rapidly evolving from nice-to-have to must-have.