News

Nvidia claims real-time ray tracing now possible

Using AI techniques and a powerful processor

Jon Peddie

At GDC, Nvidia and Microsoft made a joint announcement about the possibility of real-time ray tracing, something the CG community has striven for since ray tracing was popularized by Turner Whitted in 1979. Whitted did it on big computers at Bell Labs, just before the PC was introduced. Since then, scores of papers and books have been written about ray tracing. And although the basic ray tracing equation is elegantly straightforward, the density of rays (originating and reflected) in a given image means it takes an enormous amount of computing power to render a scene. As a result, organizations that use, and need, ray tracing have either used farms of processors to reduce the time, and/or employed clever tricks to ray trace only the important elements within a scene. Still, even with the tricks, one frame of a major movie can take two to six hours to render. Advertising agencies, architectural firms, and design studios commit days to generating renderings.
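To illustrate why the workload explodes, here is a minimal sketch (in Python, with invented scene values) of tracing a single primary ray against one sphere. A real renderer repeats work like this for every pixel, every bounce, and every sample, which is where the hours per frame come from.

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit, or None.

    Solves |origin + t*direction - center|^2 = radius^2 for t,
    assuming `direction` is a unit vector (so the quadratic's a == 1).
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2.0 * (ox * direction[0] + oy * direction[1] + oz * direction[2])
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

# One primary ray fired straight down the z-axis at a unit sphere 5 units away.
hit = intersect_sphere((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), (0.0, 0.0, 5.0), 1.0)

# Why this is expensive: at 1080p with (hypothetically) 4 samples per pixel
# and up to 3 bounces per sample, the per-frame ray count is already huge,
# and each ray must be tested against far more than one object.
rays_per_frame = 1920 * 1080 * 4 * 3
```

Even this toy count comes to roughly 25 million intersection queries per frame, before shadows, reflections, or any acceleration structure is considered.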

Moore’s law has been nibbling away at the problem, and every year new algorithmic tricks are introduced (Real time raytracing on your phone, TechWatch 1/23/2018). Last year at GTC, Nvidia introduced a clever use of AI to predict the final image of a scene and speed up rendering time (AI for ray tracing, TechWatch 5/16/2017, p.4). At that same conference, Nvidia introduced its powerhouse GPU-based AI processor, Volta. Volta is a massive array of over 5,000 GPU shaders plus an additional 640 AI-accelerator cores the company calls Tensor cores. Volta is not a commercial or consumer GPU like the GTX series, but rather a dedicated AI processor.

Pure processing power will only get you so far; to really improve efficiency in ray tracing you need efficient APIs, drivers, and applications. Microsoft recognized that, as well as the advancement in processors, and so, in anticipation of the increased use of ray tracing in games and other professional applications, has introduced DXR, a DirectX extension for ray tracing.

Nvidia worked with Microsoft on developing DXR and took it a step further by introducing a low-level interface to its Volta processor, which it calls RTX.

Nvidia’s RTX sits below Microsoft’s DXR API

 

RTX technology encompasses both algorithms and the GPU. Nvidia has also expanded its ray tracing software tools to work with DXR and Volta.

Nvidia’s Gameworks tool has been expanded to incorporate ray tracing.

 

RTX is for the Volta processor only; any processor that can access DirectX can use DXR. Nvidia, however, says that with the addition of RTX and Volta, one will now be able to render ray-traced frames in 16 ms, the frame budget for 60 fps. The fine print will be the key.

Some developers already have access to the tools and beta API and demonstrated early work at GDC.

Nvidia’s use of AI in ray tracing is what the company calls an AI-based denoiser, which was added to OptiX last year; RTX will leverage that GPU-based denoising. RTX also takes advantage of simultaneous compute and graphics (aka async compute), among other things.

Volta, RTX, and DXR are all tools in the quest for the holy grail of real-time ray tracing. We will still see clever tricks like hybrid ray tracing for shadows and reflections. And although Volta makes RTX faster and will use the AI engine, Nvidia isn’t disclosing any details now, probably waiting for its GTC to lift the veil.

At Microsoft’s GDC announcement, the first wave of supporters was revealed. More will follow at GTC.

 

It will take a few weeks and the passage of GTC to fully digest and understand all that DXR and RTX will deliver.

You can also expect Khronos to offer ray tracing extensions to Vulkan, probably at or just after GTC. Obviously, Nvidia wants as broad a platform as possible in order to get a return on the massive investment it has made in both ray tracing technology and Volta.

Read all about it

Microsoft has published a white paper on ray tracing and DXR on its blog: https://blogs.msdn.microsoft.com/directx/. In addition to a nice tutorial on ray tracing, Microsoft says DXR will initially be used to supplement current rendering techniques such as screen-space reflections, filling in data from geometry that’s either occluded or off-screen. This will lead to a material increase in visual quality for these effects in the near future. Over the next several years, however, the company expects increased use of DXR for techniques that are simply impractical for rasterization, such as true global illumination. Eventually, ray tracing may completely replace rasterization as the standard algorithm for rendering 3D scenes. That said, until everyone has a light-field display on their desk, rasterization will continue to be an excellent match for the common case of rendering content to a flat grid of square pixels, supplemented by ray tracing for true 3D effects.

Microsoft has also updated its PIX tool: PIX on Windows supports capturing and analyzing frames built using DXR, to help developers understand how DXR interacts with the hardware. Developers can inspect API calls, view the pipeline resources that contribute to the ray tracing work, see the contents of state objects, and visualize acceleration structures. This provides the information developers need to build great experiences with DXR.

Microsoft’s PIX

 

In addition, while this marks the first public announcement of DirectX Ray Tracing, the company said it has been working closely with hardware vendors and industry developers for nearly a year to design and tune the API. In fact, a significant number of studios and engines are already planning to integrate DXR support into their games and engines.

Developers can use currently in-market hardware to get started on DirectX Ray Tracing. There is also a fallback layer that will allow developers to start experimenting with DirectX Ray Tracing without any specific hardware support.

Microsoft believes DirectX Ray Tracing will bring ray tracing within reach of real-time use cases, since it comes with dedicated hardware acceleration and can be integrated seamlessly with existing DirectX 12 content.

This means it’s now possible for developers to build games that use rasterization for some of their rendering and ray tracing for the rest. For example, developers can build a game where much of the content is generated with rasterization, but DirectX Ray Tracing calculates the shadows or reflections, helping out in areas where rasterization is lacking.
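The hybrid idea can be sketched in a few lines. This is a toy Python example, not DXR code: the surface point is assumed to come from the rasterizer, and a single ray-traced shadow ray toward the light decides whether the point is lit. The scene, light position, and occluding sphere are invented for illustration.

```python
import math

def sphere_blocks(origin, direction, center, radius, max_t):
    """True if the sphere occludes the ray segment from 0 to max_t."""
    oc = [origin[i] - center[i] for i in range(3)]
    b = 2.0 * sum(oc[i] * direction[i] for i in range(3))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4.0 * c  # `direction` assumed normalized
    if disc < 0.0:
        return False
    t = (-b - math.sqrt(disc)) / 2.0
    return 0.0 < t < max_t

def shade(point, light, occluder_center, occluder_radius):
    """The rasterizer supplies `point`; a shadow ray refines its shading."""
    to_light = [light[i] - point[i] for i in range(3)]
    dist = math.sqrt(sum(x * x for x in to_light))
    direction = [x / dist for x in to_light]
    in_shadow = sphere_blocks(point, direction,
                              occluder_center, occluder_radius, dist)
    return 0.1 if in_shadow else 1.0  # ambient-only vs. fully lit

# A point directly below an occluding sphere sitting between it and the light.
shadowed = shade((0, 0, 0), (0, 10, 0), (0, 5, 0), 1.0)
# A point off to the side: its shadow ray misses the occluder.
lit = shade((5, 0, 0), (5, 10, 0), (0, 5, 0), 1.0)
```

The appeal of the hybrid scheme is exactly this division of labor: the cheap rasterizer produces the visible surface, and the expensive ray queries are spent only on the effects rasterization handles poorly, such as shadows and reflections.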

Epilog

AMD says it has been working closely with Microsoft to support DirectX Raytracing and has been investing in ray tracing technology for some years. The company says it is eager to support Microsoft in widening its availability. Developers can start researching on AMD hardware right now using Microsoft’s fallback layer, and the company will add native DirectX Raytracing driver support in the near future. AMD is also working with other industry standards bodies and partners to enable ray tracing technologies. The company adds, “As with any graphics technology, we are on a trajectory to evolve our GPU architecture to continue improving and optimizing graphics applications.”