Posted by Jon Peddie, 08.28.20
Scientific visualization differs from data (aka info) visualization in the quality and complexity of the graphics images produced. Data visualization is typically expressed in charts, whereas scientific visualization produces images of events impossible to see with the human eye, glorious representations of products and possibly their destruction, and parts of the human body or the cosmos. Intel has been leveraging years of work and generations of technology to bring the worlds of data visualization and scientific visualization together.
Intel has been at the forefront of big data visualization for decades. Initially, the company saw scientific visualization as an ideal application to take advantage of all the GFLOPS its processors were capable of generating and that few apps were actually using.
Well, it turned out to be a case of be careful what you wish for. It didn't take long for Intel's data scientists and graphics software engineers to realize scientific visualization is literally and figuratively a black hole that can probably never be filled. A dream app for Intel, but also a big, big commitment to develop software tools to exploit Intel's many cores. The company also learned very quickly that memory was as important as FLOPS and megapixels.
Happily for the scientific community, Intel didn't shrink from the challenges or escape to the comfort of crunching Fortran data. Intel may be the longest-standing scientific visualization software company there is. All the others faded away or were acquired due to the costs of developing such software and the limited number of customers with budgets to afford it. Intel had the advantage of subsidizing its software efforts with CPU sales. As a result, the company has built an extensive software catalog of tools for visualization.
|Some of Intel’s visualization tools and libraries. (Source: Intel)|
And thank goodness it did, because scientific visualization (sci-viz) is truly like an onion: the more you get involved, the more layers you have to peel away, crying as you do, trying to get at the heart of the matter (again, literally and figuratively).
But boy-oh-boy, the things Intel has done, the astonishing images its customers have generated, and the insights obtained would make a damn fine coffee table book.
Ray tracing has always been a component of sci-viz for a small group of users, but as the concept of the digital twin began to take hold, the user base for viz expanded into design and production areas. It has come out of the lab and gotten the attention of the finicky special-effects people as they struggle under the weight of more and more data to make their waves splash, smoke curl, and bits blow apart. To wit, Intel said: come on over—any platform, any domain, any challenge, we have you covered. Lurking behind the curtain is Intel's newest weapon, a massive, scalable GPU; if you want to do viz, you need a GPU somewhere in the workflow.
But Intel sees the big picture, and it’s not just a GPU, or a massively indexed database, or an HDR 8K screen, it’s all that and from one end of the spectrum to the other, as the company illustrated.
|Intel’s wall-to-wall visualization span including ray tracing|
Intel has been offering the Embree ray tracing intersection software and kernel libraries since 2012, and made them open source in 2013. A comprehensive paper on Embree was published in ACM Transactions on Graphics in 2014. Embree is a major component of Intel's visualization solution tools (including OSPRay), and recently the company added AI denoising. Users can be found in animation studios, various scientific facilities, architectural design studios, and content creation for gaming and VR development.
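At its core, Embree supplies highly optimized ray-primitive intersection kernels that renderers call in their inner loops. As a purely illustrative sketch (not Embree's actual API), here is the classic Möller–Trumbore ray-triangle test that such kernels implement in vectorized, BVH-accelerated form:

```python
def sub(a, b): return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
def cross(a, b):
    return (a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0])

def ray_triangle_intersect(orig, direction, v0, v1, v2, eps=1e-8):
    """Möller–Trumbore: return distance t along the ray to the hit, or None."""
    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(direction, e2)
    det = dot(e1, p)
    if abs(det) < eps:               # ray parallel to the triangle plane
        return None
    inv_det = 1.0 / det
    tvec = sub(orig, v0)
    u = dot(tvec, p) * inv_det       # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    q = cross(tvec, e1)
    v = dot(direction, q) * inv_det  # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) * inv_det         # hit distance along the ray
    return t if t > eps else None
```

Libraries like Embree make this fast by batching rays into SIMD packets and traversing bounding-volume hierarchies, so the vast majority of triangles in a scene are never tested at all.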
One such example is Tangent Studios' use of Blender in a project for Netflix. Tangent says it used Intel's AI-based Open Image Denoise to decrease render time, and it likes Embree because it adds predictability (which helped the studio stay on time and on budget). Tangent claims it realized a 5-6x reduction in render time. That's impressive, because Tangent pushed the fidelity boundaries with Universal Scene Description (USD) support and Intel's oneAPI rendering toolkit.
Tangent said it saw a 10% overall decrease in render time, with certain frames rendering four to five times faster using Embree.
Chaos Group, one of the biggest, if not the biggest, suppliers of ray-tracing software, uses Embree and has a long list of movies its renderer was used in. So if studios find it acceptable, you can assume it's solid, fast, and affordable (as in free).
Engineering data is, and has been, visualized using Intel's visualization software for years. Intel claims its software is capable of interactive photorealism for manufacturing and customer tools. Something we didn't know until Intel told us: most Bentley cars are configured and ordered online. Bentley has developed a user tool it calls the Intelligent Car Configurator, which processes 1.7M+ images with up to 10 billion possible configurations per model. Bentley said Intel OSPRay scales on Intel Xeon processors and delivers fast, complex rendering with fewer iterations, so the company can achieve what it terms "hyper-realism." Also, AI-based image inspection of digital craftsmanship speeds turnaround for accurate configurator views.
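Ten billion configurations sounds implausible until you note that option counts multiply across categories. A toy sketch with entirely hypothetical categories and counts (not Bentley's actual configurator data) shows how quickly the state space explodes:

```python
from math import prod

# Hypothetical option categories and counts -- illustrative only,
# not Bentley's actual configurator data.
options = {
    "paint":           60,
    "wheels":           8,
    "hide_color":      24,
    "veneer":          30,
    "stitching":       15,
    "seat_style":      12,
    "trim_pack":        8,
    "personalization": 20,
}

# Independent choices multiply: the product of the per-category counts.
configurations = prod(options.values())
print(f"{configurations:,}")  # roughly 10 billion combinations
```

Eight modest menus of choices are enough to reach the billions, which is why configurators must render views on demand rather than pre-compute every combination.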
The Bentley virtual showroom consisted of 11 car models with billions of 3D primitives using over 120 GB of memory, and that was a single scene, or frame. Think of situations where one wants to show time-series animations of something similar. Animated at just 10 fps for 10 seconds, that would mean more than 1.2 TB of fast-access memory data for a TV commercial, something we take for granted. GPUs alone are simply not ready for that scale today. So Intel plans a platform rendering approach to enable the best of the CPU and GPU (or XPU, as Intel calls it) for large-scale visualization.
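Back-of-envelope arithmetic makes the point. Assuming, pessimistically, that every animation frame carries its own full copy of the 120 GB scene (shared static geometry would shrink this considerably), the footprint lands far above the article's >1.2 TB floor:

```python
per_frame_gb = 120        # one Bentley showroom scene, per the article
fps, seconds = 10, 10
frames = fps * seconds    # 100 frames for a 10-second spot

total_gb = frames * per_frame_gb
total_tb = total_gb / 1000
print(total_tb)           # 12.0 TB if nothing is shared between frames
```

Even with aggressive data sharing between frames, the working set dwarfs the onboard memory of any single GPU of the day, which is the case for Intel's CPU-plus-GPU (XPU) platform approach.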
Encouraged by its success with one of the world's most aspirational car companies, Intel has decided to roll out OSPRay Studio. It will ride on Intel's oneAPI and offer ray tracing and photorealistic rendering through an open-source OSPRay scene graph application. That will provide, says Intel, easy-to-use, high-fidelity, ray-traced, interactive, real-time rendering. Intel also promises robust scientific visualization and photorealistic rendering, and says users can visualize multiple formats of 3D models and time series.
And it's cloud-based as well, if desired. Intel says one can execute applications and optimize visualization performance using its oneAPI rendering toolkit. You can, for example, visualize and iterate on rendering with remote desktop capabilities, evaluate workloads (on the latest Intel hardware, of course), and work through guided sample applications and workshops (that capability is coming soon).
You may have heard Intel's slogan for its oneAPI: no transistor left behind. Well, Jim Jeffers, Intel's Sr. Director and Sr. Principal Engineer for Advanced Rendering & Visualization, has added to it. He says, "No transistor, pixel, or developer left behind."
Intel has also given several demos at SIGGRAPH, including one on its oneAPI rendering toolkit.