For 30 years, SPEC's SPECviewperf benchmark has been the go-to tool for comparing graphics performance across different systems, and it just got a major upgrade. The new SPECviewperf 15 includes workloads from eight modern applications, including Blender, Unreal Engine, and SolidWorks, and measures performance across the OpenGL, DirectX, and Vulkan APIs.
What's really exciting? It now captures cutting-edge technology such as advanced ray tracing, Unreal's Nanite virtualized geometry, and GPU-accelerated rendering. The best part? You can still run it without installing expensive software licenses.
As AI and neural rendering reshape graphics, SPEC promises to keep evolving this trusted benchmark to help everyone—from individual buyers to major enterprises—make smarter hardware decisions.

New graphics technologies, from advanced ray tracing to virtualized geometry, offer exciting new capabilities for application developers, design engineers, game developers, and users, all of whom crave more graphics power. However, the complexity of these technologies has made it harder for hardware vendors and system buyers to compare the performance of graphics applications on differently configured computing systems.
Fortunately, SPEC, which is celebrating 30 years of providing the SPECviewperf benchmark to the industry, has delivered a new version that keeps pace with these technical advances, enabling unbiased, vendor-neutral comparisons of application performance on the latest generation of platforms.
History of the SPECviewperf benchmark
Three decades ago, the SPEC Graphics and Workstation Performance Group (SPEC/GWPG) developed the SPECviewperf benchmark, then called Viewperf: the first benchmark to use real-world datasets (viewsets), tests, and weighting to provide consistent results across OpenGL implementations. It was also the first benchmark developed in cooperation with the independent software vendors (ISVs) behind the major graphics applications of the day, including PTC's CDRS, IBM Data Explorer, Intergraph DesignReview, Alias Wavefront's Advanced Visualizer, and the Lightscape Visualization System.
A key to the huge success of the benchmark was that it could be run without installing licenses for the represented applications, which meant vendors and system buyers could make accurate comparisons of hardware performance without the prohibitive cost of purchasing software licenses.
The SPECviewperf 15 benchmark
Over the years, the SPECviewperf benchmark has continued to evolve to provide the industry with a standard and consistent way of measuring graphics performance, keeping pace with evolving hardware, applications, and user requirements.
Today, eight modern graphics applications are represented in the SPECviewperf 15 benchmark: 3ds Max, Catia, Creo, Maya, SolidWorks, Unreal Engine, Blender, and Enscape. The benchmark now measures the 3D graphics performance of systems running under the OpenGL, DirectX, and Vulkan application programming interfaces (APIs). It can still be run without installing licenses for the represented applications, and its diverse set of modern workloads is easy to install and run and delivers high-quality, consistent results.

(Source: SPEC)

How individuals, enterprises, and vendors use the SPECviewperf 15 benchmark
Typical usage scenarios demonstrate the benchmark's value to individual buyers, enterprises, and hardware vendors alike.

The future of the SPECviewperf benchmark
The next wave of graphics technology innovation is on the way: AI and machine learning (ML) will enable more realistic graphics and more immersive experiences, and will make content creation more accessible.
These advances include improvements in neural rendering and real-time graphics.

These new capabilities will put more pressure on system performance, which means more pressure on vendors, enterprises, and individuals to make informed choices about how and when to upgrade their computing systems. SPEC is committed to continuing to update the SPECviewperf benchmark to help guide the decision-making of all these industry stakeholders.
Ross Cunniff is the chair of the Standard Performance Evaluation Corporation’s (SPEC) Graphics Performance Characterization committee. He has more than 40 years of experience in the tech industry, including nearly 25 years with Nvidia, where he serves as a systems software development manager.
Anthony Mansur is the vice chair of SPEC’s Graphics Performance Characterization Committee. He currently serves as a graphics software engineer with Intel, where he focuses on delivering performance and quality for professional workloads on Intel Arc Pro-series GPUs. He holds a master’s degree in computer graphics and game technology from the University of Pennsylvania.