Once a humble graphics chip, the GPU has become technology’s most versatile workhorse. Born from the 1987 VGA controller’s pixel-pushing dreams, it evolved through gaming’s demands for ever-prettier pixels. Like a mighty river absorbing tributaries, the GPU assimilated everything: DSP vector engines, Google’s TPU matrix magic, smartphone AI accelerators—all swallowed whole. Today’s monsters pack 4 trillion transistors and power everything from your phone to data-center behemoths. The question isn’t whether CPUs will be absorbed next—history says they will—but whether we’re heading toward Cerebras-style mega-chips or AMD’s chiplet constellations. Either way, the GPU’s appetite remains insatiable.

Resistance is futile; you will be assimilated.
As the mighty GPU powers more of our lives, from driving our cars and connecting us to the world through our smartphones to entertaining us with stunning game simulations and consuming every available watt in the data center, a moment of reflection on how this technological weed has taken over seems appropriate.
The idea of manipulating individual pixels emerged with the VGA controller in 1987. That gave rise to the adoption of serious graphics in games. That, in turn, drove the demand for more control, higher resolutions, and more colors, leading to the GPU with multiple pipelines, soon to be renamed “shaders.” The size and quantity of shaders expanded as fast as Moore’s Law would allow (it was, after all, the law).
The idea of a dedicated graphics processor spread to other platforms, and stand-alone graphics coprocessors found homes in mobile phones. The SoC, enabled by Moore’s Law alongside the GPU, continued the path of assimilation and technology sharing. SIMD gave way to SIMT, and color LUTs were abandoned in favor of unified programmable shaders in the early 2000s.
Simultaneously, DSPs, which had been esoteric music and image engines, added vector capabilities, enabling them to perform matrix math, and were then assimilated into what would become the NPU.
Google produced the first dedicated matrix engine for AI with its TPU in the mid-2010s, and the idea was assimilated into GPUs by 2017.
In the mid-to-late 2010s, the first dedicated AI engines began to appear as vision processing units (VPUs). Within three years they were assimilated into smartphone SoCs, but they established the beginning of dedicated AI processors.
Figure 1. The evolution of the GPU.
Like the Mississippi, the Volga, the Yangtze, and the Nile, technology has flowed into the GPU over its 27-year history. And although physical VGA sequencers gave way to the UEFI Graphics Output Protocol (GOP) in 2011, today’s GPUs still carry a heritage going back to 1987.
Today, we have powerful x86 and Arm CPUs with integrated GPUs that are entering the AI market in the form of the AI PC. At the same time, we have giant GPUs being tightly coupled with Arm, RISC-V, and x86 processors.
It seems that if history is any teacher, it’s just a (short) matter of time until the Arm or RISC-V CPU is assimilated into the mighty GPU. It’s a matter of size and memory. The industry is at a crossroads: ever-larger monolithic systems like those from Cerebras and Nvidia versus chiplets on a bed of interposers from AMD and Intel. Hybrid approaches are already being tested, along with the integration of area-hungry SRAM to try to break the memory bandwidth barrier.
We have 40 years of assimilation and integration to help us predict the future. You could mark T-0 with the integration of the FPU into the 80486 in 1989, or maybe the GPU into an iGPU in Intel’s Westmere architecture in 2010. And if Cerebras is an example, there doesn’t seem to be an asymptote on how far assimilation can go.
Resistance IS futile; you WILL be assimilated.

The era of AI processors is now upon us, and we have tracked and captured it since 2016. You can see who’s who, what they make, how much money they’ve raised, how big the market is and what’s forecasted, and much more in our 2026 AI processor report with its companion database.