TechWatch

Qualcomm enters AI cloud-based inference market

Arm-based device with DSP-based NPU and near-memory computing.

Jon Peddie

Qualcomm, best known for its phone chips, is stepping into the data center market. On October 27, 2025, the company unveiled the AI200 and AI250, inference accelerators due commercially in 2026 and 2027, respectively. Built around the Hexagon NPU from its mobile SoCs, the chips pair compute with near-memory architecture to cut power consumption and speed up inference workloads, positioning Qualcomm as a challenger to Nvidia's dominance in AI accelerators. Qualcomm's shares rose 11% on the news, and Saudi AI start-up Humain was named as the first customer. Qualcomm announced its AI200 and AI250 accelerator chips for data center AI inference, with commercial
...
