If the PC is dead, how come so many companies are making money from it?
The PC industry has been exciting since the arrival of the Intel 4004 and Gordon Moore's 1965 observation about the doubling of transistor density in semiconductor manufacturing.
The industry has just reported its 2016 third-quarter results, and what terrific results they were, for everyone. AMD reminded us that PCs are not the only game in town, with its results bolstered by console sales, even though consoles share the same x86 DNA as PCs. But regardless of AMD's great console results, more people use PCs for gaming, measured in both dollars and units.
Nvidia reported its best-ever quarter, with huge gains in PC gaming, strong sales in professional graphics, and growing revenue from datacenter, automotive, and everybody's favorite, deep learning and AI. Those markets aren't huge yet, but Nvidia, in diversifying beyond the PC, has built a commanding lead in them.
The deep-learning software market is projected to surpass $10 billion by 2024, according to a report put out by Tractica.
Tractica’s analysis indicates that rapid advances in the field are now being spurred by three key market trends:
- Vast increases in the amount of data
- Significant improvements in machine-learning algorithms
- Exponential advances in hardware speed
Nvidia now holds a prominent position in the supercomputer AI/DL arena: its DGX-1 just entered the top echelon, hitting 4.9 PFLOPS to become the 26th-fastest supercomputer in the world today. The important point here is that this is an Nvidia-built machine, not a university or government machine that merely uses Nvidia parts (see story, this issue).
Intel, AMD, and others (including Nvidia) can claim their processors are used in one or more of the top machines. Intel is enjoying that bragging right at the moment in China's Tianhe-2, built with Xeon Ivy Bridge processors and Xeon Phi coprocessors (three per node).
Intel, of course, has a stake, and a historical one, in the emerging AI and DL markets. The company is taking clear aim at servers and AI/ML/DL, where big data is one of the main drivers. It is offering its Xeon Phi, and FPGAs from its recent acquisition of Altera, as an alternative to GPUs. And Intel's long-standing partner and pal Microsoft has started rolling out Altera FPGA-based machines in its Project Catapult, the world's largest deployment of FPGAs in the datacenter. It is designed to bring AI and DL capabilities to companies that can't afford a big machine of their own, or that have only one or two programs they need to run.
And everybody and their dog is giddy about IoT and all the data it's going to spew out, data that will require lots and lots of servers and machine-learning algorithms to separate the important stuff from the noise.
IoT is just a buzzword for connected sensors whose data can be easily accessed and analyzed. Those sensors include the kind that convert photons into electrons and can, with a little algorithmic magic, create an image of you being in the wrong place at the right time (proving you didn't kill that cat), although heaven knows we could use a few fewer cats on the web.
Camera sensors are the largest population, and the most data-spewing sensors out there, clogging up our networks and filling up our servers. That, of course, is music to Dell's ears, since it now offers the biggest suck-'em-up data machines in the world, and customers like Microsoft and Amazon can't seem to snap them up fast enough. We've heard a few three-letter agencies in the U.S., Europe, the U.K., and Australia also have their checkbooks out. China and Russia make their own, from designs they, well …
So the tone and meme of computing have shifted, and that's a good thing. Today, and going forward, we will report on, and read about, the applications of these wonderful little processors we're building, and not so much the processors themselves, a philosophy and marketing program Qualcomm adopted years ago, to the consternation of component counters like us.
Our work (here at JPR central) will be a bit more challenging. We'll have to look at a huge number of sources and tease out which components are inside the systems they describe. And, time permitting, if we do our job right, we'll also be able to find out why a given component was chosen.
So not only is the “PC” market (a term long in need of changing) not dying, it's getting more robust, more diversified, and more invisible as it burrows into every element of our lives. PCs aren't just WIMPs (windows, icons, menus, pointer) anymore; now they are big brother, little brother, and that annoying sister of yours.