GPU

The big potential of training and education games

The global video game industry has many game types, genres, platforms, business models, and use models. There are so many permutations and combinations that people often disagree about what is what. There is even debate about the definition of what constitutes a video game. I have always viewed simulation and interactive training/education as part of … Read more

AI accelerators and open software transform the computing landscape

Three years ago, we had maybe six or fewer AI accelerators; today there are more than two dozen, and more are coming. One of the first commercially available AI training accelerators was the GPU, and the undisputed leader of that segment was Nvidia. Nvidia was already preeminent in machine learning (ML) and deep learning (DL) applications, and adding neural net acceleration was … Read more

SiliconArts' new ray tracing chip and IP

Founded in 2010 in Seoul by Dr. Hyung Min Yoon, formerly at Samsung, Hee-Jin Shin from LG, Byoung Ok Lee from MtekVision, and Woo Chan Park from Sejong University, SiliconArts took on the formidable task of designing and manufacturing a ray tracing hardware accelerator co-processor, which they called RayCore. The company showed its first implementation in an FPGA in 2014, … Read more

Intel’s Gen 11 GPU

Intel is bragging about the low-power 10 nm Gen 11 integrated GPU in its new Ice Lake processors. And they should brag: look at all the stuff they've crammed into Ice Lake. The company announced its Gen 11 integrated GPU in early August, and now it is shipping the Core processors that carry the new GPU design. … Read more

Famous Graphics Chips: Matrox MGA

Dorval, Canada-based Matrox is the oldest continuously operating graphics add-in board (AIB) company in the world; it started in 1976, before IBM introduced the PC. Matrox's first AIB was the ALT-256 for S-100 bus computers, released in 1978. ATI started seven years later (also in Canada), and eight years after that Nvidia started. Hercules developed their AIB in 1982, … Read more

Cerebras reveals world’s ‘largest computer chip’ for AI tasks

At Hot Chips, California-based Cerebras Systems showed the world's largest computer chip, the Wafer Scale Engine, which is slightly bigger than a standard iPad. The firm says a single chip can drive complex AI systems in everything from driverless cars to surveillance software. Founded in 2016 by CEO Andrew Feldman and Sean Lie, who previously founded SeaMicro (that AMD … Read more

Famous Graphics Chips: Nvidia’s RIVA 128

It wasn't an April Fools' joke when Nvidia released the RIVA 128 on April 1, 1997, based on the NV3 media accelerator. However, it was almost the company's last gasp. The story begins in 1993, when Nvidia was founded in Sunnyvale. The company's founders had a novel idea for a graphics accelerator based on quadratic surfaces, a significant departure … Read more

The end of seasonality

The world as we know it is over. Dead, done, gone. We no longer have anything to count on, or a method with which to count. This is what the world used to look like. The direction has been a little depressing, but the patterns have been set; we know what's in store, year after year. Except for this … Read more

Think Silicon secures investment round

Think Silicon, based in Patras, Greece, was founded by George Sidiropoulos and Iakovos Stamoulis in 2007 with the vision of supplying configurable IP semiconductor modules for complex SoCs. The company went on to develop IP for a line of powerful, low-power GPUs and has built a following over the years. Last week Metavallon VC announced it is leading the first … Read more

Intel goes steampunk with new GPU AIB

As you know, I am a very close follower of anything that moves or mangles a pixel, and over the years Intel has mangled lots of pixels. I've been closely monitoring all the leaks, tweaks, tweets, and feats from and about Intel's dGPU effort, sometimes referred to as Xe and sometimes called Odd-oh-see. Recently, while visiting Intel, I managed to, ah, … Read more