What I saw at Siggraph
Posted by Jon Peddie on August 14th, 2017
Ever new and ever magical
This was my 38th Siggraph, and it was as exciting as the first one (well, maybe not quite the first one), but pretty damn exciting nonetheless. Siggraph continues to attract imaginative and sometimes astonishing people and ideas, and over the years it has maintained its fresh, “look at what I did” personality. That’s partially due to the constant influx (for 44 years now) of people who didn’t know it couldn’t be done, so they did it.
Almost 17,000 people came to Siggraph this year (up 21.4% from last year), along with 190 exhibitors (up 27% from last year). It’s impossible to summarize five days of exhibits and meetings, so here’s the top-of-mind stuff. I sat on a bench with a farting elephant, had my mind read by a VR headset, watched people dance while painting in the air, saw ray tracing happen in under a second, met an award-winning rock-climbing photographer and a student who made a cardboard AR viewer, and listened to stories from the oldest animator in the world.
Anyone who missed hearing about AMD this week has been in a coma or on a desert island without WiFi. The company showed its long-awaited 16-core Ryzen Threadripper 1950X and 12-core 1920X processors, as well as the new Vega 64 and Vega 56 graphics AIBs, putting AMD in a seriously competitive position against Intel and Nvidia. The stock market and Intel have already reacted. Want to see your hands and know where you are? Usens has added SLAM to its hand- and head-tracking gesture system. By combining 6DOF sensors with visual and IR sensors, the company can tell you and your HMD where you are, where you’re looking, and where your hands are. This is clever, tricky stuff, and deserves an article in its own right.
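To give a flavor of why fusing inertial and optical tracking is tricky: a high-rate 6DOF IMU drifts over time, while a visual/IR fix is absolute but slower. A classic textbook way to combine the two is a complementary filter. This is purely an illustrative sketch, not Usens’s actual SLAM pipeline, and all names in it are hypothetical:

```python
# Illustrative complementary filter for fusing a drifting inertial estimate
# with an absolute visual fix. A generic textbook technique, NOT Usens's
# actual SLAM implementation; names and values here are hypothetical.

def fuse(imu_estimate: float, visual_fix: float, alpha: float = 0.98) -> float:
    """Mostly trust the fast IMU; let the visual fix slowly pull out drift."""
    return alpha * imu_estimate + (1 - alpha) * visual_fix

drifted = 1.2    # IMU-integrated position that has drifted from the truth
true_fix = 1.0   # absolute position reported by the visual/IR tracker

for _ in range(200):
    drifted = fuse(drifted, true_fix)

print(round(drifted, 2))  # ≈ 1.0: the drift has been corrected away
```

Real SLAM systems use far more sophisticated estimators (Kalman filters, bundle adjustment), but the core idea of blending a fast, drifting sensor with a slow, absolute one is the same.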
If you want an AR or VR HMD, it’s gotta be able to see. Qualcomm showed advances in image processing and AI for VR and AR (which it is calling XR) HMDs. The company introduced the next generation of its Spectra ISP, which uses the Snapdragon 835’s DSP, some of the GPU and CPU, and specialized state engines to manage the myriad parameters associated with capturing, correcting, and stabilizing an image. It is being adopted by several HMD makers.
Intel was back at Siggraph, and among the many things on display was its RealSense technology as used in virtual reality. The company gave several software presentations on topics such as reducing Open Shading Language render times with SIMD, the Embree ray-tracing kernels, the OSPRay ray-tracing rendering engine for high-fidelity rendering and visualization, and AI as the next evolution for graphics and VR.
Light-field technology was also in view: Avegant had its new HMD on display and spoke about how it, and light-field technology in general, will change the future of AR and mixed reality.
I was never any good at dominoes till I met a robot. Nvidia showed a semi-smart robot with vision capability fumbling at dominoes with visitors at the show; I think it was programmed to lose to make the participants feel good.
And I met Mike, literally and figuratively. Mike is a virtual talking head, the result of a collaboration of seven companies worldwide (and powered by nine PCs, each with 32 GB of RAM and an Nvidia 1080 Ti AIB). Virtual Mike was rendered and presented in stereo VR at 90 fps. MEETMIKE has about 440,000 triangles rendered in real time, which means delivering a stereo VR frame about every 9 milliseconds; roughly 75% of those triangles are used for the hair. The face rig uses about 80 joints, mostly for the movement of the hair and facial hair. The face mesh itself uses only about 10 joints, for the jaw, eyes, and tongue, to add more of an arcing motion. These work in combination with around 750 blendshapes in the final version of the head mesh.
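The figures quoted above can be sanity-checked with back-of-envelope arithmetic; the numbers below come straight from the article:

```python
# Back-of-envelope frame-budget math for the MEETMIKE figures quoted above.
FPS = 90               # stereo VR presentation rate
TRIANGLES = 440_000    # triangles rendered per frame
HAIR_SHARE = 0.75      # ~75% of the triangles are hair, per the article

frame_budget_ms = 1000 / FPS            # time available per stereo frame
hair_triangles = int(TRIANGLES * HAIR_SHARE)

print(f"frame budget: {frame_budget_ms:.1f} ms")  # 11.1 ms
print(f"hair triangles: {hair_triangles:,}")      # 330,000
```

Note that 1/90 of a second is about 11.1 ms, somewhat looser than the ~9 ms cited; the gap presumably leaves headroom for tracking and compositor overhead.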
The system uses complex traditional software design along with three deep-learning AI engines. Mike’s face is captured with a state-of-the-art Technoprops stereo head rig with IR computer-vision cameras. Mike was a collaboration of teams from around the world, spanning four continents, three universities, and six companies, along with the Wikihuman global research project.
SIGGRAPH is for me the most exciting conference there is. My second choices are FMX, Laval VR, GTC, and GDC. Do you think I’m in a rut—a rut of pixels?