News

AR, MR, XR, VR… choose your letter at CES 2024

Devices from Sony, Vuzix, Ultraleap, and Brelyon.

David Harold

CES 2024 showcased major XR developments, including Sony’s new XR headset, RayNeo’s smart glasses, Vuzix’s AR glasses, Ultraleap’s hand-tracking system, Brelyon’s immersive displays, and Apple’s Vision Pro shipping date. These devices promise impressive features and advancements in XR and illustrate the wide variety of form factors now appearing.

What do we think? We’ve already covered major XR developments recently from Xreal and Qualcomm, and giving a separate story to every XR device at CES 2024 would need a book. There were devices for mental health, blindness, and even calming astronauts. There were add-ons for smartphones and opportunities to try devices that have been out for a while. Smart glasses are establishing themselves as a form factor. And while Apple doesn’t officially do CES, they chose the show, the busiest time of the year for tech news, to announce that Apple Vision Pro will finally ship on February 2 in the US.

CES XR roundup
Sony

Sony’s press conference featured a surprise new XR headset based on Qualcomm’s new XR2+ Gen 2 chip and running Android. The launch was splashy, with hundreds of media in attendance, but getting a behind-closed-doors demo of the impressive headset was a scrum.

Sony says the headset, which offers extremely good image quality, is an 8K device, by which it means there are two 4K OLED displays, one for each eye. The display has video pass-through, but you can also quickly flip it up to see the real world in full. An oddly shaped controller and a ring are worn on either hand, making it fine for lefties like me.

Sony’s new controllers. (Source: Siemens)

The project was created with Siemens, and the demos focused on industrial and design applications. That said, the image quality is such that it could form the basis for an impressive video entertainment device as well as a tool for VFX and gaming creatives. While initial units will be sold only with Siemens NX Immersive Designer, which requires a stand-alone Windows laptop to run, we heard off the record from a related party that a more consumer-friendly, untethered variant of the platform will be announced.

Sony’s PS VR2 platform used custom silicon from MediaTek, but this new device moves to Qualcomm because its spatial computing features require separate AI processing that MediaTek is not yet able to deliver. The partnership between Qualcomm and Sony on hardware seems strong and also extends to automotive IVI and handheld gaming.

RayNeo

RayNeo announced the X2 Lite, a Qualcomm-based smart-glasses device, as well as an upcoming crowdfunding campaign for the regular RayNeo X2, which is also Qualcomm-based.

RayNeo glasses. (Source: RayNeo)

According to the company, the RayNeo X2 Lite, full-color 3D display glasses built on the Snapdragon AR1 Gen 1 platform, has AI capabilities for virtual assistant applications, real-time translation, live captions, and 3D navigation assistance. Weighing 60 g, the RayNeo X2 Lite is one of the lightest full-color AR glasses and uses MicroLED optical waveguide display technology to deliver 3D visuals on slim, transparent lenses with a 30-degree field of view. A compact full-color projector fits on each side of the frame, delivering to-eye brightness of up to 1,500 nits and adapting to various lighting conditions, including harsh sunlight. There’s a 12MP camera for invading people’s privacy.

Vuzix

In addition to displaying its new CES 2024 Innovation Awards-winning Vuzix Ultralite S AR smart glasses OEM design, Vuzix demonstrated its M series, Vuzix Shield, and forthcoming AI optimization wearable at its booth. Aimed in part at sports and fitness users, these glasses boast a sleek design that delivers hands-free, wireless connectivity to information from the wearer’s smartphone or smartwatch. Vuzix Ultralite S also employs Vuzix Incognito technology, which the company says virtually eliminates the eye glow or forward light found with other waveguide-based solutions. Vuzix was joined by Japanese telecommunications giant and Vuzix partner NTT QONOQ (a subsidiary of NTT DOCOMO) to show how their mixed reality solution, NTT XR Real, is enabling the connected workforce in Japan.

Vuzix glasses. (Source: Vuzix)

“Whether for enterprise, medical, defense, or, ultimately, consumers, our focus is on offering the solutions needed to optimize the workplace and connect workers with technologies to make them more effective in what they do,” said Paul Travers, president and CEO of Vuzix.

Ultraleap

Ultraleap announced that it’s working with Prophesee to develop what it says is the world’s first system to combine event sensors and hand tracking in AR devices. This is proposed as a solution to the unsolved challenge of AR devices that are always on, worn everywhere, and require a higher level of interaction. The system pairs Ultraleap’s computer vision and machine learning models with Prophesee’s GenX320 event-based Metavision sensors. Power consumption is said to be a fraction of that of existing approaches, enabling an always-on AR device that can track the user’s hand interactions natively for longer. Ultraleap also says its models for event sensors are more robust to adverse lighting conditions and have lower latency. Furthermore, the company claims that all of these features come with increased privacy, as event sensors are not frame-based and capture only pixel-movement information.
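
To make the frame-based versus event-based distinction concrete, here is a minimal, hypothetical Python sketch. It is not Ultraleap’s or Prophesee’s actual API or pipeline, and the event counts and byte sizes are illustrative assumptions only; it simply shows why a sparse stream of per-pixel change events moves far less data than full camera frames.

```python
# Conceptual sketch only: NOT Ultraleap's or Prophesee's API or pipeline.
# It contrasts the data volume of a conventional frame-based camera with a
# sparse event stream, which is the basis of the power and latency claims above.

from dataclasses import dataclass
from typing import List


@dataclass
class Event:
    t_us: int      # microsecond timestamp
    x: int         # pixel column
    y: int         # pixel row
    polarity: int  # +1 = brightness increase, -1 = decrease


def frame_stream_bytes(width: int, height: int, fps: int, seconds: float) -> int:
    """Bytes produced by a conventional 8-bit grayscale camera over `seconds`."""
    return int(width * height * fps * seconds)


def event_stream_bytes(events: List[Event], bytes_per_event: int = 8) -> int:
    """Bytes produced by an event sensor; only changing pixels emit events."""
    return len(events) * bytes_per_event


if __name__ == "__main__":
    # Hypothetical numbers: a hand gesture triggering ~50,000 events in one
    # second on a 320x320 sensor (the GenX320's resolution).
    events = [Event(t_us=i * 20, x=i % 320, y=(i // 320) % 320, polarity=1)
              for i in range(50_000)]
    frames = frame_stream_bytes(320, 320, fps=60, seconds=1.0)
    sparse = event_stream_bytes(events)
    print(f"frame-based : {frames:,} bytes/s")
    print(f"event-based : {sparse:,} bytes/s (~{frames // sparse}x less)")
```

The point of the comparison is that a static scene produces almost no events, so downstream hand-tracking processing can stay mostly idle until pixels actually change, which is consistent with the always-on power claims above.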

While not cited in the press release, Qualcomm told us they continue to work with Ultraleap’s hand-tracking technologies too.

Brelyon

Does AR mean just headsets? Not according to MIT spinout Brelyon, which unveiled a series of new displays—including the next generation of its flagship technology, Brelyon Ultra Reality—that the company claims will enable new possibilities for experiencing immersive displays in the car, at work, or at play.

“Immersion doesn’t have to solely equate to headsets,” said Barmak Heshmat, CEO of Brelyon. “We are seeing rapidly growing demand for immersive applications outside wearable displays.” You can hear a lot more from Heshmat here.

What’s the pitch? The company says it has made significant advances in optical depth perception. With multi-depth, Brelyon has developed a new approach to displays—using machine learning, time-folded optics, and computational wavefront engineering designed to work with the physiology of the human eye—to address the long-standing challenge of depth perception.

Brelyon Ultra Reality production model (left) and the Ultra Reality Mini (right). (Source: Brelyon)

Computational wavefront optics typically employs a spatial light modulator, hardware that enables deconvolution-style correction in high-resolution adaptive optics systems.
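
For background, the standard Fourier-optics description of a phase-only spatial light modulator is a useful reference point here; the relation below is textbook material, not a disclosure of Brelyon’s specific implementation.

$$U_{\text{out}}(x, y) = U_{\text{in}}(x, y)\, e^{i\phi(x, y)}, \qquad \phi_{\text{lens}}(x, y) = -\frac{\pi}{\lambda f}\left(x^{2} + y^{2}\right)$$

The modulator multiplies the incident field $U_{\text{in}}$ by a programmable phase pattern $\phi$; loading the quadratic profile $\phi_{\text{lens}}$ makes it behave like a lens of focal length $f$, which is one conventional way such hardware shifts the apparent depth of an image.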

Apple

Apple chose the week of CES, when lots of competitive products were in the news, to announce its Apple Vision Pro will be available beginning Friday, February 2, at all US Apple Store locations and the US Apple Store online.

Vision Pro’s presence could be felt all over CES. Although Qualcomm was keen to stress to me that they have been using the term spatial computing for five years, there is no mistaking how the conversation has changed since Apple’s announcement. While it’s not as draconian as Apple’s ban on Vision Pro apps calling themselves AR/VR or XR, the industry is self-policing a shift away from such terms and toward spatial computing for everything.

An all-new App Store provides users with access to more than 1 million compatible apps across iOS and iPadOS, as well as new experiences that take advantage of the unique capabilities of Vision Pro. We pity the Apple employee charged with making sure none of those apps use oldspeak.

“The era of spatial computing has arrived,” said Tim Cook, Apple’s CEO.

“Yup,” said everyone else.