I got a rare demo of Sony's new XR headset at CES 2024 and here's what I learned
In one of the untidy corridors at the back of the Las Vegas Convention Center -- about a football field's distance from the splashy, pristine Samsung and LG booths that define CES -- I slipped through a metal door with a hastily scribbled note taped to it warning people not to enter because a media briefing was in progress.
In this nondescript room -- furnished with a handful of folding chairs and tables, a projector screen, and a high-powered laptop with a head-mounted display attached by a long black USB-C cable -- I got a private demo of the product that was the biggest surprise of CES 2024. I also got a one-on-one opportunity to chat with a handful of proud employees involved with the product.
Sony's XR headset burst onto the scene at the opening keynote of CES 2024 on Monday night, just a few hours after Apple announced availability and preorder information for its long-anticipated Vision Pro headset. The Sony headset is the first to use Qualcomm's XR2+ Gen 2 chip, announced just last week, which aims to enable hardware makers to produce high-end XR devices that compete with Apple Vision Pro. Most of these Qualcomm XR2+ Gen 2 headsets will be powered by Android, and Sony confirmed to ZDNET that its headset is also running Android.
The headset features an 8K display made up of a 4K OLED microdisplay for each eye. It also includes a pair of wearable controllers -- a ring and a pointer -- that allow for precise selection and effective manipulation of 3D objects in virtual spaces. The idea is to hold the pointer in your dominant hand and wear the ring on your other hand. The headset also features a flip-up display so that you can easily hop out of a session and hop back in -- a feature I wish every headset offered.
What it's for and how it will work
My interviews with Sony and Siemens confirmed that the headset is primarily focused on two things:
- Helping engineers, designers, and product leaders accelerate product development by creating "digital twins" (virtual versions of real-world products) to streamline creation, collaboration, and product approval.
- Empowering the next generation of content creators to build immersive spatial experiences, including 3D and XR entertainment that will ultimately populate Vision Pro, Meta Quest, and other headsets and smartglasses.
The Sony headset doesn't even have an official name yet, and there's a good reason why.
It will not be sold as a standalone device, but as part of a product package with Siemens NX Immersive Designer -- software that organizations will use to create, collaborate on, iterate, and manipulate product designs across global teams. That could save enormous amounts of time and resources by replacing long flights for employee travel and the mailing of physical prototypes around the world with collaboration on "digital twins" in virtual spaces.
Sony will also have other partners who will package their software platforms with the headset for sale in specific verticals and industries. Sony emphasized the entertainment industry, where it has deep roots, and said it plans to use the headset as a "spatial content creation platform" that puts tools into the hands of filmmakers, game creators, and other storytellers to build a new wave of immersive content. This will also include Sony's proprietary motion capture system, "mocopi," which can do full-body tracking -- making the headset a powerful platform for 3D computer graphics and animation.
My first impressions of the Sony XR headset
When I slipped on the headset for the first time, I found it comfortable and very well balanced. The head pads weren't quite as soft and cushiony as the Vision Pro's, but the headset wasn't nearly as heavy either. I immediately loved the flip-up display -- just as much as the one on the Lynx R1 (which Sony clearly drew inspiration from), and even more than flip-up face straps like the BoboVR for Meta Quest.
The virtual environment that I hopped into felt like the wing of a museum, with columns and sculptures and a very pristine feel. Floating in front of me were about half a dozen objects -- a camera (a Sony Alpha, of course), a sheet of type that looked like the page of a large book, a photograph, an art print, a page of science equations in white type on a black background, and a couple of other things.
I was able to grab the camera, spin it around, and hit a button to take a photo of anything in the digital environment. The photo I captured would then float above the camera in the virtual space. I could also grab the page of text and read it clearly from about 2 feet away down to about 6 inches from my face. The text was perfectly sharp and clear. That's where the 8K display showed its value.
Overall, the fidelity and resolution of the screen were stellar -- very sharp, high contrast, good color -- and rendering was incredibly smooth and fast when I turned to look at different things. I've used a lot of different headsets, and the quality and performance of Sony's headset are matched only by the Apple Vision Pro and the Varjo XR-4.
Let's talk for a moment about the wearable controllers. The ring and pointer offer very precise pointing as well as intuitive buttons for selecting things. The difference between using them and using hand tracking is like the difference between gross and fine motor skills -- between picking up an apple and picking up a penny. There are plenty of things people will be able to do with hand tracking alone, but for more precise work, the ring and pointer make a lot of sense.
One of the coolest things I discovered was that the pointer slides over your finger and its main body sits in your palm while you're using it in XR. But when you switch to a keyboard at your computer, the pointer slides around to rest on the back of your hand. That felt very natural and intuitive. My only small complaints were that I wished the ring were a little looser and more adjustable around my finger, and that the pointer fit a little more snugly around my index finger.
One thing I didn't expect was that the headset was tethered to a laptop. While the headset has its own processor and runs Android, Sony said it also needs the USB-C connection to a laptop to power the Siemens NX Immersive Designer software. Sony calls this "split rendering" and says it balances the load between the headset and the computer, allowing for stable rendering of high-resolution 3D objects.
Lastly, I will note that this was a brief demo with only a few simple scenarios. It wasn't nearly as in-depth or as varied as the Apple Vision Pro demo I did at WWDC 2023. Nevertheless, the fidelity and smoothness of the user experience were quite impressive. And as I've already written, Vision Pro could learn from what Sony has created to better accommodate professionals.