I Witnessed the Future of Smart Glasses at CES. And It’s All About Gestures


In a corner of the bustling show floor at CES 2025, I felt like an orchestra conductor. As I slowly waved my arm from side to side, cello notes played out on the giant screen in front of me. The faster my arm moved, the faster the bow slid across the strings. I even got a round of applause from fellow attendees in the booth after a quick performance.

This is what it feels like to use the Mudra Link wristband, which lets you control devices with gestures. Motion controls are nothing new; I've been trying touchless controls since 2014, with devices like the Myo gesture armband. The difference today is that gadgets like this have a stronger reason to exist thanks to the arrival of smart glasses, which seemed to be everywhere at CES 2025.

Startups and big tech firms alike have been trying to make smart glasses happen for more than a decade. The advent of AI models that can process speech and visual input together, however, has made them feel more relevant than before. After all, a digital assistant can be even more helpful if it can see what you're seeing and answer questions in real time, which is the idea behind Google's Project Astra prototype glasses. Shipments of smart glasses were expected to grow 73.1% in 2024, according to a September IDC report, a further sign that tech-equipped spectacles are starting to catch on.

Read more: Nvidia CEO Explains How New AI Models Will Power Future Smart Glasses

Check it out: These New Smart Glasses Want to Be Your Next AI Companion

Last fall, Meta showed off its own prototype pair of AR glasses, called Orion, which is controlled by gestures and a neural-input wristband. At last year's Augmented World Expo, a conference dedicated to AR, other startups showed similar experiments.

At CES, it became clear that companies are thinking about how we'll navigate these devices in the future. In addition to the Mudra Link wristband, I found a couple of other accessories designed to be used with smart glasses.

Take the Afference Ring, for example, which applies neural haptics to your finger to provide tactile feedback when using gesture controls. It’s intended for devices like smart glasses and headsets, but I got to test a prototype of it paired with a tablet just to get a feel for how the technology works.

In a demo, I played a simple mini golf game that required me to pull my arm back to putt and then release to launch the ball. As I pulled back, the haptics on my finger got stronger. Toggling brightness and audio sliders worked the same way; as I turned up the brightness, the sensation in my finger grew more pronounced.


The Afference ring provides haptic feedback to your finger.

Nic Henry/CNET

It was a simple demo, but one that helped me understand how companies might apply haptic feedback to menus and mixed reality apps. Afference didn't name any specific partners it's working with, but it's worth noting that Samsung Next is participating in Afference's seed funding round. Samsung launched its first smart health-tracking ring last year and announced in December that it's building the first headset for the newly announced Android XR platform for future mixed reality devices.

The Mudra Link wristband works with the newly announced TCL RayNeo X3 Pro glasses, which launch later this year. I briefly tried the wristband to scroll through an app menu on the RayNeo glasses, but the software wasn't finished.

I spent most of my time using the wristband to manipulate graphics on a giant demo screen at the conference. The cello performance was the most impressive demo, but I was also able to grab and stretch the face of a cartoon character and move it around the screen just by waving my hand and pinching my fingers.

Halliday’s smart glasses, also unveiled at CES, work with a companion ring for navigation. Although I didn’t get to test the ring, I did use the glasses for a while to translate language in real time, with text translations appearing instantly in my field of view even on the noisy showroom floor.


The Halliday smart glasses put a small screen in your field of view, and you can navigate the device with the companion ring.

James Martin/CNET

Without gestures, there are typically two main ways to interact with smart glasses: touch controls on the device itself and voice commands. The former is good for quick interactions, such as swiping through a menu, launching an app or dismissing a call, while the latter is useful for making calls and commanding virtual assistants.

Gesture controls make it easy to navigate interfaces without having to raise your hand to your face, speak out loud or hold an external controller. However, there is still a certain awkwardness that comes with using gestures to control a screen that is invisible to everyone but the person wearing the glasses. I can’t imagine waving my hands in public without any context.

Meta is already moving toward motion-controlled glasses, and its CTO, Andrew Bosworth, recently told CNET that gesture controls will likely be required for any future pair of display-enabled glasses.

If CES is any indication, 2025 is shaping up to be a big year for smart glasses, and gesture controls will undoubtedly play a role in how we navigate these future spatial interfaces.

