On my head, the Samsung Galaxy XR looks like something between a Meta Quest and Apple’s Vision Pro. It’s big, weird and definitely not something I’d wear outside. But this expensive new mixed reality headset, a partnership between tech giants Samsung and Google, is just the first phase of their plan.
The next step is smart glasses designed for everyday wear, inside and out.
Something’s happening inside the Galaxy XR that’s unlike any other VR headset I’ve used. It’s eye-opening and maybe a little scary. Gemini AI can see what I see — and not just around me in the real world through cameras embedded all over the headset. As my second set of eyes, Gemini AI can also see the virtual screens I’m browsing. It’s easy to imagine it living invisibly inside innocuous glasses, too.
And that’s exactly the plan, according to Samsung’s COO of Mobile Experiences, Won-Joon Choi, and Google’s head of Android, Sameer Samat. In exclusive conversations, they told me what’s coming next. We still don’t know the price and availability of the next wave of smart glasses, but we know they’re coming.
Google and Samsung are working with Warby Parker and Gentle Monster to make AI glasses that would compete with the Ray-Ban and Oakley glasses Meta builds with EssilorLuxottica. The Galaxy XR is a doorway to that — a hint, really, of how AI will evolve not just on glasses but between glasses, phones, watches, rings and more.
Figuring out contextual AI that can “see”
Meta and Google have discussed so-called contextual AI as the next big step forward in assistive AI agents. Contextual, in this sense, means being aware of more things that you’re doing: apps you’re using, places you’re visiting and, most significantly, what you’re looking at or listening to in the moment.
Meta’s glasses have Live AI modes that can tap into the camera and microphones. Google’s doing the same on the Galaxy XR headset with Gemini AI, while also adding awareness of open apps, virtual experiences and the real-world environment. We’ve already seen hands-on demos of what Google and Samsung’s future glasses can do.
“We believe that XR headsets and glasses are the most natural devices that can realize this multimodal AI,” says Samsung’s Choi. “We are setting the foundation of upcoming new waves, of different forms people are likely to use in their daily lives.”
In the short term, smart glasses, not elaborate and expensive VR headsets, are likely to be the most affordable and accessible contextual AI products.
“We didn’t create this OS just for the headset,” Choi says of Android XR, noting that it will also cover glasses.
“Glasses are a really important form factor for the future of AI,” says Google’s Samat.
The Galaxy XR can understand both the world around you and what’s on your virtual screens, a hint at how glasses could eventually make sense of the everyday world and any nearby device screens in view.
“You’ll be looking through your AI glasses at other computing surfaces,” Samat says. That relationship is what needs the most work to figure out. “There’s cooperation between the glasses and these other computing services, which are really interesting, in particular the phone.”
Samat points to phones powering the computing on glasses, with the glasses filtering information from the phone much like a smartwatch does, but in a deeper and perhaps more interactive way.
Choi also sees glasses working with phones to power the processing, something Google’s already exploring with Xreal via Project Aura. Qualcomm’s work on trying to build glasses powered by phones also comes into play: Snapdragon Spaces, Qualcomm’s phone-to-glasses software conduit, is part of Android XR.
Glasses could connect through wearables, watches and rings
Unlike the touchscreen of a phone, smart glasses don’t have an obvious interface for control. Connected peripherals for future glasses could make them feel less awkward.
Meta’s Ray-Ban Display glasses introduced a custom-made neural wristband for gestures, but that band doesn’t yet do any other useful watch-like things. Samsung and Google, meanwhile, have watches and rings galore, and signs are strong that they’ll integrate them into the glasses picture soon.
Though Android XR is a new category, Choi says it will be integrated into the Galaxy ecosystem, “meaning that whatever you do in the smartphones, watches, glasses or headsets, they will be connected.”
Choi also acknowledges that watches with wrist-worn screens would be good accessories for display-free glasses. “Suppose you have glasses without a display. We do have a lot of other devices, even wearable devices, that have a display so you can utilize that.”
Expect fitness and health to be a big focus
I’m still surprised that the Samsung Galaxy XR doesn’t seamlessly connect with WearOS watches or Fitbits to track health in exercise apps. But you should expect Android XR smart glasses to push more health and fitness functions.
Meta has already begun connecting the dots with Garmin watch integrations, Strava and the sports-focused Oakley Meta Vanguard. Samat and Choi see similar possibilities for Google and Samsung products.
While Samsung’s Choi sees VR headsets like the Galaxy XR as still needing to overcome hurdles before they can integrate fitness well, he considers glasses a better fit.
“When we launch glasses, I think we have applications related to fitness and health — for example, when you bike.” There’s also interest in nutritional tracking via glasses, using them to input calories or help scan food items.
A prototype for future AR glasses, and maybe more AI models
Google’s Samat admits that the $1,799 Galaxy XR aims as much at developers and businesses as it does at consumers.
“It’s not something you’d wear out and about, but from a development platform standpoint, it gives you the same inputs that you might want to utilize if you were building for a pair of AR glasses,” Samat says. “We will encourage folks to utilize it in that way.”
I’m also curious whether Android XR will support other AI models for glasses, beyond Gemini. Right now, at least, Google is making Gemini the main AI option, but that could change down the road.
“Android’s always been built in an open way, so we definitely believe that there will be others that come and build things and in all these environments. On the phone, you can clearly see many different AI technologies. I don’t think XR will be different in that regard,” says Samat. “I believe, like I said, we’re at the beginning of this.”