    Google’s Putting It All on Glasses Next Year: My Demos With Project Aura and More

    What if I told you that, just a couple months after Google and Samsung released the AI-infused, immersive Galaxy XR headset, I got to wear a pair of display glasses that can do almost the same thing? Xreal’s Project Aura, a glasses-sized alternative to bulky headsets, is ready to slide into your jacket pocket next year.

    Google announced its Android XR intentions a year ago, promising a return to AR and VR fueled by a lot of Gemini AI in a range of product forms from VR headsets to glasses. The Samsung Galaxy XR, a mixed reality headset similar to Apple Vision Pro, was the first Android XR product released, and these Xreal glasses could be the next.

    Project Aura wasn’t the only thing Google showed me. I also got to try on the latest iteration of Google’s upcoming competitors to Meta Ray-Ban smart glasses. They’re coming next year from eyewear partners Warby Parker and Gentle Monster, and they’ll work with Google’s watches. And Samsung’s Galaxy XR headset added more features: a Windows PC connection app, and photo-real avatars in beta called Likenesses that are similar to Apple’s Vision Pro Personas.

    Google’s trumpeting that it’s serious again about glasses, just as Meta is ramping up its efforts and Apple could be around the corner with glasses of its own. While you may not even know if you want to wear smart glasses yet, Google’s taking a multi-product approach that makes a lot of sense now that I’ve done the latest demos. It’s the way these new glasses work with phones, apps and even watches that makes me the most interested.

    Project Aura: Xreal glasses with a whole range of Android apps

    Sitting on a sofa with Project Aura on my face, the prototype glasses immediately felt like VR shrunken to a far smaller form. I launched a computer window wirelessly, streamed from a nearby PC, and controlled it with my hands. I laid out multiple apps. I circled a lamp in the room with a gesture, which caused a Google search. And I launched and played Demeo, a VR game, using my hands in the air.

    The most astonishing part to me? All this was possible with just a pair of glasses, even if they were tethered to a phone-sized processor puck. They also had a 70-degree field of view. Yes, that’s smaller than what I see with VR headsets, but honestly more than enough to experience immersion. It felt like my VR and AR worlds were colliding.

    Project Aura uses an adapted form of Xreal’s existing display glasses. The puck contains the same Qualcomm Snapdragon XR2+ Gen 2 chipset used in the Galaxy XR. Using Aura, wandering around the room, was the closest I’ve seen to full AR glasses outside of Meta’s Orion demo a year ago, or Snap’s bulkier Spectacles. But unlike Orion, Aura’s being released as a real product next year, at a price that should be lower than Vision Pro or Galaxy XR. (Snap’s next version of Spectacles is coming next year, too.)

    Also unlike Orion, Project Aura doesn’t have full transparent lenses. Instead, it bounces its displays down from the top of the glasses, giving an AR effect but with some extra prism-like lens chunks in between (much like Xreal’s One Pro glasses).

    Display glasses grow cameras, a puck and Gemini AI

    Xreal already has a whole lineup of tethered display glasses that work like headphones for your eyes. Glasses in this category have existed for years, acting as plug-in monitors for phones, laptops and game handhelds.

    Aura’s difference is that it also adds three cameras that can do full-room tracking and hand tracking, plus take photos and videos that can be recognized by Gemini’s AI. Inside, there’s a larger and higher-resolution micro OLED display that’s better than what I saw on Xreal’s existing glasses line.

    Project Aura really does look like the rest of Xreal’s glasses, and it can work like them, too, plugging into laptops and phones. But the processing puck gives it extra power. Google’s team told me the Qualcomm chip (the Snapdragon XR2+ Gen 2) can run all of the apps that the Galaxy XR headset can, and I tried a few demos that backed up those claims.

    First, I ran through the same setup demo I tried on Galaxy XR, where I pinched floating 3D cubes that hung in the air, using my fingers. Aura’s hand tracking overlaid onto my own hands, and everything felt like augmented reality floating in the room in front of me.

    I also used a new PC Connect app to wirelessly hook into a nearby laptop, casting the Windows monitor in front of me. Xreal’s glasses, and others, can do this already when tethered with a USB-C cable. But the PC Connect mode also lets me use hand tracking to point and click on apps and control the Windows screen, something Apple still can’t do with Macs via Vision Pro. I was able to launch Android XR apps side by side, too, like a YouTube video window.

    I even played a bit of Demeo, a Dungeons and Dragons-like tabletop game made for VR. It ran on the glasses, projecting the board in the room in front of me, and I used my hands to zoom in and control. Game cards even sprouted from my hand when I looked at them, just like in the AR versions of Demeo on Vision Pro and Galaxy XR. It was damn impressive.

    My demo wasn’t perfect: sometimes the room tracking slipped a bit, drifting off before re-centering. But this early demo showed me what could be done in hardware so small. Xreal’s glasses lean on their own custom chip that can process input from three cameras at once and help with hand tracking, doing all the things VR headsets usually do. Aura doesn’t have eye tracking, but the hand tracking was enough for me to point and click in apps and do everything I needed to on the fly.

    I even walked across the room, pinched my fingers to invoke Google’s Circle to Search, and drew a line around a floor lamp in the room. I saw instant Google results pop up in front of me on where to buy it. Much like the Galaxy XR headset, these glasses can circle-search my world, too.

    In a lot of ways, Aura reminded me of the promises of glasses-connected computing that I saw with Spacetop earlier this year, but Aura can go further. It handles flat displays, and it’s also 3D. It’s really, truly an AR device.

    A headset meant to come with you

    I used prescription lens inserts to wear Aura, much like I do with Xreal’s existing glasses line. They worked well, and everything looked great. But these aren’t all-day glasses. They’re meant to be used on the go, like a work device, and then folded away. But considering all they could be capable of, they seem like a far better proposition than lugging a bigger mixed reality VR headset.

    According to Google and Xreal, this will be an actual product going on sale next year, not a development kit. Much like Samsung Galaxy XR was previously called “Project Moohan,” Project Aura should get a proper name next year, and a price.

    Chi Xu, Xreal’s CEO and founder, tells me that these are very much a stepping stone as well as a doorway to wireless glasses in the future. They’ll also be the first Google-partnered glasses-form devices that will run full Android apps.

    “We’re not trying to solve the all-day wearable capability,” says Xu. “But you’re going to find that you can totally use this for hours.”

    Xu says Google’s working on standalone all-day glasses that will eventually aim for what Aura can already do, but in the meantime, this is building the pieces on how it’ll work with phones.

    “We really want to polish this experience first,” he said. “And once we really have a great experience with the [processing] puck, with glasses, the next question will be, wow, when can we replace the puck with our phone?”

    Also on deck: Google’s next wave of fashion glasses, with Maps, Uber, watch support and more

    I also tried on Google’s Ray-Ban-like smart glasses again, which are nearing release next year too. The Samsung, Qualcomm and Google co-developed hardware will be arriving via Warby Parker and Gentle Monster, in versions with and without displays. According to Google, the release of the glasses will be a bit staggered in 2026, possibly with the non-display models coming first.

    It’s also a lot clearer how they’ll work. I tried a few app demos that show how the glasses will connect with phones. These glasses won’t run apps, necessarily, but will show rich notifications pulled from Android phones, along with hook-ins that feel like apps on demand. For instance, I asked for Google Maps directions and saw turn-by-turn instructions float in front of me. When I tilted my head down, I saw a map on the floor that turned as I turned, showing my location.

    An Uber demo showed how heads-up info could appear on the glasses, and I could look down to see walking directions to my pickup spot.

    Google’s glasses will support a wide range of prescriptions, I was promised, unlike Meta’s limited prescription support for its Display glasses this fall.

    “Suffice it to say, we take it pretty seriously that these are glasses first and foremost,” says Juston Payne, Google’s director of product management for XR.

    The glasses will also work with iPhones, but via the Google app. On Android phones, however, Gemini will hook into the full OS to do a lot more, feeling like an extension in a similar way to how Pixel earbuds and Android watches already do.

    The glasses will also work with watches, Payne confirmed, supporting a limited set of gestures and taps. They’ll work as optional control accessories, almost like Meta’s neural band for its glasses, but Google’s glasses could also send things onto watch screens.

    “If you have the glasses with no display, and you take a picture, you can look at your wrist and see what that picture looks like,” says Payne. “You’re talking to Gemini, Gemini has some visual response, look at your wrist.”

    But the display-enabled glasses will let you see more, on a bigger screen, and not need you to look at your watch at all.

    Google’s display-enabled glasses can work with head gestures to make different things appear when you look up or down, like my Maps and Uber demos showed me. They’ll also let you watch YouTube videos: I saw a quick clip as a demo. The Micro LED color display, made by Raxium, was viewable, but in a smallish virtual window, one that’s not as vibrant as a regular phone screen (and semi-transparent). But it’s good enough for catching a quick social clip, or maybe heads-up instructions. The display area’s large enough, too, to make a video call via Google Meet: it showed me a Google rep on-screen while I shared my video view of a bookshelf in front of me, with that thumbnail in a smaller sub-screen next to it.

    The glasses also look smaller than Meta’s Ray-Ban Displays, more similar to the display-free Meta Ray-Bans…and lightweight, too. Although the glasses I tried are just developer hardware prototypes, I’d still wear them around.

    I also got a peek at a dual-screen pair that, while not arriving in 2026, could come in 2027. I watched 3D video clips on it, also via a color Micro LED display. The glasses weren’t much bigger in size, even with the dual displays onboard.

    Samsung Galaxy XR still getting updates, too: photo-real avatars, PC connection

    Where does this all leave Google and Samsung’s just-released VR/AR Galaxy XR headset? Good question. While Google and Samsung are clearly glasses-focused, the messaging is that Galaxy XR still represents the full feature set the companies are aiming for. It’ll remain the testbed for apps and deeper AI, even if it might not be the thing most people end up wearing.

    I got to look at a few new Galaxy XR features, too, which are arriving now. One’s a PC-connecting app called Connect that bridges a wireless link to Windows PCs and supports hand tracking to control apps and windows. A new, more photoreal set of avatars, called Likenesses, is rolling out in beta too. They resemble Apple’s sometimes-uncanny Persona avatars, although Google and Samsung still haven’t let theirs move outside of chat windows.

    I also saw peeks at new 3D autoconverting tools using AI: in Maps, photos of places can now look nearly 3D-scanned using Gaussian splats, a trick that’ll be moving into photos soon (it looks a lot more immersive than the 3D conversions that already exist). Google’s auto-converting YouTube videos to 3D, too, but at lower frame rates.

    I’m more interested in what the glasses bring. And I’m sure others will be, too. But right now, Samsung Galaxy XR is Google’s only product that’s out. At least, until sometime in 2026 when the glasses floodgates open in a whole bunch of ways.
