    AI Is Going to Wrap Itself Around You, From Your Glasses to Your Car

    Qualcomm’s Chief Marketing Officer Don McGuire and I are sitting inside a giant AI machine — the electric, sensor-packed 2025 Mercedes GLC. He’s telling me that cars will become "digital living spaces," and I can see what he means. If I had to pick one car to live in, it would definitely be this one.

    The car is a showcase of Mercedes’ partnership with Qualcomm, which has contributed its Snapdragon Digital Chassis platform to the car in order to create an immersive cockpit capable of AI-driven voice interactions. We sit back in our luxurious leather seats and watch a brief recap of McGuire speaking on stage at the company’s Snapdragon Summit via YouTube on the GLC’s giant infotainment screen. "Yes — I could quite happily hang out in here all day," I think.

    While we enjoy this digital living space on wheels stationed outside of the Summit conference halls on Maui, McGuire explains to me how the car — just like our phones, and just like the smart glasses and watches and rings we’re increasingly wearing — is set to become part of a personal ecosystem of ambient AI.

    As the company that makes the chips inside everything from top Android phones to laptops, wearables and, yes, cars, Qualcomm is thinking several years down the line when it comes to AI. It’s been at the forefront of enabling AI agents that can handle complex tasks, taking the initiative to suggest, predict and act on our behalf. Putting these agents inside cars, the thinking goes, would ease the burden on us, turning our vehicles into interactive havens of productivity, fun and relaxation.

    "We can’t think of a more hands-free, natural-language, voice-interactive, agentic experience than a vehicle," McGuire says.

    Whether you’re driving, sitting in traffic, waiting for school pickup or just having a moment of downtime in your car, the combination of multiple screens, cameras and microphones means you can interact both with things inside and outside of the car, he adds.

    You could request that an AI agent rearrange your schedule based on traffic predictions or ask it questions about a restaurant you see and allow it to book you a spot for your next date night if the reviews are good, for example.

    When your AI car becomes your AI glasses

    I’m interested to understand how exactly the car will seamlessly fit into the burgeoning ecosystem of AI-enabled devices. I ask McGuire how he envisions the car that we’re in will interact with another AI-driven piece of tech, such as the Oakley Meta smart glasses he’s sporting.

    "We’ve had a little bit of a debate on this," he tells me. His feeling is that if you’re walking down the street using your glasses to engage with an AI agent and then get into your car, the most obvious thing is for the car to take over that agentic experience from the glasses, as it has all the sensors and cameras needed to understand everything going on around you.

    "What we don’t want is confusion between the two, and I think the safer bet is to take the glasses off so you avoid distraction and you’re fully immersed in the driving experience," he says. "It’s probably a safer, more intuitive experience if the car becomes your glasses."

    As with so many existing pieces of technology, AI does seem to be breathing new life into cars — giving us fresh ways to interact with them and elevating them beyond machines that get us from A to B.

    One example that particularly impressed McGuire is the way BMW, in partnership with Qualcomm, has integrated symbiotic drive into the iX3, as announced earlier this month. The idea, he says, is "that driver’s assistance is not really linear, or it’s not really a stop-start, but it should be more fluid and it should move with you."

    If you need to take your hands off the steering wheel for a moment to take a bite of your burger or swat away a pesky insect, the car can take over on the fly and then hand control back to you when you’re fully back at the wheel.

    AI is breathing new life into the familiar

    With familiar products like glasses and cars evolving to take on more complex roles in our lives, I ask McGuire how we should prepare for our devices to change. Not so long ago, he says, everything was a peripheral, with the phone at the center.

    "Now those peripherals themselves are becoming smarter, and they’re gonna have capabilities to do things on their own, whether they’re still tethered or whether they’re not tethered," he adds.

    Headphones are another example of a product that once had a single use — to listen to audio — and are now, with the addition of Snapdragon Wear chips, gaining new skills and capabilities, including as conduits for interacting with AI. As the chips improve, more capabilities will be added allowing for more standalone experiences, says McGuire. "It gives these devices that were maybe unilaterally good for one function new life."

    He’s also excited by what could eventually be possible for AI devices. Like with cars and wearables, it will be driven by sensors, he says: "AI is going to be ambient in a lot of ways." It might not even be called a "device" if it’s something woven into your clothing or worn on your person, he posits.

    "There’s lots of ideas out there floating around," he says. "You’ve got OpenAI and Jony Ive working on stuff. You’ve got others."

    Glasses, while still at a nascent stage, will be a quickly growing product category, especially off the back of Meta’s success, he says. But McGuire still thinks there’s something beyond that will deliver on the promise of personal and ambient AI.

    "The phone’s still the phone, the watch is still the watch," he says, "but what is that thing that’s going to be next that creates a whole new scenario for you as you’re moving through your day and you happen to not have your phone with you?"

    Qualcomm’s role in all of this is to push the boundaries of technology and build the platform for what will be possible, he adds. The company then works with partners to bring those platforms to life through the devices we all know and love now and those we will know and love in the future.

    "Oftentimes we do reference designs to just give a flavor," says McGuire. "Seeing is believing, for some people to spur that creativity. And then sometimes people bring ideas to us and then we help elaborate on those ideas."

    Mastering the AI learning curve

    Future-facing concepts, especially where AI is involved, can sometimes feel a little too nebulous and overwhelming for people to wrap their heads around, I point out. McGuire acknowledges that there will be an adoption curve that will depend on the experience of using new technology being easy, fun and genuinely useful to people.

    "The more you make it natural, the more you make it fluid and the more you make it personal and safe and private for the person… you’re going to reduce the barriers, which then drives the willingness to try," he says.

    Those using Meta’s glasses tend to enjoy the convenience and practicality of being able to listen to Spotify without headphones and capture pictures without getting their phones out, he adds. AI, he expects, will follow the same curve.

    How you feel about AI probably differs based on where you are in the world, says McGuire. He fears there are often misunderstandings about its different manifestations — from the personal (agentic, on-device experiences), to the physical (robotics), to enterprise and industry.

    "AI is not just one thing," he says. "The closer it is to the human where the data is actually generated, the more personal it can be, the more private it can be and the faster it can be."

    It’s an optimistic image — one in which AI not only serves us but impresses us.

    I consider the car we’re in and imagine how it would feel to hand over to AI the many burdens and anxieties I often experience while driving: timing; scheduling; weather conditions; pedestrian safety; cyclists; finding a podcast to listen to; wondering where I can stop for a decent coffee; remembering I haven’t replied to an important message; realizing I never made that reservation; fearing I’ll forget all of this by the time I get home.

    I can envisage the feeling of relaxation that would come with driving a luxury SUV that could anticipate and assist me with my every whim. The barrier falls. I can confidently say I’m willing to try.
