Every tech company seems to be flirting with wearable AI gadgets, but the way they take shape is all over the map. Glasses are here with Meta and soon Google, while OpenAI’s hardware project with Jony Ive is still a mystery. Apple, meanwhile, could release glasses too. Or maybe not.
A new report from The Information says Apple’s working on a small pin about the size of an AirTag that would have its own camera and work as an AI wearable. Why? Right now, Apple has no other place to put an outward-facing camera. Cameras are the key ingredient for any AI service that wants to not just respond to your voice but also give you feedback on what you’re seeing in the room.
Would a pin be a stepping-stone to glasses?
Cameras are a big part of why AI glasses are on the rise. It’s relatively easy to put a camera in a glasses frame near your eyes and have it work with AI to recognize objects it sees and offer advice and information. A pin or a pendant provides a way for AI to "see" the world without you having to wear a pair of glasses. Lots of companies have already tried this, or are still trying, with and without cameras.
The way I see it, Apple’s collective wearable hardware ecosystem gives it an advantage over other contenders in AI wearables. Take AirPods, for example.
Multiple reports have said that new AirPods Pro models are coming this year with infrared cameras that can recognize hand gestures. For regular AirPods use, that seems excessive. But for interacting with AI? Maybe it’s just right, especially if gesture controls are coming that could mimic what Meta’s doing with its neural wristband. And combined with a camera-equipped pin, maybe the whole experience could work display-free, with no need for smart glasses at all.
I think I’d just prefer good glasses that did all of this, but that’s the thing: That’s not easy to do right now, and battery life on smart glasses still mostly falls short of a full day. However, Apple might have its sights on smart glasses with onboard displays as a next evolutionary step from what it’s already established with Vision Pro. For that, glasses might have to wait a little longer.
How many Apple products are going to interlink with AI? And when?
In the next few years, gesture-enabled AirPods, gesture-enabled Apple Watches (which already support taps and wrist shakes) and wearables like camera-equipped pins could start laying the groundwork for glasses, or serve as an accompaniment to them. The question is how useful they’ll actually be.
I’m as excited about future tech as anybody you’ll meet, but I don’t want to wear an AI pin. I did it before, and I don’t like the idea of doing it again. Maybe Apple needs to work with its rivals on AI to make its wearables compelling.
Apple and Google’s AI partnership, built off Gemini, points to the AI advances that could be coming to Apple’s products soon. A Mark Gurman report at Bloomberg says Siri’s going to be rebuilt as a generative AI chatbot. Gemini can already handle a lot of live, camera-assisted AI, which Google and Samsung will employ on a new wave of smart glasses this year. Apple could do the same and even work that AI awareness into its Vision Pro headset.
When I see news of an AI pin, I see it as just one piece of a hardware puzzle that includes AirPods and Watches — Apple accessories working together. A pin would be one more thing to carry, sure, but at least it would be cheaper than glasses and bypass any need to figure out prescription support.
Either way, this reported pin isn’t supposed to get here until 2027. Apple may show an early preview this year, much like it has done in the past for products like Vision Pro, Apple Watch and HomePod.
The thing I’m keeping an eye on is Apple’s whole planned AI evolution, which could finally be happening after the disappointing launch of Apple Intelligence. I haven’t seen camera-enabled AI services truly wow me yet on glasses, but the potential is clearly there. Apple looks ready to dive into that game soon, too, whether it’s on our faces, pinned to our shirts or who knows where else. But then Apple still has to figure out what exactly we’d be using this camera-enabled AI for in the first place.
