If you haven't heard, AI now has eyes, and Meta has announced some enhancements for its Ray-Ban Meta glasses. You can now customize Meta AI on the smart glasses to give detailed responses based on what's in your surrounding environment, Meta said in a blog post for Global Accessibility Awareness Day.
Artificial intelligence is opening a whole new world for accessibility, with new features coming out in droves. Tech giants like Google, Apple and Meta are putting forth a ton of effort to create a world where people with disabilities, such as low or no vision, can more easily interact with the world around them.
While Live AI for the Meta glasses has been around, the additional enhancements for low vision users will undoubtedly be welcomed.
Below are some of the other highlights from Meta's accessibility-focused blog post. For more, check out the glimpse of the brain-computer interface accessibility features headed to Apple devices.
'Call a Volunteer' feature expanding to 18 countries
While not an AI-focused feature, Call a Volunteer, a collaboration between Meta and Be My Eyes, will soon expand to all 18 countries where Meta AI is currently available. Launched in November 2024 in the US, Canada, the UK, Ireland and Australia, Call a Volunteer is a very handy (and hands-free) feature for low vision users, and the expansion will bring it to a much wider audience.
Once set up, a Meta glasses user can simply ask the AI to "Be My Eyes." From there, you'll be connected to one of over 8 million volunteers, who will be able to view the live camera stream from your glasses and provide real-time assistance with whatever you need help with. The feature will be available in all supported countries later this month.
Meta's additional research and accessibility features
Meta also detailed some of its existing features and ongoing research aimed at expanding accessibility across its products, especially in the extended reality space.
- Features like live captions and live speech are currently available across Quest headsets, the Meta Horizon platform and Horizon Worlds.
- Also shown was a WhatsApp chatbot from Sign-Speaks that uses the Sign-Speaks API and Meta's Llama AI models. The chatbot provides live translation of American Sign Language to text and vice versa, making communication easier for deaf and hard of hearing individuals.
For more, don’t miss the handful of new accessibility features announced by Apple.