Meta’s AI-powered Ray-Bans are life-enhancing for the blind


Pomeroy, 49 years old, recently bought a pair of Ray-Ban Meta smart glasses, and now a voice inside them does all of those things for her and more—from helping her get dressed and prepare meals to sorting laundry and reading books to her 3-year-old granddaughter. She also frequently uses the glasses to get the time of day and current temperature outside the couple’s Warsaw, Ind., home.

“I depend on them so much,” said Pomeroy, adding that she finds herself saying “Hey Meta”—the prompt used to access the glasses’ AI assistant—even when they aren’t on her face.

Facebook parent Meta Platforms didn’t design its souped-up glasses specifically for the visually impaired. But a growing group of blind users like Pomeroy are finding the devices to be more of a life-enhancing tool than a cool accessory for gadget geeks.

“It’s giving significant accessibility benefits at a price point people can afford,” said Jonathan Mosen, an executive director at the National Federation of the Blind, a nonprofit in Baltimore. “We would like to see Meta continue to invest in this.”

Meta’s glasses initially struggled to gain traction with users but have since grown in popularity. They are an important part of the company’s hardware strategy and were promoted in a commercial during the Super Bowl. Still, Meta’s Reality Labs unit, which oversees the product as well as its virtual- and augmented-reality goggles, reported losses of nearly $5 billion in the fourth quarter.

At a recent employee meeting, Meta CEO Mark Zuckerberg said the company sold more than one million units of its Ray-Bans in 2024, according to a recording of the meeting reviewed by The Wall Street Journal. Research firm IDC says Meta leads the U.S. smart-glasses market, followed by Amazon.com.

Made in collaboration with Italian eyewear giant EssilorLuxottica, Meta’s Ray-Bans come with a camera, microphones and an artificial-intelligence assistant capable of identifying objects, answering questions and more. They retail for around $300—slightly more than the cost of most other smart glasses meant for the general public—and they look ordinary, an appealing characteristic for folks who don’t want to draw attention to a disability.

Some critics say blind users face risks in relying on smart glasses that weren't intended to be a health product. John-Ross Rizzo, a professor at New York University's School of Medicine who is blind and whose research covers assistive technology, said more studies of off-the-shelf gadgets are needed to ensure they are safe for such uses.

About seven million people have impaired vision in the U.S., including one million who are blind, according to the Centers for Disease Control and Prevention.

Mosen, the nonprofit executive, is blind and owns a pair of Meta’s glasses. He has used them a few times to record video of ride-share drivers refusing to give him and his wife a ride because she travels with a guide dog. Denying rides to people with service animals is illegal in many countries, including the U.S.

Another concern for blind users is that AI assistants in general are prone to making errors, or so-called hallucinations, which may not be apparent. Aaron Preece, who is blind and editor in chief of American Foundation for the Blind’s AccessWorld magazine, said Meta’s glasses recently failed to correctly read the number on the door to his home.

“I just can’t trust it,” he said. “It’s more of a novelty than something I’d use on a day-to-day basis.”

Leaning on any kind of general-purpose technology for health-related support can be dangerous, warned NYU's Rizzo. For example, blind people wearing smart glasses might become distracted while listening to an AI agent's feedback, taxing their cognitive reserve and increasing their susceptibility to accidents, he said.

“If we’re using these as medical devices, we really need to understand more,” said Rizzo.

Freddy Abnousi, vice president of health technology at Meta, said the company tries to build products with accessibility in mind, though it didn't anticipate the impact its Ray-Bans would have on visually impaired people. Once he and other employees started hearing from such users and their supporters, accessibility use cases became increasingly apparent.

One caller stood out: Mike Buckley, a former Facebook executive who is now CEO of Be My Eyes, a free smartphone app that connects people with poor vision through video calls to sighted volunteer helpers. It has more than 770,000 users.

Buckley pointed out that Meta's Ray-Bans could be more helpful to the visually impaired if they supported Be My Eyes, because users would no longer need to hold up their phones to show what is in front of them. Software engineers from both companies worked together to integrate the app, and they recruited hundreds of blind and low-vision users for beta testing. The app launched on the glasses in November.

David Tatel, a retired federal judge in Castleton, Va., is 82 and has been blind for more than half his life. After Tatel bought Meta’s glasses, his guide dog lost a light that had been attached to her collar. While holding her leash in one hand, he used voice commands to place a video call to a Be My Eyes volunteer, who spotted it. Tatel was then able to grab the light with his free hand. He described the combination of the glasses and the app as a “double game-changer.”

There are a number of tech products designed specifically for the blind, such as Victor Reader Stream, a $550 device that reads aloud books, newspapers and more. Another is Dot Pad, which lets users interact with visual information through touch and is listed for sale by some U.S. distributors for around $12,000. Envision, a brand of smart glasses with software for blind and low-vision users, costs between $1,899 and $3,499.

Orlando Johnson, 49, lost his vision a decade ago and said he couldn’t find any smart glasses that met his budget and needs until he learned about Meta’s last year.

Before buying Meta’s glasses, Johnson, who lives in Las Vegas, said he was afraid to go for a walk alone, fearful of getting lost or hit by a vehicle. That fear is now gone, because the glasses can identify or describe most objects within the camera’s view, down to details such as the difference between diet and regular soda.

“It’s been life-changing,” he said.

Write to Sarah E. Needleman at Sarah.Needleman@wsj.com


