Meta revealed a brand-new model of AI-powered smart glasses this week, with CEO Mark Zuckerberg saying they represent the “ideal form of superintelligence” – the point at which a computer or device becomes more intelligent than humans.
Combining elements of both artificial intelligence and virtual reality into one wearable device, the Meta Ray-Ban Display glasses come with lofty promises, and many have been impressed with their features.
But at the product launch, some of the features still “appeared unfinished”, said The New York Times. Wearing the new glasses, Zuckerberg asked them to provide a recipe for barbecue sauce and call a colleague. The glasses failed to do either. “They tell us not to do live demos,” Zuckerberg said to the crowd just after the slip-up.
With this new model, though, Meta is “raising its bets on eyewear”, said the NYT. Previously, the company advertised smart glasses and VR headsets as options for people to explore the “metaverse”, an online world Zuckerberg “has called the future of the internet”.
This time, the glasses are designed to help wearers carry out everyday tasks, from following a recipe to taking photos, without needing to pick up a smartphone. They are set to launch on 30 September in the US and early next year in the UK, priced at $799 (£586).
What can the new smart glasses do?
Fitted with a built-in screen that’s “nigh impossible for people around you to see” and controlled by a wristband – which reads “signals from your muscles so that you can control the display with gestures” – the glasses function like a “pop-up extension” of an iPhone, said Victoria Song in The Verge.
The glasses connect directly to Meta AI, allowing wearers to generate answers to questions based on what they can see and hear. Users can take photos, scroll through Instagram, respond to text messages, and follow map directions using subtle hand movements. The glasses also provide live captions to real-life conversations, switching from speaker to speaker as the user turns their head.
What does Zuckerberg mean by ‘superintelligence’?
Zuckerberg described the new model as “the world’s first mainstream neural interface”, and said people without AI-powered glasses will likely be “at a pretty significant cognitive disadvantage” compared to those who embrace the technology. He believes that the glasses’ ability to “see what you see, hear what you hear, and then go off and think about it” will push Meta closer to achieving “superintelligent” technology, said the NYT.
The launch is just one element of Meta’s all-in focus on AI. This summer, Zuckerberg “personally approached dozens of top AI researchers” from competitors like OpenAI and Google, offering millions in sign-on bonuses, said the Financial Times. The company has also reorganised its AI team four times in the last six months, most recently dubbing it the “Meta Superintelligence Lab”.
How has the launch been received?
Even though the new model delivers new features, “there’s a reason why the phrase ‘glassholes’ exists”, said Jason England in Tom’s Guide. The “social stigma” around wearing smart glasses has meant they haven’t caught on as widely as smartphones.
But “consumer smart glasses might really take off” now, and “not just because Meta’s execution is excellent”, said Song. The new model offers many more potential uses, appealing to people beyond the staunchest tech enthusiasts. This might be “the closest we’ve ever gotten to what Google Glass promised over 10 years ago”, she said.