
The next big thing is still… Smart Glasses

Last week, Mark Zuckerberg stood on a stage in California holding a pair of thick black glasses. Wearing a loose T-shirt with a Latin inscription that seemed to compare him to Julius Caesar (aut Zuck aut nihil), he made a bold declaration: This is Orion, "the most advanced glasses the world has ever seen."

These glasses, still a prototype, let users make video calls, watch movies, and play games in augmented reality, where digital images are superimposed on the real world. Demo videos at Meta Connect, the company's annual conference, showed people playing Pong on the glasses, their hands acting as paddles, and using the glasses to project a TV screen onto an otherwise blank wall. "A lot of people said it was the craziest technology they had ever seen," Zuckerberg said. And while you won't be able to buy these glasses any time soon, Meta is selling much simpler products in the meantime: a new Quest headset and a series of software updates for the company's smart Ray-Bans, which have cameras and an AI audio assistant on board but no screens in the lenses.

Orion appears to be an attempt to combine these two devices, providing a fully immersive computing experience in a form that people could comfortably wear on their faces. And it is not, as you may have noticed, the only smart-glasses product to appear in recent months. Amazon, Google, Apple, and Snap are all either officially working on some version of this technology or rumored to be. Their implementations vary slightly, but they point to a single idea: the future means weaving computing more seamlessly into everyday life.

Smartphones are no longer exciting, and the market for them has been shrinking for several years. The main new idea is the foldable screen, which effectively lets you turn your phone into a tablet, although tablet sales have fallen too. Virtual-reality headsets, which companies have spent billions developing, are not widely used.

These companies are betting that people want to check the weather without taking out their smartphones, and that they are more willing to wear Ray-Bans with a camera than to spend hours in the metaverse. And after years of false starts on the glasses front, they're betting that artificial intelligence, despite a few high-profile flops, will finally help them realize that vision.


Tech companies have been working on smart frames for decades. The first consumer smart glasses began to appear in the late 1980s and 1990s, but none took off. Then, in 2013, Google released its infamous Glass. With a thin metal frame housing a camera and a small screen above one eye, Glass could be used to check email, take photos, and get directions. It was advanced for its time, but the public was unnerved by the thought of face-mounted cameras constantly watching them. In 2015, Google abandoned the idea that Glass could ever become a consumer product, although the frames survived as an enterprise device until last year.

The Glass debacle did not deter other companies from trying. In 2016, Snapchat launched the first generation of Spectacles, glasses that let users take photos and videos with cameras mounted above each eye and post them to their account. In 2019, Amazon stepped in with Echo Frames, cameraless smart glasses with Alexa built in, which went on sale to the public the following year. Meta, then called Facebook, launched the first installment of its collaboration with Ray-Ban in 2021, although those frames did not catch on.

There are also virtual reality headsets, such as Meta’s Quest line. Last summer, following the announcement of Apple’s Vision Pro, my colleague Ian Bogost hailed this period as the “era of the headset,” pointing out that companies are spending billions developing immersive technology even though the exact purpose of these expensive headsets is unclear.

Consumers also seem to wonder what the purpose is. One analyst reports that sales of the Vision Pro were so low that Apple cut production. According to The Information, the company has stopped work on the next model, while Meta has reportedly scrapped its competing device altogether.

In a way, this smart-glasses moment is something of a retreat: an admission that people are less likely to go all in on virtual reality than to put on sunglasses that happen to be able to record video. These devices are designed to look and feel more natural while offering ambient features, such as the ability to play music or start a phone call anywhere just by speaking, without putting on headphones.

Artificial intelligence is a big part of this offering. New advances in large language models are making modern chatbots seem smarter and more conversational, and the technology is already making its way into glasses. Both the Meta and Amazon frames have built-in audio assistants that can answer questions (How do whales breathe?) and turn on music (play "Teenage Garbage Man"). Meta's Ray-Bans can "see" through their cameras, offering audio descriptions of whatever is in their field of view. (In my experience, accuracy is hit or miss: when I asked the audio assistant to find a book of poetry on my shelf, it replied that there wasn't one, overlooking an anthology with the word poetry in its title, although it did identify my copy of Joseph Rodota's The Watergate when I asked it to find a book about a Washington landmark.) At Connect, Zuckerberg said the company plans to keep improving its AI, with several major releases coming in the next few months. With these updates, the glasses will be able to perform real-time translation and scan QR codes and phone numbers on flyers in front of you. Artificial intelligence will also, he added, be able to "remember" things such as where you parked your car. One demo showed a woman searching through her closet and asking an AI assistant for help choosing an outfit for a themed party.

Whether AI assistants will actually be smart enough to pull this off, however, is still an open question. Generally speaking, generative AI has difficulty citing its sources and often gets things wrong, which could limit the overall usefulness of smart glasses. And although companies insist the technology will get better, that is not a given: The Wall Street Journal recently reported that as Amazon has tried to infuse Alexa with new large language models, the assistant has actually become less reliable for certain tasks.

Products like Orion, which promise not only AI capabilities but full, seamless integration of the digital world with physical reality, face even greater challenges. It is genuinely hard to cram so much capability into glasses that look semi-normal: batteries, cameras, speakers, and processing chips all have to fit into one device. Even today's most advanced glasses require a connection to additional hardware to work. According to The Verge's Alex Heath, the Orion glasses require a wireless "compute puck" that can be no more than about 10 feet away from them, something Zuckerberg notably did not mention on stage. Snap's newest glasses, unveiled earlier this month, require no additional hardware, but they have a battery life of just 45 minutes and still look large and bulky. This hardware problem has plagued generations of smart glasses, and no effective solution has yet been found.


But perhaps the biggest challenge facing this generation of smart glasses is neither hardware nor software. It's philosophical. People today are anxious about how deeply technology has penetrated our daily interactions. They feel addicted to their phones. These companies are pitching smart glasses as a solution, proposing that they could, for example, let you send a text message without disrupting time spent with a young child. "Instead of pulling out your phone, it's just going to be a little hologram," Zuckerberg said of Orion during his presentation. "And with a few subtle gestures, you can respond without breaking from the moment."

But engaging in a world where devices are worn on our faces means engaging in a world where we can always be at least a little distracted. We could use them to silently read our emails or browse Instagram at a restaurant without our partner knowing. We could check our messages during a meeting and look like we were still paying attention. We may not need to check our phones as often because they will be effectively connected to our eyeballs. Smart glasses walk a fine line between helping us use the internet less obsessively and connecting us even more to it.

I spent some time this spring talking to many of the people who worked on early smart glasses. One of them was Babak Parviz, a venture partner at Madrona who previously led the Google Glass project. We discussed the history of computers: they began as bulky machines found in research settings, then came laptops, then smartphones. With Glass, the team aimed to shrink the time needed to retrieve a piece of information down to seconds. "The question is: How far do you have to take it? Do you really need to be immersed in information all the time and have access to it that much faster?" Parviz told me that he had changed his mind about what he called "information snacking," or feeding yourself small portions of information throughout the day. "I don't think it's very healthy to constantly disrupt our regular flow by tapping into information sources."

In interviews, I asked experts whether they thought smart glasses were inevitable and what it would take for them to displace the smartphone. Some saw glasses not as a smartphone replacement at all, but as a potential accessory. Broadly, they thought new hardware would have to let us do something we can't do today. Right now, companies are hoping that artificial intelligence will be what unlocks that potential. But as with most of the broader conversation about the technology, it's unclear how much of the hype will actually pan out.

These devices still feel more like sketches of what could be than fully realized products. The Ray-Bans and their peers can be fun and occasionally useful, but they still stumble. And while we may be closer than ever to mainstream AR glasses, they still seem very far away.

Perhaps Zuckerberg is right that Orion is the most advanced pair of glasses in the world. The question is whether his grand vision for the future is one the rest of us actually want. Glasses could be amazing. Or they could be just another distraction.