Mark Zuckerberg mostly uses the new Meta Ray-Ban Display glasses to send text messages. Lots of them.
He has been wearing the glasses around the office, firing off WhatsApp pings to his execs throughout the day. “I run the company through text messages,” he told me recently.
“Mark is our number one heaviest user,” Alex Himel, the company’s head of wearables, confirms. Zuckerberg is known for sending lengthy, multi-paragraph missives via text. But when he’s typing on the glasses, Himel can tell, because the messages arrive faster and are much shorter.
Zuckerberg claims he’s already at about 30 words per minute. That’s impressive considering how the glasses work. The heads-up display isn’t new; Google Glass tried it more than a decade ago. What’s new is the neural wristband Meta built to control the interface and type via subtle gestures. Instead of tracking your hands visually or forcing you to type out into empty air, the band picks up signals from your arm’s muscular nervous system. “You can have your hand by your side, behind your back, in your jacket pocket; it still works,” Zuckerberg says. After trying it, I can confirm he’s right. It feels like science fiction come to life.
“Glasses, I think, are going to be the next computing platform device.”
“Glasses, I think, are going to be the next computing platform device,” says Zuckerberg during our recent conversation, which airs in full on the Access podcast Thursday, September 18th. He argues they’re also the best hardware for AI: “It’s the only device where you can basically let an AI see what you see, hear what you hear, talk to you throughout the day, and then once you get the display, it can just generate a UI in the display for you.”
While Zuckerberg has been advocating for this idea about the next major platform for a while, numbers, not just hype and flashy demos, are now beginning to support his theory. Sales of Meta’s current Ray-Bans have reached the single-digit millions and increased by triple digits from last year. The broader market for tech-enabled eyewear is projected to reach tens of millions soon. Google is releasing AI glasses next year, and Snap has a consumer pair of AR glasses shipping then as well. I expect Apple to launch its own glasses as soon as 2027, the same year that Meta is targeting to launch its much pricier, full-fledged AR glasses.
For Zuckerberg, the prize is huge. “There are between 1 to 2 billion people who wear glasses daily today for vision correction,” he says. “Is there a world where, in five or seven years, the vast majority of those glasses are AI glasses in some capacity? I think that it’s kind of like when the iPhone came out and everyone had flip phones. It’s just a matter of time before they all become smartphones.”
Meta’s CTO Andrew Bosworth recalls how EssilorLuxottica initially thought the display glasses would be too big to sell as Ray-Bans. “Then, last year, we showed it to them. They were like, ‘Oh, you did it. Let’s put Ray-Ban on it.’” They’re still chunky, but less noticeable than the Orion AR prototype Meta showed off last year. With transition lenses, the new display Ray-Bans start at $800 before a prescription. Bosworth says the target customers are “optimizers” and “productivity-focused people.”
Meta isn’t making many of them (reportedly a couple hundred thousand), but Bosworth predicts “we’ll sell all of the ones that we build.” When I ask Zuckerberg about the business potential, he hints that the real margin will come later: “Our profit margin isn’t going to come from a large device profit margin. It’s going to come from people using AI and the other services over time.”
The hardware feels surprisingly refined for a first version. The geometric waveguide display sits to the side of the right lens, clear enough to use in daylight, with a 20-degree field of view and crisp 42 pixels per degree. The neural band lets you pinch to bring up the display and dismiss it again. You can’t see the display from the front at all, even when it’s turned on. The glasses last up to six hours on a charge, with multiple recharges from the case.
The core software still relies on your phone, but it’s more than a notification mirror. You can send texts, take audio or video calls, play what you’re listening to through the speakers in the frame, get turn-by-turn walking directions, see what your camera is capturing, and run Meta AI to recognize what’s in front of you. Bosworth calls crisp text rendering the key to making AI useful. “If the AI has to read it back to you verbally, you’re not getting the most information,” he says. “Whereas you can just ask a question and it shows you the answer. It’s much better. It’s more private, too.”
The long-term bet is that the glasses eventually let you leave your phone behind
While the AI features played a more backseat role in my demo, I did use them to recognize a painting on a wall and generate a table setting out of thin air. I made sure to ask things that were off the script I was given, and the AI still performed as expected. The display also shows AI-suggested prompt follow-ups you can easily select via the neural band.
The most striking demo was live captions. In a loud room, I could look at someone several feet away, and what they were saying appeared in real time in front of me. It feels like super hearing. Language translation is next, with Spanish and a few other languages supported at launch. Meta is also working on a teleprompter feature.
Still, Meta thinks this is just the start. “If you look at the top 10 reasons you take your phone out of your pocket, I think we knocked out five or six of them,” Bosworth says. The long-term bet is that the glasses eventually let you leave your phone behind.
The neural band may be the bigger unlock in the near term. Bosworth admits Meta only committed to it after the Orion AR glasses prototype proved its usefulness last summer. Now, it’s advancing faster than expected. A handwriting mode initially thought to be years away is already working. Zuckerberg envisions it going even further. “It’s basically just an AI machine learning problem. The future version of this is that the motions get really subtle, and you’re effectively just firing muscles against each other and making no visible movement at all.”
In addition to enabling personalized autocomplete via thought, the neural band may also become a way to control other devices or even a smart home, according to Zuckerberg. “We invented the neural band to work with the glasses, but I actually think that the neural band could end up being a platform on its own.”
For now, the first generation of these Ray-Ban Display glasses is clearly a device for early adopters. The AI features are limited, the neural band takes practice, and the software needs polishing. But Zuckerberg seems more convinced than ever that glasses are the future, and after trying his new glasses, it’s hard to disagree. “It’s 2025, we have this incredibly rich digital world, and you access it through this, like, five-inch screen in your pocket,” he says. “I just think it’s a little crazy that we’re here.”
If history repeats, Meta may finally be on the cusp of the new platform Zuckerberg has been dreaming about for years. “The first version of the Ray-Bans, the Ray-Ban Stories, we thought was good,” he says. “Then, when we did the second version, it sold five times more, and it was just refined. I think there’s going to be a similar dynamic here. The first version, you learn from it. The second version is much more polished. And that just compounds and gets better and better.”
This is Sources by Alex Heath, a newsletter about AI and the tech industry, syndicated just for The Verge subscribers once per week.

