Meta’s highly anticipated unveiling of its latest smart glasses at the Connect 2025 event in Menlo Park, California, on September 17 was meant to showcase the tech giant’s leap into the future of wearable AI. Instead, the livestreamed reveal quickly became a viral sensation for all the wrong reasons, as a string of technical glitches left CEO Mark Zuckerberg and his team scrambling on stage—and the internet buzzing with reactions.
From the outset, the event was marked by a series of hiccups that played out in real time for viewers worldwide. During a live cooking demonstration meant to highlight the new AI-powered “Live AI” feature, the system failed to recover after chef Jack Mancuso interrupted the process. The AI began giving instructions that didn’t match the actual ingredients on the table, leaving the chef and audience perplexed. Moments later, a live WhatsApp call failed to connect, and a demo of real-time translation stumbled before finally working on a second attempt. Zuckerberg, confronting the mishaps with a wry grin, told the crowd, “I keep on messing this up,” prompting supportive cheers but little comfort as social media users quickly seized on the spectacle. One YouTube commenter summed up the mood: “Cool!! See you in 10 years when it works : D.”
Despite the setbacks, Meta pressed forward with its showcase, determined to prove that the future of smart glasses had arrived. The star of the show was the new Meta Ray-Ban Display glasses, a sleek pair of frames with a tiny display embedded in the right lens. Unlike previous versions, which relied on audio cues, these glasses offer visual feedback—letting wearers read messages, view photos, scroll through Instagram Reels, and even take video calls, all without glancing down at their phones. The display appears projected several feet in front of the user, visible only to them, and can be turned off for moments of focused conversation or work.
Navigation is handled by the Meta Neural Band, a wrist-mounted controller that lets users swipe through restaurant menus, select items for purchase by pinching their fingers, and interact with apps using subtle hand gestures. Voice controls remain, but now users have a quiet, tactile alternative—ideal for situations like texting discreetly in a movie theater or checking directions while walking through a busy city. The glasses also boast live captioning and translation, displaying real-time conversation text in the corner of the lens. According to CNN, the device aims to help users “spend less time looking down at your phone—thanks to a tiny screen an inch from my eyeball.”
Meta’s ambitions go beyond convenience. As Zuckerberg put it during the keynote, “Glasses are the only form factor where you can let AI see what you see, hear what you hear, talk to you throughout the day… so it is no surprise that AI glasses are taking off.” He believes that those who don’t adopt smart glasses may soon find themselves at “a significant cognitive disadvantage.” Meta’s director of AI glasses, Ankit Brahmbhatt, echoed this sentiment, telling CNN, “We built this product to help protect presence… We designed this to be a glanceable display, so that it’s there for you when you need it, you get in for seconds at a time, you get the information, then it’s kind of out of your way.”
The Ray-Ban Display glasses are priced at $799 and will be available to customers starting September 30, 2025. For athletes and fitness enthusiasts, Meta also revealed the Oakley Meta Vanguard, a sportier model equipped with specialized cameras, louder speakers, and a battery life of up to nine hours. The Vanguard integrates with fitness platforms like Garmin and Strava and is set for release on October 21, with preorders at $499. Zuckerberg and artist Diplo demonstrated the Vanguard’s durability by literally running off stage to the after-party, underscoring its athletic appeal.
Yet, for all the hardware glitz, the event exposed Meta’s Achilles’ heel: software. As CNET reported, the AI features—so central to Meta’s vision—were far from flawless. The “Live AI” assistant stumbled during the cooking demo, and issues with calls and translations left many wondering whether Meta’s AI is ready to compete with the likes of OpenAI and Google. The keynote made it clear that while Meta’s hardware is impressive, its software still lags behind the “superintelligence” Zuckerberg envisions. In response, Meta has invested billions to recruit top AI scientists, hoping to close the gap and deliver on its ambitious promises.
For consumers, the new glasses offer tantalizing possibilities. Imagine walking through a city with directions floating in your field of view, or responding to messages without ever reaching for your phone. The neural wristband allows for subtle, almost invisible interactions, and the display is designed to be used in quick glances—addressing concerns about screen addiction and digital distraction. As CNN noted, “This idea of being more heads up and not having our heads buried in our phones… is really a big part of the kind of experience that we’re trying to unlock here.”
But the promise of ever-present AI raises old questions about privacy and social norms. The glasses are equipped with an LED indicator to signal when recording is active, a nod to concerns that date back to the days of Google Glass, when wearers were sometimes dubbed “glassholes” for their perceived invasiveness. Brahmbhatt emphasized that “building responsibly” is a focus for Meta, acknowledging the need for public education about the device’s safety features. Still, some remain uneasy about the prospect of being recorded—or simply ignored—by someone wearing a computer on their face.
Meta’s move into smart glasses is also a strategic play in the broader tech landscape. The company is vying to outpace rivals like Apple, whose Vision Pro headset retails for a hefty $3,499, and to establish itself as the leader in AI-driven wearables. DIGITIMES Asia projects that Meta expects to sell over 100,000 units of the new display glasses by year’s end—a significant milestone, if achieved, for a product that only a decade ago seemed like science fiction.
As the dust settles from the Connect 2025 event, one thing is clear: Meta’s smart glasses are both a technological marvel and a work in progress. The hardware dazzles, the software sometimes falters, and the public remains divided—some excited, some skeptical, and many simply curious about what comes next. The real test will come when the glasses hit store shelves on September 30, and everyday users—not just tech reviewers and executives—decide whether Meta’s vision of augmented reality is one they want to see through their own eyes.