At Meta’s recent Connect conference, CEO Mark Zuckerberg introduced the company’s new AI-powered smart glasses to high expectations. But two live demonstrations went awry on stage, leaving the audience with more questions than confidence about whether the product is ready for prime time.
What Happened on Stage
- Zuckerberg showcased the new “Meta Ray-Ban Display” glasses alongside a neural wristband that detects hand gestures for control. The glasses are meant to offer voice- and gesture-based interaction, live translation, messaging, and even guided assistance.
- The first demo featured influencer Jack Mancuso trying to use the glasses’ AI assistant to follow a cooking recipe. Things broke down quickly: the AI repeatedly ignored his simple question, “What do I do first?”, and offered odd or irrelevant responses instead.
- In the second demo, Zuckerberg attempted a live video call using the wristband-and-glasses combination. Despite several attempts, the call never visibly connected on the glasses: viewers heard ringing but saw no incoming-call screen. After multiple tries, Zuckerberg conceded on stage, “I don’t know what happened.”
Meta’s Response and Context
- After the event, Meta’s CTO acknowledged the failures, attributing one of them to overloaded internal systems. During the cooking demo, the AI assistant was triggered on every pair of glasses in the audience, creating a self-inflicted traffic jam that kept the demo device from responding properly.
- The issues appear to stem not from the underlying hardware or features (which Meta insists are functional) but from the live environment: network conditions, simultaneous device usage, and software bugs that didn’t surface during rehearsals.
Implications & Reactions
- The live demo failures were embarrassing, and in live tech presentations such moments tend to stick to the product’s narrative. Potential users, reviewers, and the press will likely scrutinize how much polish is still needed before release.
- Meta has committed to a launch in late September, meaning the window to iron out these issues is short. The expectation now is that patches, firmware updates, and perhaps even UX (user experience) tweaks will follow soon.
- Even supporters say the core idea is strong: wearable AI assistants that respond to gestures and voice and offer real-time utility. The glasses’ features are impressive on paper; the question is whether the live performance will catch up to the promise.
Broader Take-Away
- This is another reminder that demoing emerging tech live remains fraught with risk. Connectivity, environment, timing—all can derail even the most carefully prepared presentations.
- For Meta, whose reputation currently hinges heavily on its AI investments, this stumble risks undermining some of that momentum. It may also make users more cautious until the device has been tested in real-world conditions.
- From an industry viewpoint, such glitches do not necessarily kill a product or a strategy, but they raise the bar for reliability, especially for a device that carries a premium price and competes in a market with high expectations.