American businessman Mark Zuckerberg, who co-founded the social media service Facebook, has unveiled a new generation of AI-powered smart glasses at Meta Connect 2025 in Menlo Park, California, introducing features that move far beyond audio and camera functions.
The Meta Ray-Ban Display is a pair of smart glasses with an integrated visual display, controlled by hand movements.
“We’re moving beyond the phone. This is AI you wear,” said Zuckerberg during the keynote address.
Meta’s flagship product this year, the Ray-Ban Display, introduces a full-colour, high-resolution display hidden inside the right lens. It’s visible only to the wearer and delivers real-time visuals directly in the wearer’s field of view.
Its key features include the ability to view text and media from apps like WhatsApp and Instagram, turn-by-turn directions while walking, real-time captions and translations during conversations, and live video calling with a camera viewfinder and zoom.
“It feels like telepathy,” one Meta engineer said, describing the user experience.
Introducing the Meta Neural Band
To make the experience hands-free, Meta is pairing the glasses with a new accessory: the Meta Neural Band. This slim wristband uses electromyography (EMG) to read electrical signals from hand and finger muscles.
Its gesture controls include pinching to select items, sliding the thumb to scroll, and twisting the wrist to adjust volume.
It’s an intuitive way to control your digital experience without reaching for a device.
The Meta Ray-Ban Display costs approximately US$799, a price that includes the Meta Neural Band.
The device is set for official launch in the US on September 30, 2025, while an international rollout to the UK, Canada, France, and Italy is planned for early 2026.
Alongside the flagship glasses, Meta also introduced an updated version of its display-free AI eyewear. The Ray-Ban Meta Gen 2 focuses on improved performance and subtle enhancements.
Zuckerberg made it clear that AI is moving from software to wearable presence. The goal is for AI to see, hear, and respond in real-time—without a screen or keyboard.
“AI should understand your environment and respond instantly,” he said.
Meta calls this “ambient computing”: technology that quietly integrates with your daily life, allowing you to stay present while staying connected.
With AI glasses now offering live captions, gesture control, visual responses, and direct messaging, Meta is positioning them not just as wearables—but as the next evolution of personal computing.
Written by Erastus Omondi, TV47