Smart glasses have been a dream for years, but most early attempts left users wanting more—lighter frames, better audio, or, most importantly, a real display. With the first-generation Ray-Ban smart glasses, people marveled at the open-ear audio and sleek design, but the one question always came up: Do they have a screen? Until now, the answer was no. At Meta Connect 2025, that changed. Meta unveiled the Ray-Ban Display smart glasses, finally integrating a full-color heads-up display (HUD) into a stylish, lightweight frame.
On the surface, another screen may seem unnecessary, but once you experience it, the convenience and possibilities are undeniable. From navigation and notifications to photos, videos, and even live transcription, the Ray-Ban Display is poised to redefine what smart glasses can do, making wearable AR technology both practical and surprisingly magical.
A Screen in Your Glasses: What It Does
So, what exactly can you do with this screen? In short: apps, notifications, navigation, and media. Contrary to earlier rumors of a monochrome display, the Meta Ray-Ban Display delivers a vibrant full-color HUD. You can see messages, maps, pictures, and videos directly in your line of sight.
The initial experience is a little jarring. While the glasses, weighing 69 grams (about 10 grams heavier than the first-gen), try not to shove a screen in front of your eyes, the HUD is undeniably present. It hovers subtly but effectively, ready to grab your attention with notifications. Once your eyes adjust—a minute or so—it becomes surprisingly natural to interact with.
Introducing the Meta Neural Band
The real game-changer, though, is Meta’s Neural Band—a compact sEMG wristband the size of a fitness tracker. This little device reads electrical signals in your hand to register pinches, swipes, taps, and wrist movements as commands for your glasses.
Initially, I worried it might feel clunky or conspicuous, but it’s lightweight and surprisingly subtle. The glasses themselves remain comfortable despite being slightly thicker than the first-gen Ray-Bans. More importantly, the band is responsive.
- Pinch your index finger and thumb: Select
- Pinch your middle finger and thumb: Back
- Make a fist and move your thumb over it: Scroll
It’s part Vision Pro, part Quest 3, and it works without hand-tracking cameras. When it functions fluidly, it feels almost magical.
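For readers curious how such a gesture layer might be wired up, here is a minimal sketch in Python. Meta has not published a public Neural Band API, so every name here (`GESTURE_ACTIONS`, `handle_gesture`, the gesture labels) is a hypothetical placeholder for illustration, not Meta's actual SDK:

```python
# Hypothetical mapping of recognized sEMG gestures to UI actions.
# Gesture names and actions are illustrative assumptions, not Meta's API.
GESTURE_ACTIONS = {
    "index_pinch": "select",       # index finger + thumb
    "middle_pinch": "back",        # middle finger + thumb
    "fist_thumb_swipe": "scroll",  # thumb moved over a closed fist
}

def handle_gesture(gesture: str) -> str:
    """Translate a recognized gesture into a UI action; unknown input is ignored."""
    return GESTURE_ACTIONS.get(gesture, "noop")

print(handle_gesture("index_pinch"))  # select
```

The real system is of course far more involved (continuous signal classification, debouncing, per-user calibration), but the core idea is the same: a small, fixed vocabulary of gestures mapped onto familiar UI verbs.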
Learning the Controls
While input accuracy may vary initially—you might have to try gestures a couple of times—the system works well for a first-generation device. Over time, navigating the UI will likely become intuitive. Meta plans to add a handwriting feature, which I saw in action but couldn’t try myself. Early demos suggest it has potential, but the real proof will come with personal use.
Real-World Applications
After some hands-on time, I experienced the full range of phone-adjacent features. A few highlights:
Photography Made Simple
One standout feature is POV photography. A small window in the glasses shows exactly what the camera sees, eliminating guesswork. You can even pinch and twist your wrist to zoom, adding a surprisingly satisfying, tactile control over your shots.
Navigation in Your Field of View
Navigation overlays maps in your HUD, allowing you to follow directions without looking down at a phone. The display is bright enough—5,000 nits—to be legible outdoors, and Meta’s safety alerts warn against using navigation while moving too quickly. I couldn’t fully test it in real-world movement, but the interface was sharp and clear in sunlight.
Video Calling from Your Perspective
Video calls display the person you’re talking to in a small window, while your POV is shared with them. It’s unusual—you usually want the other person to see you—but it worked seamlessly during my demo.
Live Transcription and Accessibility
The Meta Ray-Ban Display includes live transcription, superimposing spoken words onto your HUD. This could be a game-changer for accessibility and language translation. The glasses are designed to focus on the speaker you’re looking at, although some background words still slip through. Even so, the feature shows immense promise.
Why the Neural Band Matters
If the glasses had one standout feature, it might not be the screen at all, but the Neural Band. Navigating smart glasses UIs has historically been a challenge, and this wristband offers a solution that feels intuitive and futuristic. First-of-its-kind devices often frustrate, but the combination of HUD and sEMG control feels like a breakthrough.
Final Impressions
The Meta Ray-Ban Display smart glasses impress on multiple fronts:
- Comfortable and lightweight design
- Bright, full-color HUD
- Responsive and intuitive gesture controls via Neural Band
- Practical features like POV photography, navigation, video calls, and transcription
There are still questions—like how the HUD functions in high-speed movement, or how accurately live transcription filters background noise—but Meta is clearly leading the smart glasses race. With this head start, they may remain the frontrunner for some time.
Even in early demos, the Ray-Ban Display shows that the smart glasses dream is no longer a distant concept—it’s here. And with innovations like the Neural Band, we’re seeing a glimpse of what truly seamless wearable technology could look like.
Frequently Asked Questions
Do the Meta Ray-Ban Display glasses have a screen?
Yes! Unlike the first-gen Ray-Ban smart glasses, the Display version features a full-color heads-up display (HUD) for notifications, navigation, photos, videos, and more.
How heavy are the glasses?
They weigh approximately 69 grams, about 10 grams heavier than the first-generation model without a screen, and remain comfortable for extended wear.
What is the Neural Band?
The Neural Band is a wrist-worn sEMG device that reads electrical signals from your hand to allow gestures—like pinches, swipes, and wrist turns—to control the glasses’ UI.
Can you take photos and videos?
Yes! You can capture POV photos and videos directly through the glasses, with a live view displayed in the HUD. Pinch and wrist gestures allow zooming and control.
Do the glasses support navigation?
Yes, maps are overlaid in the HUD, and the display is bright enough to use outdoors. Meta also provides safety alerts if moving too quickly.
Is live transcription available?
Yes. The glasses can transcribe conversations in real time and focus on the speaker you’re looking at, making the feature useful for accessibility and potential translation.
Conclusion
The Meta Ray-Ban Display smart glasses represent a significant leap forward in wearable technology. By combining a full-color heads-up display with the innovative Neural Band, Meta has addressed one of the biggest challenges in smart glasses: intuitive, hands-free navigation. From POV photography and navigation to live transcription and video calling, these glasses offer a compelling mix of practicality and futuristic appeal. While some minor kinks remain—like gesture variability and background noise transcription—the overall experience is impressive for a first-of-its-kind device.