Hands-on Review of Meta Ray-Ban Display Glasses at Meta Connect 2025
Meta's latest innovation, the Ray-Ban Display glasses with Neural Band controls, shows both the exciting potential and the current limitations of consumer AR. While the display quality and gesture controls feel almost magical when they work, the glasses' weight, locked ecosystem, and restricted developer access prove we're still in the early days. Our Senior Software Engineer, Bram De Coninck, got hands-on time with these glasses at Meta Connect 2025, and his verdict is clear: they're an impressive preview of what's coming, but not yet ready for mainstream adoption.
When Meta unveiled their Orion AR glasses last year, they proved what many thought impossible: truly wearable augmented reality. While I missed that historic moment, I wasn't going to let another breakthrough pass me by.
As an XR enthusiast, I believe Meta is one of the leaders in the field. They keep pushing XR forward. There were rumours about an upgraded version of the Ray-Ban Meta glasses: a pair with a built-in display that you control with a Meta Neural Band, a wristband that picks up electrical signals from your wrist. How exciting is that?
When the invitation to Meta Connect 2025 landed in my inbox, I was fortunate enough that In The Pocket immediately backed my journey to California. During the main keynote, Mark Zuckerberg walked onstage wearing the new glasses, streaming his point of view to the audience. That teaser really captured everyone's attention. Later in the keynote they officially announced the glasses, the Meta Ray-Ban Display glasses, and what they're capable of. After a two-hour wait at Facebook HQ to test the newly announced glasses myself, I've got some clear-eyed insights to share with you.

The event was divided over two days, and both followed the same structure: we'd start with a keynote in Menlo Park, then shuttle buses brought us to Facebook HQ for the Connect After Hours portion. You could essentially do the same activities on both days, though the developer keynotes were only available on the second day.
Walking around Facebook HQ felt like exploring a small village, with booths and demo areas scattered everywhere. Each booth showcased different aspects of Meta's latest hardware and software.
I was able to try out so many cool things during those hours! I tested the new 2nd-gen Meta Ray-Ban glasses and also got hands-on time with the new Oakley Vanguard Meta glasses, which are specifically designed for sports. There were demo stations for upcoming VR titles, including Marvel's Deadpool VR game and a Star Wars pod racing experience. One demo that particularly excited me was of a game called Demeo x Dungeons & Dragons: Battlemarked, a mixed reality experience that adds card game mechanics to Dungeons & Dragons.

Testing the Meta Ray-Ban Display Glasses
Meta Neural Band gestures explained
When I got to test the glasses, a Meta employee handed me a pair and started with instructions on how to perform the hand gestures with the Neural Band. To help us practice these gestures, we were asked to play a simple maze game in which we steered a character away from enemies. You navigate by moving your thumb across your index finger in any direction: up, down, left, or right. It was very intuitive once I got the hang of it, and it served as a great introduction to gesture controls.
The Demo Experience
After learning the basic gestures, I tested the glasses' core features. The camera controls allowed seamless photo and video capture through hand gestures. Music playback was easily controlled through the neural band, with surprisingly powerful built-in speakers.
The standout feature was live captioning, which displays real-time transcriptions of whoever you're looking at while filtering out other voices. While the translation feature wasn't working during my demo due to missing language packs, it's designed to support multiple languages.
The glasses show impressive potential, with a sharp display and gesture controls that feel magical when they respond properly. However, the inconsistent Neural Band response and noticeable weight suggest the technology still needs refinement before it's ready for all-day wear.

Should you buy these glasses?
…if you're a regular consumer?
Are the Meta Ray-Ban Display glasses worth it right now for everyday use? As a regular consumer, I'm not interested just yet. While the technology is impressive, several significant limitations make the glasses hard to recommend:
- The weight issue: These glasses are noticeably heavier than regular eyewear, and I'm genuinely unsure whether most people would want to wear them all day.
- You're completely locked into Meta's ecosystem: You cannot install third-party apps on the glasses. For instance, you'll have to use Meta's Maps instead of Google Maps.
- Restricted availability: Meta currently has no plans to release these glasses in Belgium. They’ll be launched later this year in the US and they’re coming to selected European countries like France sometime in 2026.
…if you're a developer?
Initially, I was very excited about developing for these glasses. I'd love to create my own apps for them, but it's not possible in the way I imagined. During the developer keynote, Meta announced the "Meta Wearables Device Access Toolkit", which sounded promising at first. However, as I learned more about it throughout the day, my enthusiasm faded.
As I understand it, you develop for the Meta Ray-Ban Display through a companion mobile app that gets access to the glasses' speakers, microphones, and cameras. What you cannot do is build apps with a UI that renders directly on the glasses' display.
The toolkit essentially lets you use the glasses as advanced sensors for your mobile app, which is useful but much more limited than I'd hoped.
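To make that "advanced sensors" idea concrete, here is a minimal TypeScript sketch of the architecture as I understood it. None of these types or method names are Meta's actual toolkit API; they are hypothetical stand-ins that only illustrate the shape of it: the phone app consumes sensor streams from the glasses and can push audio back to the speakers, but there is no display surface to draw on.

```typescript
// Hypothetical model of the "glasses as sensors" pattern.
// All names here are illustrative assumptions, not Meta's real API.

type CameraFrame = { timestampMs: number; jpegBytes: Uint8Array };

interface GlassesSensors {
  // Inputs the mobile app can consume: camera frames and mic audio.
  onCameraFrame(handler: (frame: CameraFrame) => void): void;
  onMicAudio(handler: (pcm: Int16Array) => void): void;
  // The only output channel is audio; there is no UI/display API.
  playAudio(pcm: Int16Array): void;
}

// All the interesting logic lives in the phone app, not on the glasses.
class ObjectAnnouncerApp {
  public announcements: string[] = [];

  constructor(private glasses: GlassesSensors) {
    glasses.onCameraFrame((frame) => {
      const label = this.classify(frame); // runs on the phone
      this.announcements.push(label);
      // Results come back as spoken audio, never as on-glasses UI.
      this.glasses.playAudio(this.synthesize(label));
    });
  }

  private classify(frame: CameraFrame): string {
    // Placeholder "vision model" for the sketch.
    return frame.jpegBytes.length > 0 ? "object detected" : "nothing";
  }

  private synthesize(text: string): Int16Array {
    return new Int16Array(text.length); // stand-in for real text-to-speech
  }
}

// Mock glasses so the sketch runs end to end without hardware.
class MockGlasses implements GlassesSensors {
  private frameHandler?: (frame: CameraFrame) => void;
  onCameraFrame(h: (frame: CameraFrame) => void) { this.frameHandler = h; }
  onMicAudio(_h: (pcm: Int16Array) => void) {}
  playAudio(_pcm: Int16Array) {}
  emitFrame(frame: CameraFrame) { this.frameHandler?.(frame); }
}

const glasses = new MockGlasses();
const app = new ObjectAnnouncerApp(glasses);
glasses.emitFrame({ timestampMs: 0, jpegBytes: new Uint8Array([1, 2, 3]) });
console.log(app.announcements);
```

Even in this toy version you can see the limitation I ran into: the glasses act as a capture and playback peripheral, while every pixel of interface stays on the phone.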
Conclusion
My experience at Meta Connect 2025 was incredible. The event showcased Meta's vision for wearable technology, and I really enjoyed getting hands-on with their latest innovations.
The Meta Ray-Ban Display glasses are definitely an important step toward true AR. They're not full AR glasses yet, but the display quality is impressive, and when the Neural Band works properly, it feels almost magical. But as my testing showed, we're still early. The weight, the locked ecosystem, and the limited developer options mean these glasses aren't ready for most people yet. They feel more like a cool preview of what's coming than something you should buy today.
However, Meta seems committed to improving things based on feedback. Once they finally launch glasses similar to Orion and let developers go wild, we could possibly even see a real platform shift from smartphones to glasses.
