For at least as long as Meta’s been selling conventional “smart” glasses (with partner EssilorLuxottica, whose eyewear brands include the well-known Oakley and Ray-Ban), rumors suggested that the two companies would sooner or later augment them with lens-integrated displays. The idea wasn’t far-fetched; after all, Google Glass had one (standalone, in this case) way back in early 2013:
Meta founder and CEO Mark Zuckerberg poured fuel on the rumor fire when, last September, he demoed the company’s chunky but impressive Orion prototype:
And when Meta briefly, “accidentally” (call me skeptical, but I always wonder how much of a corporate mess-up versus an intentional leak these situations really are) published a promo clip for (among other things) a display-inclusive variant of its Meta Ray-Ban AI glasses last week, we pretty much already had our confirmation ahead of last Wednesday evening’s keynote, in the middle of the 2025 edition of the company’s yearly Connect conference:
Yes, dear readers, as of this year, I’ve added yet another (at least) periodic tech-company event to my ongoing coverage suite, as various companies’ technology and product announcements align ever more closely with my editorial “beat” and associated readers’ interests.
But before I dive fully into the details of those revolutionary display-inclusive smart glasses, and in the spirit of crawling-before-walking-before-running (and hopefully not stumbling at any point), I’ll begin with the more modest evolutionary news that also broke at (and ahead of) Connect 2025.
Smart glasses get sporty
In the midst of my pseudo-teardown of a transparent set of Meta Ray-Ban AI Glasses published earlier this summer:
I summarized the company’s smart glasses product-announcement cadence up to that point. The first-generation Stories introduced in September 2020:
was, I wrote, “fundamentally a content capture and playback device (plus a fancy Bluetooth headset to a wirelessly tethered smartphone), containing an integrated still and video camera, stereo speakers, and a three-microphone (for ambient noise suppression purposes) array.”
The second-generation AI Glasses, unveiled three-plus years later in October 2023 (and which I own—two sets, in fact, both Transitions-lens equipped):
make advancements on these fundamental fronts…They’re also now moisture (albeit not dust) resistant, with an IPX4 rating, for example. But the key advancement, at least to this “tech-head”, is their revolutionary AI-powered “smarts” (therefore the product name), enabled by the combo of Qualcomm’s Snapdragon AR1 Gen 1, Meta’s deep learning models running both resident and in the “cloud”, and speedy bidirectional glasses/cloud connectivity. AI features include real-time Live Translation between languages, plus AI View, which visually identifies and audibly provides additional information about objects around the wearer.
And back in June (when published, written early May), I was already teasing what was to come:
Next-gen glasses due later this year will supposedly also integrate diminutive displays.
More recently, on June 20 (just three days before my earlier coverage appeared in EDN, in fact), Meta and EssilorLuxottica released the sports-styled, Oakley-branded HSTN, a new member of the AI Glasses product line:
The battery life was nearly 2x longer: up to eight hours under typical use, and 19 hours in standby. They charged to 50% in only 20 minutes. The battery case now delivered up to 48 operating hours’ worth of charging capacity, versus 36 previously. The camera, still located in the left endpiece, now captured up to 3K resolution video (albeit the same 12 Mpixel still images as previously). And the price tag was also boosted: $499 for the initial limited-edition version, followed by more mainstream $399 variants.
A precursor retrofit and sports-tailored expansion
Fast forward to last week, and the most modest news coming from the partnership is that the Oakley HSTN enhancements have been retrofitted to the Ray-Ban styles, with one further improvement: 1080p video can now be captured at up to 60 fps in the Gen 2 versions. Cosmetically, they look unchanged from their Gen 1 precursors. And speaking of looks, trust me when I tell you that I don’t look nearly as cool as any of these folks do when donning them:
Meta and EssilorLuxottica have also expanded the Oakley-branded AI Glasses series beyond the initial HSTN style to the Vanguard line, in the process moving the camera above the nosepiece but otherwise sticking with the same bill-of-materials list (and therefore specs) as the Ray-Ban Gen 2s:
And all of these, including a welcome retrofit to the Gen 1 Ray-Ban AI Glasses I own, will support a coming-soon new feature called conversation focus, which “uses the glasses’ open-ear speakers to amplify the voice of the person you’re talking to, helping distinguish it from ambient background noise in cafes and restaurants, parks, and other busy places.”
AI on display
And finally, what you’ve all been waiting for: the newest, priciest (starting at $799) Meta Ray-Ban Display model:
Unlike last year’s Orion prototype, they’re not full AR; the display area is restricted to a 600×600 resolution, 30 Hz refresh rate, 20-degree lower-right portion of the right eyepiece. But with 42 pixels per degree (PPD) of density, it’s still capable of rendering crisp, albeit terse information; keep in mind how close to the user’s right eyeball it is. And thanks to its coupling to Transitions lenses, early reviewer feedback suggests that it’s discernible even in bright sunlight.
Equally interesting is its interface scheme. While I assume that you can still control them using your voice, this time Meta and EssilorLuxottica have transitioned away from the right-arm touchpad and instead to a gesture-discerning wristband (which comes in two color options):
based on very cool (IMHO) surface EMG (electromyography) technology:
Again, the initial reviewer feedback that I’ve seen has been overwhelmingly positive. I’m guessing that at least in this case (Meta’s press release makes it clear that Orion-style full AR glasses with two-hand gesture interface support are still under active development), the company went with the wristband approach both because it’s more discreet in use and to optimize battery life. An always-active front camera, after all, would clobber battery life well beyond what the display already seemingly does; Meta claims six hours of “mixed-use” between-charges operating life for the glasses themselves, and 18 hours for the band.
The guts, and what’s in all this for Meta
Longstanding silicon-supplier partner Qualcomm was notably quieter than usual from an announcement standpoint last week. Back in June, it had unveiled the Snapdragon AR1+ Gen 1 Platform, which may very well be the chipset foundation of the display-less devices launched last week. Then again, given that the aforementioned operating life and video-capture quality advancements versus their precursor (running the Snapdragon AR1) are comparatively modest, they may result mostly-to-solely from beefier integrated batteries and software optimizations.
The Meta Ray-Ban Display, on the other hand, is more likely to be powered by a next-generation chipset, whether from Qualcomm—the Snapdragon AR1+ Gen 1 or perhaps even one of the company’s higher-end Snapdragon XR platforms—or another supplier. We’ll need to wait for the inevitable teardown-to-come (at $799, not from yours truly!) to know for sure. Hardware advancements aside, I’m actually equally excited (as will undoubtedly also be the software developers out there among my readership) to hear what Meta unveiled on day 2: a “Wearables Device Access Toolkit” now available as a limited developer preview, with a broader rollout planned for next year.
That pending more robust third-party app support neatly leads into my closing topic: what’s in all of this for Meta? The company has clearly grown beyond its Facebook origin and foundation, although it’s still fundamentally motivated to cultivate a community that interacts and otherwise “lives” on its social media platforms. AI-augmented smart glasses are just another camera-plus-microphones-and-speakers (and now, display) onramp to those platforms. It’ll be interesting to see both how Meta’s existing onramps continue to evolve and what else might come next from a more revolutionary standpoint. Share your guesses in the comments!
p.s…I’m not at all motivated to give Meta any grief whatsoever for the two live-demo glitches that happened during the keynote, given that the alternative is a far less palatable fully-pre-recorded “sanitary” video approach. What I did find interesting, however, were the root causes of the glitches: an obscure, sequence-of-events-driven software bug not previously encountered, as well as a local server overload fueled by the large number of AI Glasses in the audience (a phenomenon not encountered during the comparatively empty-venue preparatory dress rehearsals). Who would have thought that a bunch of smart glasses would result in a DDoS?
—Brian Dipert is the Editor-in-Chief of the Edge AI and Vision Alliance, and a Senior Analyst at BDTI and Editor-in-Chief of InsideDSP, the company’s online newsletter.
Related Content
- Smart glasses skepticism: A look at their past, present, and future(?)
- Ray-Ban Meta’s AI glasses: A transparency-enabled pseudo-teardown analysis
- Apple’s Spring 2024: In-person announcements no more?
The post Meta Connect 2025: VR still underwhelms; will smart glasses alternatively thrive? appeared first on EDN.