I wore Mark Zuckerberg’s contact lenses. The pair, which Meta’s Orion team grabbed for me in a pinch (I wasn’t told I’d need to bring contacts), was needed so I could try on Meta’s next big leap into AR: a moonshot pair of glasses codenamed Orion. Luckily, my eyes are a good match for Mark’s.
With the contact lenses popped in, my right wrist was fitted with a snug ribbed band that looked like a fitness tracker without a screen. Then I put on a wireless pair of thick black glasses and began calibrating the hardware’s eye tracking while a small team monitored the process on a nearby computer screen.
These moments all felt like a drift into a strange new future, even for me, having been immersed in AR and VR for years. That’s because Meta’s early prototype hardware, not remotely ready for the public yet, is a fusion of technologies I’ve seen in other places and some I’ve barely tried at all. Still, everything also felt familiar.
Just a few weeks ago, I wore a self-contained pair of AR glasses for developers made by Snap with a similar mission. Snap’s Spectacles are chunkier and have a more limited field of view. Meta’s are smaller, have a wide 70-degree field of view that impressed me, and come with their own wireless neural wristband for input. They also need an external processor puck, nearly the size of a phone, that wirelessly feeds apps and graphics to the glasses; it’s roughly comparable to the Apple Vision Pro’s battery pack, but smaller, lighter and without any tether. I’ve spoken with Meta’s executives about this type of technology for years, including at the company’s Reality Labs research facilities a few years ago. Now, it was time for my test drive.
I spent about an hour demoing various experiences in Orion and talking to Meta’s team about what this new piece of tech means and where we might see it next, in any form. At times I felt like I was in a shrunken-down mixed reality headset, and at other times like I was wearing a more advanced vision of where Meta’s Ray-Bans are headed. Orion is both, really.
A ton of sensors
One of Orion’s big tricks is that it has a full set of sensors you’d normally find on larger mixed reality headsets. Eye-tracking cameras are onboard, hiding on the sides of the lenses. Outer cameras are tucked along the frame’s top edge. Side cameras for hand and room tracking lurk around the arms. There are speakers, microphones and Wi-Fi 6 for connecting, via a proprietary protocol, to a separate processor puck that’s needed to power the glasses’ apps and graphics.
That “compute puck” has its own tracking cameras too, and a recessed touchpad. I’m told the puck was originally designed to act as its own controller for the glasses before Meta decided to emphasize hand tracking and electromyography, but it looks like something that could step in as an alternative input in the future.
I got a quick peek at a see-through model of the glasses exposing how much tech is studded throughout, a clever way for Meta to show off how densely engineered they are. Yet, they weigh only 100 grams (3.5 ounces). Big as they are, they fit comfortably on my face.
Wide-view AR
Orion has a 70-degree field of view, which may sound pretty small to any VR headset wearer. VR devices tend to have a field of view around 90 degrees or greater, and most other AR headsets have even smaller viewing areas for their pop-up AR displays. Snap’s new Spectacles, for instance, have a much narrower 46-degree FOV.
Meta’s glasses spread the viewing area out more, especially horizontally. I saw cutoffs where the displays seemed to end in my vision, but they were so far off to the sides that, in the demos I had at least, they generally went unnoticed.
I’ve seen an FOV this large only once before, in a pair of lenses by AR optics company Lumus that I tried at a conference earlier this year. Meta’s gotten to this larger view in a smaller frame by using silicon carbide lenses and microLED projectors. According to Meta’s Rahul Prasad, senior director of product management for AI and AR wearables, these help achieve wider viewing angles at closer range, using diffractive waveguides to bend the light toward my eyes without too much rainbow effect on the lenses.
The glasses definitely don’t look like everyday eyewear, but they at least approach something you might see someone wearing around. On me, they kind of pass as ultrathick arty frames. The lenses seem to have a slightly darker tint, but everything looks clear and crisp when I look through them. Like Snap’s Spectacles, they can auto-dim for better viewing, I’m told, but I didn’t get to demo that.
One of the less ideal parts of these displays is their resolution: 13 pixels per degree, Meta says. Apps, videos and games still looked fine, though not quite as crisp as on a Quest 3 headset (which has a density of 25 pixels per degree). I did get a quick look at another version of Orion with even higher-resolution projected displays, at 26 pixels per degree, on which I watched a brief Jurassic Park movie clip. The goal is to get the resolution as high as possible before these are released as an actual consumer product. The wider-view display means more pixels are needed to look good, which takes more power.
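To get a rough sense of why the wider view is so demanding, you can multiply the field of view by the pixel density. Here’s a back-of-envelope sketch, not Meta’s spec math: it uses the 70-degree FOV and the pixels-per-degree figures above, and makes the simplifying assumption that the density is uniform along one axis of the view.

```python
# Back-of-envelope pixel math. The 70-degree FOV and the 13/25/26
# pixels-per-degree figures come from the article; treating the density as
# uniform across the view is a simplifying assumption.

def pixels_across(fov_degrees: float, pixels_per_degree: float) -> int:
    """Approximate pixel count needed along one axis of the display."""
    return round(fov_degrees * pixels_per_degree)

FOV = 70  # Orion's horizontal field of view, in degrees

for label, ppd in [("Orion prototype (13 ppd)", 13),
                   ("Quest 3 density (25 ppd)", 25),
                   ("Higher-res Orion prototype (26 ppd)", 26)]:
    print(f"{label}: about {pixels_across(FOV, ppd)} pixels across a {FOV}-degree view")

# Doubling the density doubles the pixels along each axis, so the total pixel
# count (and the rendering power needed) goes up by roughly four times.
```

In other words, hitting Quest 3-level sharpness across Orion’s wide view would mean pushing nearly twice as many pixels in each direction, which is exactly the power trade-off Meta describes.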
A wrist-worn neural interface
The wristband got me even more excited than the glasses. A few years ago, I saw Meta’s EMG neural input technology in prototype form at Meta’s Reality Labs Research division, but I never got to try it myself; instead, I watched Mark Zuckerberg use it. This time, it was my turn.
The new band design is much smaller, more like an enhanced smartwatch strap. It snaps and locks onto my wrist with magnets and a clasp. The technology senses electrical impulses through the skin and translates them into input actions. In effect, the band allows complex gestures, plus subtle vibrating haptic feedback, without my needing to keep my hand in view of the glasses’ hand-tracking cameras. And the processing for the EMG sensor is all done on the band itself, with a promised full day of battery life.
Apple’s Vision Pro and the Meta Quest already have hand tracking, but these wrist gestures were a bit more versatile. Some were familiar pinches (to select buttons or apps), and others were little thumb pushes I did by moving my thumb across my balled-up fist (to scroll). I used my eyes to navigate to things, much like on Apple’s Vision Pro, but Orion’s gestures worked even when my hand was resting at my side.
The gestures weren’t perfect yet. Sometimes my pinches were misinterpreted, and I had to try again. The eye tracking, however, worked extremely well. Combined, I could see how this could be the next wave of interfaces after the Vision Pro. Equip watches and wrist trackers with advanced gesture-sensing tech, combine that with eye tracking, and who knows what could happen.
How much more could this gesture-sensing band do? It’s unclear. The EMG tech in this band may make its way into other things before Orion becomes a full-fledged product. Alex Himel, Meta’s head of wearable tech, told me that EMG bands will show up in other future products, as part of many devices in Meta’s expanding lineup. Could one of those be a smartwatch, as has been discussed for years? EMG bands could be interesting with Meta’s existing Quest headsets and Ray-Ban glasses, too.
AI and apps: Familiar and strange
I got a number of demos with the glasses on, many of them feeling like variations on mixed reality or AI experiences I’ve had to some degree on Quest and Meta Ray-Bans. Pinching my middle finger and thumb with my hand raised opens the app menu, much like on the Vision Pro, and I can look at an app with my eyes and tap my forefinger and thumb together to open it.
I browse an open YouTube video of Aaron Rodgers throwing touchdowns against the Patriots in a window in front of me, dragging it closer to my eyes (Meta knew its audience). An incoming call, which I answer, comes from a member of Meta’s PR team outside, calling on her phone; I can see her video feed. I respond to a message that comes in on Messenger, dictating with my voice. Multiple windows are open side by side, showing off the larger viewing area.
I saw a few demos of AI. In one, I generate an image using Meta AI with my voice, and it’s familiar silly stuff (in 2D). In another, I stand in front of a bunch of grocery items on a table and am invited to ask for a recipe using those items; Meta AI, with sound effects familiar from the Meta Ray-Bans, brings up a recipe and labels the items on the table with pop-up icons. I expect more complex interactive AR instructions will come eventually, but not this time.
Then I played a little starfighter game where I used my head to control the ship and my eyes and finger pinches to shoot targets. It was fun, but something I feel like I’ve seen on the Vision Pro. Another demo had me playing a two-player game across a table with someone; it required scanning a QR code to align the 3D Pong-style arcade space, which sprouted up between us as we used our hands as paddles.
I also got an incoming call from another Meta employee, this one calling as a realistic 3D codec avatar (Meta’s name for its photorealistic avatar tech, similar to what Apple calls Personas). Meta hasn’t made codec avatars available for its headsets yet, but maybe this is a sign they could be ready soon. The 3D head rendering and its expressions looked good: they felt pretty real.
None of these demos blew me away compared to the best mixed reality demos I’ve seen in the past. But experiencing them on such relatively compact glasses, with that wider field of view and the gesture control wristband in action, was pretty fascinating.
What happens next, though? Meta hasn’t pinned a date on when these glasses could see an actual release, and it acknowledges that the design, the resolution and the price all need to improve. Meta says to expect a price close to that of a high-end phone or a laptop. That sounds like $1,000-plus, far above the cost of a Quest 3 but less than Apple’s Vision Pro. When that day comes, what will the app and AI landscape look like? Or mixed reality headsets, for that matter? Will this shrink down into a pair of glasses similar to Meta’s Ray-Bans, or is that even possible? And will gesture interfaces, through hand tracking, watches or even EMG bands, start to feel a lot more commonplace by then?
It’s impossible to know the answers now, in 2024. Popping Mark Zuckerberg’s contact lenses out of my eyes in the bathroom after my demo, and looking at myself in the mirror, I realize I’ve demoed something pretty far-out. Things have come a long way since my first VR demos over a decade ago, but it’s clear that the devices to come are going to shake up the mixed reality landscape all over again. In the meantime, VR headsets and smart glasses will keep evolving in steps until, perhaps suddenly, we’re just there and it won’t feel strange at all.