Meta's new AR glasses offer neural control – no implant necessary
The headline for Meta’s new fully functioning prototype, Orion (pronounced O-Ryan), basically writes itself.
They’re “the most advanced glasses the world has ever seen,” Meta CEO Mark Zuckerberg said during today’s Meta Connect event. That’s a bold claim but not one that many will quickly discredit. After all, Meta is coming into Connect hot, having seen success with last year’s Quest 3 and Ray-Ban smart glasses.
Also: Everything announced at Meta Connect 2024: Affordable Quest 3, AR glasses, and more
In some ways, Orion is the best of both worlds, supposedly offering mixed-reality-like computing similar to the Quest 3 in a light, everyday-glasses form factor akin to the Ray-Ban smart glasses. Zuckerberg set out five simple yet highly technical requirements when designing Orion:
- They can’t be a headset, meaning there should be no wires or cables dangling off of them, and they should weigh less than 100 grams.
- The glasses need holographic displays with a wide field-of-view (FOV).
- The displays should be sharp enough to pick up details in the real world.
- The displays should be bright enough for visual overlays no matter what the environment is like.
- AR projections should be able to display a cinema-like screen or multiple monitors.
Following these principles means Orion overlays holograms on your direct view of reality instead of capturing and re-displaying what’s in front of you, a process commonly known as pass-through. The big benefit of this approach is that it introduces little to no latency.
Being able to visualize incoming messages, video call feeds, and other important information while still being attentive and present in reality solves one of the biggest social problems with modern-day VR headsets like the Quest 3 and Apple Vision Pro.
Meta says there are three ways to interact with Orion: using voice via Meta AI, hand and eye tracking, and a neural interface. The first two are rather straightforward, but the third option is exactly what’s needed to keep us grounded in prototype land. Orion can work in tandem with a wrist-worn neural interface, registering clicks, pinches, and thumb pushes as inputs.
Also: Meta Quest 3S unveiled at Connect 2024: Everything to know about the VR headset
For example, you can form a fist and brush your thumb on the surface to scroll the user interface, according to CNET’s Scott Stein. Meta says you’ll get a day’s worth of usage before needing to charge the wristband.
That’s promising to hear, considering I’d rather make finger gestures while walking around or sitting down than shout at an invisible voice assistant or wave my arms around in public. According to Meta, Orion runs on custom silicon and a set of sensors, with the battery tucked into the glasses’ arms.
While Orion gives us a glimpse of future AR glasses, there’s still a lot of work to be done before they’re consumer-ready, according to Zuckerberg. Sharpening the display system, slimming down the design so it’s more fashionable, and bringing down the cost are all aspects that Meta’s CEO would like to develop further. Until it hits the open market, Orion will be available as a developer kit — mostly internally, to build out the software, as well as to a handful of external partners.
When it’s ready, it’ll be positioned as “Meta’s first consumer, fully-holographic AR glasses,” Zuckerberg said.