I tried Meta's Horizon Hyperscape demo: Welcome to the metaverse's first holodeck
Last week at Meta Connect 2024, Mark Zuckerberg announced Hyperscape, a new tool for exploring real-world spaces in VR. You can download the software in beta form from the Meta Horizon store, so naturally, I took it for a spin.
Photogrammetry is nothing new. In fact, the term, then dubbed “photometrographie” in German, appeared in an architectural publication as far back as 1867. The word combines the morphemes photo (light), gram (record), and metry (measurement). Essentially, photogrammetry is the practice of making measurements from photographs.
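To make that concrete, here's a minimal sketch of the underlying idea using the simple pinhole-camera relationship. This is not anything from Hyperscape itself; the function name and all the numbers are hypothetical illustrations.

```python
# A minimal sketch of photogrammetry's core idea: recovering a real-world
# measurement from a photograph via the pinhole-camera model. All numbers
# and names here are hypothetical; real photogrammetry tools fuse many
# overlapping photos, not just one.

def estimate_object_height(height_px: float,
                           distance_m: float,
                           focal_length_mm: float,
                           sensor_height_mm: float,
                           image_height_px: int) -> float:
    """Estimate an object's real-world height in meters from its pixel height."""
    # Convert the measured pixel span into a physical size on the camera sensor.
    height_on_sensor_mm = height_px * (sensor_height_mm / image_height_px)
    # Similar triangles: real_height / distance = sensor_height / focal_length.
    return (height_on_sensor_mm / focal_length_mm) * distance_m

# Example: a doorway spanning 2,900 px in a 4,000-px-tall photo, shot from
# 3 meters away with a 26 mm lens and a 24 mm-tall (full-frame) sensor.
print(round(estimate_object_height(2900, 3.0, 26.0, 24.0, 4000), 2))  # ~2.01 m
```

Full photogrammetry pipelines extend this single-image idea across many overlapping shots (structure-from-motion and multi-view stereo) to reconstruct complete 3D geometry, which is the kind of output Hyperscape displays.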
Also: Meta takes some big AI swings at Meta Connect 2024
Hyperscape takes photogrammetry into the virtual world in a rather impressive, if somewhat limited way. Zuckerberg described the tool as a way of using a smartphone to scan a space and then turn it into something you could move around in within the metaverse, sort of like a rudimentary holodeck.
Although the smartphone scanning capability was not demonstrated, the Hyperscape app currently on the Horizon store does showcase six environments.
When you first enter a space, you’re shown how the controllers work: you can rotate, change scenes (each scene is a different point of view within the photogrammetry environment), and teleport (jump to a new location).
I first jumped into Daniel Arsham’s studio (because…cars) and looked around. Here’s a video-captured version of what I saw in my Quest 3.
It does feel considerably more real inside the Quest 3 headset (this will also work in the new Quest 3S). The resolution of the Quest 3 isn’t nearly as high as that of the seven-times-more-expensive Apple Vision Pro, and this is one application where that lower resolution is apparent. There’s a lot of obvious aliasing and rough edges. Even so, the demo was impressive.
Also: In a surprise twist, Meta is suddenly crushing Apple in the innovation battle
You can point the controller at a different location within the environment and teleport to that location.
It was a little disconcerting to land inside the back left quarter panel of the vehicle, but you can see how far I jumped and how my perspective changed.
Safety tip for Dr. Emory Erickson in Star Trek: When you start to design the first transporter, keep in mind that folks won’t like materializing inside things like cars and walls. Even in VR, it’s uncomfortable.
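For the technically curious: point-and-teleport locomotion like this is usually implemented by casting a ray from the controller and finding where it meets the floor. Here's a rough, hypothetical sketch of that common pattern; it's not Meta's code, and the Vec3 type and teleport_target function are illustrative only.

```python
# A hypothetical sketch of point-and-teleport VR locomotion: cast a ray from
# the controller, intersect it with the floor plane, and move the player there.

from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float

def teleport_target(origin: Vec3, direction: Vec3, floor_y: float = 0.0):
    """Return where the controller's ray meets the floor plane, or None if the
    ray points upward or runs parallel to the floor."""
    if direction.y >= 0:  # Ray never reaches the floor.
        return None
    t = (floor_y - origin.y) / direction.y  # Distance along the ray to the floor.
    return Vec3(origin.x + t * direction.x, floor_y, origin.z + t * direction.z)

# Example: controller held at 1.2 m, tilted forward and down.
print(teleport_target(Vec3(0.0, 1.2, 0.0), Vec3(0.0, -0.5, 1.0)))  # lands ~2.4 m ahead
```

A production app would typically also check that the target is a valid, walkable spot, a check that, judging by my quarter-panel landing, the current beta doesn't seem to enforce.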
I briefly popped into Rebecca Fox’s studio in San Francisco, but I found it a tad claustrophobic.
Then I popped into what Facebook called its Aquarium conference room at its headquarters. According to the placard shown upon entering the virtual space, this was once Zuckerberg’s office. That, and apparently some Facebook employees had way too much time on their hands.
What does it all mean?
Why do this app and this technology exist? Why should we care?
Here are the key takeaways I think are relevant in this context:
- We’re seeing a technology showcase. It’s good to see demos of what can be done with new technology because they help spark interest and drive innovation.
- The fact that these environments were created by smartphone scans implies that this level of photogrammetric detail can be highly accessible to regular users.
- The ability of low-cost headsets to stream such detail also democratizes access, putting these environments within reach of many more users.
- The educational and sales potential (particularly in real estate) for this technology is huge. Sure, it’s not as great as being there. But it does give a much more visceral perspective of what a location is like.
- I can also see this used in some fun point-and-click games, where you actually move around a real space to uncover clues and explore a plot.
- Spaces could also be released as public domain or open source. Imagine what could be done with a “clip space” of the Oval Office or the inside of Independence Hall.
Think about this as technology, not really as an app. At some point, it will become another tool for VR designers to use when building their applications. And, as the technology gets better and better, we may see something that’s more in keeping with the holodeck experiences we saw in the science fiction of Star Trek.
Also: 4 exciting Ray-Ban smart glasses features Meta just announced at Connect 2024
What do you think? Have you tried out the Hyperscape app? Do you have a Quest 3? Do you plan on ordering a Quest 3S? Let us know in the comments below.
You can follow my day-to-day project updates on social media. Be sure to subscribe to my weekly update newsletter, and follow me on Twitter/X at @DavidGewirtz, on Facebook at Facebook.com/DavidGewirtz, on Instagram at Instagram.com/DavidGewirtz, and on YouTube at YouTube.com/DavidGewirtzTV.