The release of Apple’s new mixed reality headset, Vision Pro, could cause a seismic shift in how users experience the metaverse, with developers potentially moving away from the absolute isolation of virtual reality.
Unlike today’s virtual reality headsets, which focus on total immersion, Apple’s Vision Pro — unveiled on June 5 — can also overlay apps onto the real world, allowing users “to interact with digital content in a way that makes it feel like it’s physically present in their space.”
Apple’s AR headset unveiled – Apple Vision Pro pic.twitter.com/UpNM7cH5yL
—Wario64 (@Wario64) June 5, 2023
Speaking to Cointelegraph, Alyse Su, Head of Metaverse at KPMG, said she believes Vision Pro will pull developers away from purely immersive virtual worlds.
The headset introduces a new technology Apple calls “EyeSight,” which uses an outward-facing display to show the wearer’s eyes, making their facial expressions appear more natural to people nearby. EyeSight also allows the display to switch between a transparent and an opaque view, depending on whether the user is consuming immersive content or interacting with people in the real world.
“With traditional or other headsets, there is this barrier between people who wear them and people who don’t. It feels like you’re in two different worlds,” she said. “Now there are very few barriers between people, so you can have relatively smooth interactions.”
Apple EyeSight lets people see your eyes when using the headset pic.twitter.com/p773ZPjwRZ
—Dexerto (@Dexerto) June 5, 2023
Su said there’s also a lot of potential in its eye-tracking technology, which can be used to help create personalized experiences.
Apple’s pupil-tracking technology works by detecting users’ mental states based on data from their eye movements and their pupils’ responses to stimuli. It then uses artificial intelligence to make predictions about their emotions.
I have spent 10% of my life contributing to the development of #VisionPro when I worked at Apple as a neurotechnology prototyping researcher in the technology development group. It’s the longest I’ve ever worked on a single effort. I am proud and relieved that it is finally… pic.twitter.com/vCdlmiZ5Vm
— Sterling Crispin ️ (@sterlingcrispin) June 5, 2023
“They have incorporated a lot of neuroscience and neurotechnology research into this headset. The most overlooked part is the predictive pupil-dilation tracking technology, which is based on their years of neurological research,” Su said.
Su predicted that the Vision Pro will steer developers toward using “emerging fields such as neuroscience and generative AI to create more personalized and predictive experiences.”
Peter Xing, the founder of blockchain-based project Transhuman Coin, also praised the headset’s design for “fitting into the natural way we interact as humans” and pointed to its unique eye-tracking capabilities as one of the biggest leaps forward for the metaverse.
“By detecting pupil dilation, the headset acts as a proto-brain-computer interface to pick up when a user is expecting something to be selected to anticipate what they are thinking.”
Related: Animoca still bullish on blockchain games, awaits license for metaverse fund
When asked if the Vision Pro could put a spring back in the step of a struggling metaverse industry – in which the native tokens of nearly all blockchain-based virtual worlds have lost over 90% of their value – Xing was not too optimistic, at least not in the short term.
He explained that Apple is highly unlikely to encourage decentralized approaches that could threaten its “lucrative walled garden.”
While he and many others have noted the product launch’s obvious lack of gaming focus, Xing thinks Apple’s recently announced partnership with Disney, which owns Marvel, could provide a source of games and other interactive experiences.
Xing thinks this is exactly what the metaverse needs to transition from a “player-centric world” to the mainstream.
AI Eye: Hilarious AI travel booking, 3 weird uses for ChatGPT, crypto plugins