Apple will let the Vision Pro ‘see’ for you

Apple previewed new Vision Pro accessibility features today that could turn the headset into a proxy for eyesight when they launch in visionOS later this year. The update uses the headset’s main camera to magnify what a user sees or to enable live, machine-learning-powered descriptions of surroundings.

The new magnification feature works on virtual objects as well as real-world ones. An example from Apple’s announcement shows a first-person view as a Vision Pro wearer goes from reading a zoomed-in view of a real-world recipe book to the Reminders app, also blown up to be easier to read. It’s a compelling use for the Vision Pro, freeing up a hand for anyone who might’ve done the same thing with comparable smartphone features.

[Animation: Apple’s new magnification feature in the Vision Pro.]

Also as part of the update, Apple’s VoiceOver accessibility feature will “describe surroundings, find objects, read documents, and more” in visionOS.

The company will release an API to give approved developers access to the Vision Pro’s camera for accessibility apps. That could be used for things like “live, person-to-person assistance for visual interpretation in apps like Be My Eyes, giving users more ways to understand their surroundings hands-free.” It may not seem like much now, given the Vision Pro’s reportedly meager sales, but the features could be useful in rumored future Apple wearables, like camera-equipped AirPods or even new Apple-branded, Meta Ray-Ban-like smart glasses.

Finally, Apple says it’s adding a new protocol to visionOS, iOS, and iPadOS that supports brain-computer interfaces (BCIs) via its Switch Control accessibility feature, which enables alternate input methods, such as controlling aspects of your phone with head movements captured by your iPhone’s camera.

A Wall Street Journal report today explains that Apple worked on this new BCI standard with Synchron, a brain implant company whose device lets users select icons on a screen by thinking about them. The report notes that Synchron’s tech doesn’t enable capabilities like mouse movement, which Elon Musk’s Neuralink has accomplished.

Source: https://www.theverge.com/news/665683/apple-vision-pro-accessibility-features-zoom-voiceover