Apple previewed new Vision Pro accessibility features that could help users with low vision when they launch in visionOS later this year.
The update uses the headset's main camera to magnify what users see and to provide live, machine-learning-powered descriptions of the user's surroundings.
The new magnification feature allows zooming in on virtual and real-world objects, offering hands-free assistance and accessibility through various apps.
Apple will also release an API giving developers access to the Vision Pro's camera and introduce brain-computer interface (BCI) support in visionOS, iOS, and iPadOS as an alternative input method.