In the absence of sight, the mind can use sound to build a spatial map of its surroundings, a process known as echolocation.
Here’s how Eyedar works:
1. Eyedar uses the LiDAR scanner in the iPhone 12 Pro/Pro Max and the iPhone 13 Pro/Pro Max to scan your surroundings and create a spatial map*.
2. Eyedar translates these maps into audio feedback that describes your surroundings. Changes in pitch and volume convey the size, shape, distance, and relative position of objects.
3. This audio feedback teaches the mind to visualize your surroundings in detail. A progressive training model improves this skill over time.
(*Eyedar is a self-voiced app that uses native VoiceOver gestures for navigation.)
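The depth-to-audio translation in step 2 can be sketched as a simple mapping from distance to pitch and volume. The following is an illustrative sketch only, not Eyedar's published algorithm: the frequency range, depth range, and linear scaling are all assumptions chosen for clarity.

```python
# Illustrative sketch (NOT Eyedar's actual algorithm): encode distance
# from a depth map as audio parameters, so nearer obstacles sound
# higher-pitched and louder than distant ones.

def depth_to_audio(depth_m, min_d=0.5, max_d=5.0,
                   low_hz=220.0, high_hz=880.0):
    """Map a distance in metres to a (frequency_hz, volume) pair.

    The depth range (0.5-5 m) and frequency range (220-880 Hz)
    are assumed values for this sketch.
    """
    # Clamp to the assumed useful sensor range, then normalise to [0, 1].
    d = min(max(depth_m, min_d), max_d)
    t = (d - min_d) / (max_d - min_d)        # 0 = nearest, 1 = farthest
    freq = high_hz - t * (high_hz - low_hz)  # near -> high pitch
    volume = 1.0 - 0.8 * t                   # near -> loud, far -> quiet
    return freq, volume

# One "scan line" of depths, e.g. a row sampled from a LiDAR depth map.
scan = [0.6, 1.2, 4.8]
tones = [depth_to_audio(d) for d in scan]
```

Played left to right in sequence (or panned in stereo), such tones let a listener hear that the first object is close and the last is far, which is the kind of cue a progressive training model can build on.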