
EYEDAR

AREA 23, AN IPG HEALTH NETWORK COMPANY, New York / HORIZON THERAPEUTICS / 2022

Awards:

Shortlisted Cannes Lions

Overview

Why is this work relevant for Innovation?

Eyedar is the first app that teaches the blind to visualize their world with sound. It was inspired by echolocation—a form of sensory substitution where the blind use sound to visualize their surroundings.

Eyedar digitizes the principles of echolocation, making it an accessible and learnable skill. By leveraging the newly available LiDAR 3-D technology in iPhones, Eyedar maps a user’s environment and translates it into 3-D audio data, allowing the blind to visualize their surroundings and, over time, create a clearer picture of the world around them.

Background

Eyedar is based on the principles of echolocation. Like bats or dolphins, humans who echolocate emit a sound that echoes around their surroundings, allowing them to build a spatial map to visualize their environment. MRI scans confirmed that echolocation activates the visual cortex in the brain, showing that, neurologically, echolocators could ‘see.’

Despite its benefits, echolocation is not widely accessible or taught, and is used by just 1% of the blind population.

In 2018, Dr. Yi Yang at Penn State studied LiDAR (a 3-D mapping technology) as an aid to visualization for the blind. The study revealed enormous potential, but at the time, a LiDAR scanner cost $70,000 and was the size of a shoebox.

Fortunately, recent iPhone updates added LiDAR scanners as standard hardware, making what was previously inaccessible widely available.

Combining this technology with the fundamentals of echolocation, we developed Eyedar to bring this life-changing skill to the world.

Describe the idea

Echolocation is a powerful tool the blind can use to visualize the world around them. By making clicking sounds which bounce off surfaces and return to the ear, they can ‘see’ their surroundings. However, only 1% of the blind have learned this skill.

We wanted to make visualizing more accessible for the blind. With LiDAR now onboard many iPhones, we can.

Eyedar uses the phone’s LiDAR scanner to create a map of a user’s environment, then translates this data into a 3-D soundscape. Changes in pitch, volume, and spatial sound convey information on the size, shape, distance, and direction of objects.

Eyedar provides sequential training, beginning with basic sound and obstacle recognition and progressively challenging users to visualize more complex soundscapes and navigate with greater confidence. Over time, the process is designed to become second nature, allowing users to build a clearer picture of the world around them.

What were the key dates in the development process?

Horizon is a healthcare company that manufactures multiple disease treatments, including treatments for two diseases that can cause blindness. In addition to their commitment to preventing vision loss, Horizon sponsors and supports accessibility solutions for anyone who is blind or visually impaired.

We knew that echolocation could allow the blind to visualize with sound, allowing them to navigate with greater confidence. But only 1% of the blind community practice this skill because it’s not widely taught.

We developed Eyedar over the course of 4 years to make echolocation a learnable and accessible skill for the entire global blind community:

September 2018

Completion of study by our technology consultant, Dr. Yi Yang, professor at Pennsylvania State University, whose research served as our proof-of-concept for LiDAR-based echolocation and visualization. tinyurl.com/3t8ep68k

October 2020

LiDAR became a standard hardware feature of new iPhones. The potential for an app like Eyedar to make echolocation-like skills realistically accessible came to life.

September 2021

Initial experiments with LiDAR soundscape translation in iOS.

October 2021-December 2021

iOS app build out. App prototypes were developed and tested. Eyedar was designed exclusively for the blind by a blind-led UX team. As such, it contains no visual interface and is guided exclusively by voiceover and gestures (universal best practices in UX design for the blind).

January 2022-Ongoing

Alpha and beta testing took place in the first quarter of 2022, with select members of the blind community using the app via TestFlight. Their feedback was crucial to the app's refinement and optimization.

April 2022

Eyedar launched on the App Store.

April 2022

The Eyedar codebase was released under an MIT open-source license on GitHub, making it available for continued innovation and customization for the blind community.

April 2022 and Beyond

Launching is only the beginning. Eyedar is currently compatible with the iPhone 12 Pro and 13 Pro, but as LiDAR and other computer vision technology becomes standard on phones, it will become even more accessible.

Describe the innovation / technology

Eyedar emits a LiDAR signal, creating a 3-D map of the user's surroundings. A 100-point vertical line array sweeps from left to right across the LiDAR field, and each point is assigned a pitch tone from low to high, bottom to top. As the line sweeps the field, each point emits its assigned tone, with volume rising for closer objects and falling for more distant ones. The result is, to the untrained ear, a lot of noise. But with practice and training, the mind becomes accustomed to the noise, patterns begin to emerge, and a clearer mental picture of the world is achievable.
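The sweep described above can be sketched in a few lines. This is a hypothetical illustration, not Eyedar's actual code: the point count (100) comes from the description, while the frequency range, maximum depth, and all function names are assumptions chosen for clarity.

```python
# Hypothetical sketch of the sweep-to-sound mapping described above.
# A vertical array of 100 depth samples sweeps left to right; each point
# gets a pitch by height (lower tone = bottom of the frame) and a volume
# by proximity (louder = closer). All constants below are assumed values.

NUM_POINTS = 100                      # vertical samples per column (from the description)
MIN_FREQ, MAX_FREQ = 200.0, 2000.0    # assumed audible pitch range, in Hz
MAX_RANGE = 5.0                       # assumed LiDAR depth cutoff, in meters

def pitch_for_row(row: int) -> float:
    """Map a vertical index (0 = bottom) to a tone frequency in Hz."""
    t = row / (NUM_POINTS - 1)
    return MIN_FREQ + t * (MAX_FREQ - MIN_FREQ)

def volume_for_depth(depth_m: float) -> float:
    """Map distance to loudness: 1.0 right at the user, 0.0 at max range."""
    clamped = min(max(depth_m, 0.0), MAX_RANGE)
    return 1.0 - clamped / MAX_RANGE

def sweep(depth_columns):
    """Yield (frequency, volume) tone pairs column by column, left to right.

    `depth_columns` is a list of columns; each column holds NUM_POINTS
    depth readings in meters, index 0 = bottom of the frame.
    """
    for column in depth_columns:
        yield [(pitch_for_row(r), volume_for_depth(d))
               for r, d in enumerate(column)]
```

In a real app, each (frequency, volume) pair would drive a spatialized audio oscillator; the sketch stops at the mapping itself, which is the part the description specifies.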

For example, the floor, a chair on the floor, a wall, an open doorway, and a tree branch each emit their own sound patterns, which the user can learn to recognize and interact with using dynamic sound changes within the app.

Several prior fMRI studies have shown that with visual-audio sensory substitution, and with echolocation specifically, the brain’s visual cortex processes the information with top-down thinking that “fills in the blanks”. This is thought to explain the phenomenon reported in several prior studies of subjects having a “vision-like experience” using these techniques.

Describe the expectations / outcome

Eyedar has the potential to redefine what it means to be blind. Echolocation has already shown how it can positively impact the lives of those who use it, and the only impediment to widespread adoption is the accessibility of learning the skill. Eyedar removes that impediment.

Eyedar is available for free in the App Store, and its open-sourced design allows for continued innovation and customization to best benefit the blind community. The app works with LiDAR-compatible phones, including the iPhone 12 and 13 (Pro and Pro Max). As LiDAR and other computer vision technology becomes standard on phones, it will become even more accessible. With about 50 million blind people in the world, Eyedar can reach an unprecedented proportion of them, making echolocation a learnable and accessible skill.
