Health and Wellness > Consumer Products Promotion
AREA 23, AN FCB HEALTH NETWORK COMPANY, New York / WAVIO / 2019
Overview
Why is this work relevant for Creative Data?
Everyone should be aware of the things that are happening in their home. Yet, for the millions of people living with disabling hearing loss, this still isn’t the case. Recognizing this unmet need, we invented a visual way to significantly change the day-to-day lives of the Deaf and hard of hearing.
See Sound is the world’s first smart home hearing system for the Deaf. When a noise occurs in the home, See Sound verifies the noise against our data library and translates the sound into a simple visual readout. Now, the Deaf can finally see sound.
Describe any restrictions or regulations regarding Health & Wellness communications in your country/region including:
According to the US Federal Trade Commission, advertisements must be truthful, cannot be deceptive or unfair, and must be evidence-based. In addition, the Food and Drug Administration advises that disease-related advertisements/communications be disease or health condition-specific, enhance education, be clear and accurate, and contain a responsible public health message.
Health & Wellness work must demonstrate how it meets the criterion of 'life-changing creativity'. Why is your work relevant for Health & Wellness?
See Sound is the world’s first smart home hearing system for the Deaf.
Wavio is a Deaf-led health technology company focused wholly on creating accessibility solutions for the Deaf. Collaborating with the National Association of the Deaf, the FCC’s Disability Rights Office, and the American School for the Deaf, we hope to bring See Sound to Deaf homes across the country. This idea gives the Deaf community a renewed sense of freedom and control in their home environments while making them safer and more connected than ever. Finally, a smart device that helps the Deaf see sound.
Background
The current products on the market for the Deaf and hard of hearing are extremely limited. Assistive hearing devices—such as cochlear implants—help indicate or amplify sound, but do not assign meaning to what’s heard. Other visual cues—such as flashing lights—help to notify of a doorbell, incoming phone call, or fire alarm, but each serves only a single purpose.
There is no product available that can differentiate sounds. This leaves consumers with a handful of single-sound devices, but nothing that can distinguish a microwave beeping from a baby crying, glass breaking, and so on.
The limiting factor in inventing a product like this has been the lack of data. It would take literally millions of sound samples to create a machine-learning model that could report with any level of confidence. And that's exactly the problem we solved with See Sound, sourcing sound data from 2 million YouTube audio clips.
Describe the idea/data solution
People who are Deaf and hard of hearing lack situational awareness: the innate ability to know what’s going on around you, which is largely driven by hearing. It’s easy to take for granted, but not being able to access sound data is disorienting and can mean life or death for the millions of people living with disabling hearing loss.
See Sound is the world’s first smart home hearing system for the Deaf. Simply plug the See Sound unit(s) directly into a wall outlet and connect via Wi-Fi to the app on your phone. The more units you install, the more data is captured throughout the home.
Since Deaf people rely on their vision, we created a simple visual experience for them. When a sound occurs, the closest See Sound unit interprets the data via the machine-learning model and makes a prediction based on its confidence level, visualizing the sound data—like duration, frequency, and location—for users on their smart devices.
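The detection-to-alert flow could look something like the sketch below. The class, field, and threshold names are illustrative assumptions; the actual See Sound implementation is not public.

```python
# Hypothetical sketch of turning a model prediction into a visual alert.
# SoundEvent, CONFIDENCE_THRESHOLD, and the payload shape are assumptions,
# not the real See Sound code.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional


@dataclass
class SoundEvent:
    label: str           # predicted sound type, e.g. "doorbell"
    confidence: float    # model confidence in [0, 1]
    duration_s: float    # how long the sound lasted
    peak_freq_hz: float  # dominant frequency of the clip
    unit_location: str   # which plugged-in unit heard it, e.g. "kitchen"


CONFIDENCE_THRESHOLD = 0.7  # assumed cutoff below which no alert is sent


def to_alert(event: SoundEvent) -> Optional[dict]:
    """Convert a prediction into a push-notification payload for the app."""
    if event.confidence < CONFIDENCE_THRESHOLD:
        return None  # not confident enough to notify the user
    return {
        "title": f"{event.label.title()} detected in the {event.unit_location}",
        "duration_s": round(event.duration_s, 1),
        "peak_freq_hz": round(event.peak_freq_hz),
        "confidence": round(event.confidence, 2),
        "timestamp": datetime.now().isoformat(timespec="seconds"),
    }


if __name__ == "__main__":
    print(to_alert(SoundEvent("doorbell", 0.91, 1.4, 620.0, "hallway")))
```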
This product gives the Deaf community a renewed sense of freedom and control in their homes by finally enabling them to see sound.
Describe the data driven strategy
Five percent of the world’s population suffers from disabling hearing loss, which includes the Deaf and hard of hearing. While everyone should be aware of the things that are happening in their home, this still isn’t the case for this population. This lack of situational awareness means that when you’re Deaf, if you don’t see it, it’s as if it never happened.
Since the Deaf can’t hear the sounds in their homes, we had to find a way for them to access sound visually. So we invented See Sound to interpret household sound data and deliver visual alerts directly to users’ mobile devices.
See Sound is powered by a machine-learning model trained on millions of sound samples from YouTube, allowing it to report sound data with a high level of accuracy. During this large-scale data aggregation by Google, over 2 million YouTube audio clips were analyzed, categorized, and converted into 10-second sound clips—each containing a single, discrete sound. Each type of sound in our data model is composed of data from several thousand YouTube audio samples—all adding up to a massive library of data that is accessed by our device and visualized for users.
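To make the structure of such a library concrete, here is a minimal sketch of how labeled 10-second clips might be indexed for training. The directory layout and names are assumptions, not the actual Wavio/Google pipeline.

```python
# Illustrative index of a labeled clip library.
# Assumed layout: sound_library/<label>/<clip>.wav
from collections import defaultdict
from pathlib import Path

CLIP_ROOT = Path("sound_library")  # hypothetical root folder


def index_clips(root: Path) -> dict:
    """Map each sound label (e.g. 'glass_breaking') to its clip files."""
    library = defaultdict(list)
    for wav in root.glob("*/*.wav"):
        library[wav.parent.name].append(wav)
    return dict(library)


if __name__ == "__main__":
    library = index_clips(CLIP_ROOT)
    for label, clips in sorted(library.items()):
        print(f"{label}: {len(clips)} ten-second samples")
```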
Describe the creative use of data, or how the data enhanced the creative output
See Sound is powered by a machine-learning model which was trained on sound data from over 2 million YouTube audio clips. During training, the model takes the labeled sound clips and converts them into sets of numbers, known as tensors. It then uses an algorithm to mine these tensors, looking for patterns between similarly labeled sounds. Once it finds similarities, it catalogs the patterns to differentiate one sound type from another. The model is then able to continue making these assessments even when it encounters new, unlabeled sounds.
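The paragraph above describes a standard supervised classification loop. The self-contained sketch below illustrates the idea end to end, with synthetic tones standing in for real clips and a small scikit-learn classifier; it is an assumption-laden stand-in, not the See Sound model itself.

```python
# Minimal train-then-classify demo: labeled "clips" become feature tensors,
# a classifier learns the patterns, then labels a new, unseen sound.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

SAMPLE_RATE = 16_000
CLIP_SECONDS = 1  # real clips are 10 s; shortened here to keep the demo fast


def synth_clip(base_freq: float, rng: np.random.Generator) -> np.ndarray:
    """Fake 'household sound': a noisy tone at roughly base_freq Hz."""
    t = np.linspace(0, CLIP_SECONDS, SAMPLE_RATE * CLIP_SECONDS, endpoint=False)
    tone = np.sin(2 * np.pi * base_freq * (1 + 0.05 * rng.standard_normal()) * t)
    return tone + 0.3 * rng.standard_normal(t.shape)


def features(clip: np.ndarray) -> np.ndarray:
    """Convert a waveform into a small vector of spectral-band energies."""
    spectrum = np.abs(np.fft.rfft(clip))
    bands = np.array_split(spectrum, 16)  # 16 coarse frequency bands
    return np.array([band.mean() for band in bands])


rng = np.random.default_rng(0)
labels = {"doorbell": 700.0, "smoke_alarm": 3200.0}  # assumed example classes

# Build a labeled training set, one feature vector per clip.
X, y = [], []
for name, freq in labels.items():
    for _ in range(50):
        X.append(features(synth_clip(freq, rng)))
        y.append(name)

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)).fit(X, y)

# Classify a new, unlabeled sound the model has never seen.
new_clip = synth_clip(3200.0, rng)
probs = model.predict_proba([features(new_clip)])[0]
print(dict(zip(model.classes_, probs.round(3))))
```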
One exciting output of this process is the visualization of sound waves for the user in the app. By exposing Deaf users to the sound waves associated with each sound type, users can start to recognize a sound based on the wave alone. Just as our model interprets the data, users can recognize consistent patterns in the waves, like frequency and intensity. For users who have never been able to hear a sound and connect it to an event, viewing these sound waves could be a way to learn to differentiate between types of household noises.
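A waveform view of this kind could be mocked up in a few lines; the two synthetic sounds and plotting choices below are purely illustrative, not the app's actual design.

```python
# Illustrative in-app waveform view: two sounds with visibly different
# frequency and intensity patterns.
import numpy as np
import matplotlib.pyplot as plt

SAMPLE_RATE = 16_000
t = np.linspace(0, 1, SAMPLE_RATE, endpoint=False)

# Low, pulsing hum vs. a high, intermittent beep.
microwave = 0.6 * np.sin(2 * np.pi * 120 * t) * (0.5 + 0.5 * np.sin(2 * np.pi * 2 * t))
smoke_alarm = np.sin(2 * np.pi * 3200 * t) * (np.sin(2 * np.pi * 4 * t) > 0)

fig, axes = plt.subplots(2, 1, sharex=True, figsize=(8, 4))
for ax, wave, title in zip(axes, [microwave, smoke_alarm], ["Microwave hum", "Smoke alarm"]):
    ax.plot(t, wave, linewidth=0.5)
    ax.set_title(title)
    ax.set_ylabel("Amplitude")
axes[-1].set_xlabel("Time (s)")
plt.tight_layout()
plt.show()
```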
List the data driven results
Having earned 3 patents and invested $160,000+ over the last 4 years, we’re ready to launch See Sound worldwide. We are petitioning local, state, and federal US governments to introduce this product under the Americans with Disabilities Act’s (ADA’s) scope of accessible technology, which would allow the government to cover the cost of distributing the See Sound device.
Our pricing model is still proprietary, but we anticipate pricing similar to Google Home or Amazon Echo. If See Sound were placed in just 5% of Deaf homes in the US (9 million homes), we estimate approximately 450,000 units delivered. If we achieve our goal of being covered under the ADA, we forecast placement in 50% of homes, amounting to 4.5 million units.
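As a quick check, those unit figures follow directly from the 9-million-home estimate quoted above:

```python
# Back-of-the-envelope check of the quoted projections.
DEAF_HOMES_US = 9_000_000  # figure stated in the entry

for share in (0.05, 0.50):
    print(f"{share:.0%} adoption -> {int(DEAF_HOMES_US * share):,} units")
# 5% adoption -> 450,000 units
# 50% adoption -> 4,500,000 units
```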
See Sound is poised to drastically change the way Deaf people understand and interact with sounds. For example, the more users visualize sound waves and see sound reports, the more easily they can connect a spike in volume to a particular household occurrence.
At its core, sound is just data, and we allow the Deaf to visualize that data in their homes, empowering them to lead more independent lives. Finally, a smart device that helps the Deaf see sound.