
GIVE A HAND - BUILDING THE WORLD’S LARGEST OPEN-SOURCE IMAGE LIBRARY OF HANDS

HELLO MONDAY/DEPT®, Aarhus / AMERICAN SOCIETY FOR DEAF CHILDREN / 2023

Awards:

Shortlisted Eurobest
Overview

Why is this work relevant for Creative Data?

GiveAHand.ai was developed alongside the American Society for Deaf Children, to create an extensive dataset of hands, fully tagged with 3D keypoints.

With current technology we are able to detect simple hand gestures, but to identify the full spectrum of sign language we need better hand and finger detection.

Hands have amazingly complex geometry, so AI needs to learn from many different shapes and arrangements; the more diverse and varied the hand data, the better. The database forms the foundation for researchers to improve machine learning and build better digital language tools for deaf and hard-of-hearing people.

Background

Technology has revolutionized the way we communicate, breaking down barriers across languages and enabling greater connection and understanding, ultimately leading to more inclusion and greater accessibility.

Live audio transcription and translation tools have limitations for the deaf and hard-of-hearing due to sign language's complex combination of fast-paced hand gestures, facial expressions, and full body movements. While machine learning models can handle facial expressions and body movements, detecting hand and finger movements remains a challenge.

With current technology we are able to detect simple hand gestures, but to understand the full spectrum of sign language we need better hand and finger detection.
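To illustrate what hand and finger detection looks like with today's off-the-shelf tools (a minimal sketch, not the project's own pipeline), the open-source MediaPipe Hands model predicts 21 three-dimensional keypoints per detected hand:

```python
import cv2
import mediapipe as mp

# Minimal sketch: run the open-source MediaPipe Hands model on a single image.
# The model predicts 21 landmarks per detected hand, each with x, y, z coordinates.
# "hand.jpg" is a hypothetical input file.
image = cv2.imread("hand.jpg")
with mp.solutions.hands.Hands(static_image_mode=True, max_num_hands=2) as hands:
    results = hands.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))

if results.multi_hand_landmarks:
    for hand in results.multi_hand_landmarks:
        for i, lm in enumerate(hand.landmark):
            # x and y are normalized to the image size; z is depth relative to the wrist.
            print(i, lm.x, lm.y, lm.z)
```

Models like this handle isolated, simple poses well; the diverse, fully tagged images GiveAHand.ai collects are aimed at improving exactly this kind of detection for the harder cases that sign language involves.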

Please provide any cultural context that would help the jury understand any cultural, national or regional nuances applicable to this work e.g. local legislation, cultural norms, a national holiday or religious festival that may have a particular meaning.

The WHO revealed that 466 million people were living with hearing loss in 2019, which amounts to about 5% of the world population, with 432 million (93%) of them being adults and 34 million (7%) being children. The WHO also estimated that this number would nearly double, to over 900 million people, by 2050.

Today, 90 to 95% of deaf children are born to hearing parents who often don’t know sign language and therefore will likely struggle to teach it before their children enter school. Even among school-aged deaf children, it’s estimated that at most 40% of families use sign language at home. Given this data, it is evident that a majority of deaf children are still deprived of language.

There is a need to advance sign language technology and create accessibility equal to the language tools available for spoken languages.

Describe the Creative idea / data solution

How do we make sure that the deaf and hard of hearing get equal access to language tools?

And how can we help accelerate the development of better hand and finger detection, so that these tools can ultimately be created?

Launched to celebrate American Sign Language Day (April 15th), GiveAHand.ai uses tech for good. One hundred percent crowdsourced, the data collected on the platform generates a diverse dataset of hands: diverse shapes, colors, backgrounds and gestures.

Anyone from anywhere can put their hands to good use by contributing and uploading images, helping to build an image library that will help unlock sign language. Researchers can then download these fully tagged images and use them to improve machine learning models that will allow the detection and translation of the full spectrum of sign language.

Describe the data driven strategy

GiveAHand.ai was developed alongside the American Society for Deaf Children, to create an extensive dataset of hands, fully tagged with 3D keypoints.

Each hand ‘donated’ to the database becomes another data point. By creating one of the world’s biggest open-source datasets of hands, we are moving closer to finding ways to enhance communication for those who often feel misunderstood. Since we have reached an average of one hand per visitor, we know that giving a hand is not a big ask, but collectively this dataset can remove a big obstacle in research.
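As a rough sketch of what one such data point could contain (the field names below are illustrative assumptions, not the actual GiveAHand.ai schema), each contributed image would carry its descriptive tags together with per-keypoint 3D coordinates:

```python
# Illustrative sketch only: a hypothetical structure for one annotated hand record.
# All field names are assumptions for illustration, not the actual GiveAHand.ai schema.
hand_record = {
    "image": "hand_000123.jpg",          # crowdsourced photo
    "tags": {
        "gesture": "open_palm",          # gesture label
        "background": "indoor",
        "skin_tone": "type_IV",          # e.g. a Fitzpatrick-style tone label
    },
    "keypoints_3d": [                    # one (x, y, z) entry per hand landmark
        {"name": "wrist", "x": 0.42, "y": 0.63, "z": 0.00},
        {"name": "thumb_tip", "x": 0.35, "y": 0.48, "z": -0.02},
        # ... remaining landmarks
    ],
}
```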

Describe the creative use of data, or how the data enhanced the creative output

GiveAHand.ai relies entirely on crowdsourcing for data collection, which results in a comprehensive dataset covering a wide range of hand-related attributes. The data collected on the platform generates a diverse dataset of hands: diverse shapes, colors, backgrounds and gestures.

List the data driven results

We launched on American Sign Language Day (April 15th), and despite having no media budget, we reached conversions far beyond what we projected. On average each visitor contributed a hand, which means one visitor equals one hand. And within the first five days we were already a third of the way towards becoming the world’s largest fully tagged, open dataset for finger and hand detection.

AI researchers are already exploring ways to use this data to improve their current machine learning models.

As the dataset grows, so does the opportunity to create better hand models and a sustainable solution for overcoming the sign language barrier for good.

More Entries from Creative Data Collection & Research in Creative Data (24 items)

Grand Prix Cannes Lions: THE ARTOIS PROBABILITY (Data-enhanced Creativity) - Anheuser-Busch InBev, Stella Artois, GUT

More Entries from HELLO MONDAY/DEPT® (10 items)

Shortlisted Cannes Lions: GIVE A HAND (Technological Achievement in Digital Craft) - HELLO MONDAY/DEPT®