
PROJECT CONVEY

180LA, Los Angeles / COX COMMUNICATIONS / 2022

Awards:

Shortlisted Cannes Lions

Overview

Why is this work relevant for Innovation?

Project Convey is an innovative solution designed to help people on the autism spectrum communicate more effectively on video chat. The prototype addresses a core problem facing this underrepresented audience: difficulty interpreting other people's emotions over video chat. By combining existing artificial intelligence tools with a number of customized AI models, we created what we believe is the first prototype to bring speech-to-text, tone analysis and facial expression recognition together, processing all three in real time.

Background

At a time when video chat use was at its highest because of the pandemic, people on the spectrum were having a particularly hard time, because the typical digital environment of video chat platforms makes non-verbal cues, which many already struggle to read, even harder to interpret.

To solve this, we drew inspiration from a mixture of existing AI tools and customized AI models. Seeing them work individually allowed us to imagine what would be possible if they worked together in a unique combination. That combination became the basis for a new video chat prototype that translates emotion into emojis, pulling real-time data from tone of voice, word selection, and facial expression recognition.

Limitations:

- 1:1 conversations only

- Accuracy is affected by variables such as lighting conditions, users wearing glasses, and the background

- Because the prototype did not yet support user accounts, machine learning that optimizes the tool for an individual user's face was not possible; adding this as the tool advances would make a significant difference

Describe the idea

Every day, millions of people connect through video chat. But for people on the spectrum, this connection is much harder to make, because many can't read non-verbal cues, like facial expressions, which are key to connecting on a deeper level.

In response, Cox created Project Convey, a video chat prototype that helps people on the spectrum connect better with the person on the other side of the screen. Using AI, the prototype analyzes three inputs (facial expressions, speech, and tone) and translates the emotion into an emoji in real time. Cox worked with members of the autism community to design the tool, from UX and UI to color, sound, and the emojis themselves. Those who trialed our prototype experienced more engaging and meaningful video calls than they had on other video chat platforms. The experience was not just improved for the person with ASD; the technology created a better experience for both parties.
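
To make the "emotion into an emoji" step concrete, here is a minimal sketch of how a detected emotion label might be turned into an on-screen emoji that morphs smoothly instead of flickering frame by frame. The emotion labels, emoji choices, and smoothing window are assumptions for illustration, not details taken from the Project Convey prototype.

```typescript
// Hypothetical sketch of the emotion-to-emoji translation step.
// The emotion labels, emoji mapping, and smoothing window are assumptions,
// not taken from the Project Convey codebase.

type Emotion = "happy" | "sad" | "angry" | "surprised" | "neutral";

const EMOJI: Record<Emotion, string> = {
  happy: "😊",
  sad: "😢",
  angry: "😠",
  surprised: "😮",
  neutral: "😐",
};

// Keep a short history of per-frame estimates so the displayed emoji
// only changes when an emotion is sustained, giving a "morphing" feel
// rather than frame-by-frame flicker.
const history: Emotion[] = [];
const WINDOW = 15; // ~0.5 s at 30 analysis frames per second (assumed)

export function emojiForFrame(current: Emotion): string {
  history.push(current);
  if (history.length > WINDOW) history.shift();

  // Majority vote over the recent window.
  const counts = new Map<Emotion, number>();
  for (const e of history) counts.set(e, (counts.get(e) ?? 0) + 1);

  let best: Emotion = "neutral";
  let bestCount = 0;
  for (const [e, c] of counts) {
    if (c > bestCount) {
      best = e;
      bestCount = c;
    }
  }
  return EMOJI[best];
}
```

Smoothing over a short window is one plausible way to produce the morphing behaviour described above without the emoji jumping on every analysis frame.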

What were the key dates in the development process?

November 2021 - December 2021: Initial discovery phase, consisting of interviews with the autistic community to understand their needs and evaluate what would be useful to them, along with exploration of UX/UI design routes.

December 2021 - March 2022: Development and testing phase.

March 31, 2022: Prototype released.

Describe the innovation / technology

How it works: The user logs on to the video chat platform and is presented with a menu, where they enter their name and indicate whether or not they are on the autism spectrum. They can then make video calls. For users on the spectrum, the emotions of the person on the other side of the screen are translated into an emoji that morphs to match that person's emotion.
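
A minimal sketch of that setup flow, assuming the prototype tracks a per-participant flag chosen on the menu and shows the emoji overlay only to the participant who identified as being on the spectrum; the names, types, and fields here are illustrative, not the prototype's actual code.

```typescript
// Hypothetical model of the call setup described above.
// Field and function names are illustrative only.

interface Participant {
  name: string;
  onSpectrum: boolean; // chosen on the entry menu
}

interface CallSession {
  self: Participant;
  peer: Participant;
}

// The emoji overlay translates the *peer's* emotion, and is shown only
// to a participant who identified as being on the spectrum.
function shouldShowEmojiOverlay(session: CallSession): boolean {
  return session.self.onSpectrum;
}

// Example: Alex (on the spectrum) calls Sam.
const session: CallSession = {
  self: { name: "Alex", onSpectrum: true },
  peer: { name: "Sam", onSpectrum: false },
};

console.log(shouldShowEmojiOverlay(session)); // true: Alex sees Sam's emotions as an emoji
```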

Components/platforms: We used a mixture of existing AI tools and customized AI models, brought together in a unique combination to create a new video chat prototype that reads emotion from a person's words, tone of voice, and facial expressions. It processes that data in real time, delivering a visual representation of the emotion as an emoji on the video chat. In terms of process, Morphcast was used to recognize and interpret facial expressions; Sentiment AI was used for speech-to-text processing, and that information was then fed to IBM Watson to identify the emotion; and customized AI models were used to analyze tone of voice.
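
The paragraph above implies three analysis channels feeding a single emotion estimate. The sketch below shows one plausible way to fuse them; the real Morphcast, speech-to-text/IBM Watson, and custom tone-model integrations are not documented in this entry, so the score format and channel weights are assumptions rather than a record of the actual implementation.

```typescript
// Hypothetical fusion of the three analysis channels described above.
// The channel scores are passed in as plain data because the real
// Morphcast / speech-to-text + IBM Watson / custom tone-model APIs are
// not documented here; only the fusion logic is sketched.

type Emotion = "happy" | "sad" | "angry" | "surprised" | "neutral";
type Scores = Record<Emotion, number>; // each channel scores each emotion in [0, 1]

interface Channels {
  face: Scores; // e.g. from a facial-expression SDK such as Morphcast
  text: Scores; // e.g. speech-to-text fed to a sentiment/emotion service
  tone: Scores; // e.g. customized tone-analysis models
}

// Assumed channel weights; in practice these would be tuned through user testing.
const WEIGHTS = { face: 0.5, text: 0.25, tone: 0.25 };

// Combine the three per-channel score maps into one emotion label.
export function fuseEmotion(ch: Channels): Emotion {
  let best: Emotion = "neutral";
  let bestScore = -Infinity;
  for (const e of Object.keys(ch.face) as Emotion[]) {
    const fused =
      WEIGHTS.face * ch.face[e] + WEIGHTS.text * ch.text[e] + WEIGHTS.tone * ch.tone[e];
    if (fused > bestScore) {
      best = e;
      bestScore = fused;
    }
  }
  return best;
}
```

Keeping the fusion step independent of whichever services produce the scores is one way a mixture of off-the-shelf and custom models could be swapped in and out during development.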

Development stage: The tool is available via a web application and is currently in the prototype stage.

Describe the expectations / outcome

The purpose of the prototype was to make video chat more inclusive by helping those on the autism spectrum understand the emotions of others and make more meaningful connections on video chat. As far as we know, this is the first communications tool to help this audience in their online social activities, marking an important milestone within the communications industry.

The aim of the project was to generate excitement around this technology, especially among video chat companies that would benefit from this feature. The hope is that this project inspires others to make all video chat platforms more inclusive in the future.
