Cannes Lions

VR/AR Emotion & Behavior Measurement and Analytics platform

ISOBAR, New York / 2018

Presentation Image
Case Film

OVERVIEW

Description

We noticed consumers were waiting to buy VR headsets because, first, they were expensive and, second, VR content was scarce and of relatively low quality. This presented a giant “chicken and egg” problem for the industry: brands were unwilling to invest because there weren’t enough consumers, and consumers were unwilling to commit because the content was lacking. This is where we saw our opportunity. The intention of this innovation was to solve the “chicken and egg” problem by allowing brands to measure the ROI of their VR/AR/MR experiences and ensure that those experiences engage consumers and get the desired results. At a minimum, this gives brands an opportunity to validate their investments in extended reality. The bigger opportunity, however, is to use this innovation to iteratively optimize extended reality content so that it elicits the desired emotional responses at the right points in the narrative arc, and for the right durations and intensities.

Execution

The platform combines Isobar’s psychophysiological testing capabilities, its proprietary motivational/emotional measurement tool, MindSight®, and its behavioral tracking system into a holistic, objective system that measures human emotional response to extended reality experiences. These tools allow us to measure emotional states in real time, using two measures of emotional valence: electroencephalography and facial electromyography; two measures of emotional arousal: electrodermal activity and heart rate variability; and two measures of attentional focus: eye tracking and behavioral tracking. This data is filtered, processed, and run through ML algorithms that classify emotions every second.
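
To make the per-second classification step concrete, the following is a minimal, hypothetical sketch of how one-second windows of the six signals could be turned into discrete emotion labels. The feature names, window contents, and the choice of a random-forest classifier are illustrative assumptions, not Isobar’s actual MindSight® implementation.

# Hypothetical sketch of the per-second classification step described above.
# Feature names, window length, and the classifier choice are assumptions.
from dataclasses import dataclass
from typing import List
import numpy as np
from sklearn.ensemble import RandomForestClassifier

@dataclass
class BiometricWindow:
    """One second of pre-filtered signals, already aligned on a common clock."""
    eeg_frontal_asymmetry: float   # valence cue from EEG
    emg_zygomatic_rms: float       # valence cue from facial EMG
    eda_phasic_peaks: float        # arousal cue from electrodermal activity
    hrv_rmssd: float               # arousal cue from heart rate variability
    gaze_on_target_ratio: float    # attention cue from eye tracking
    head_movement_speed: float     # attention cue from behavioral tracking

    def as_features(self) -> np.ndarray:
        return np.array([
            self.eeg_frontal_asymmetry, self.emg_zygomatic_rms,
            self.eda_phasic_peaks, self.hrv_rmssd,
            self.gaze_on_target_ratio, self.head_movement_speed,
        ])

def classify_per_second(windows: List[BiometricWindow],
                        model: RandomForestClassifier) -> List[str]:
    """Map each one-second window to a discrete emotion label."""
    features = np.stack([w.as_features() for w in windows])
    return list(model.predict(features))

In practice, the classifier would first be fitted on labeled calibration sessions before being used to score new experiences.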

The platform is designed to work across various types of interactive and passive content. Because so much is still unknown about the value of VR/AR experiences, we have focused much of our ongoing R&D and accelerated development on integration with both Unity and Unreal Engine.
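
As a rough illustration of what those engine integrations involve on the analytics side, the sketch below timestamps scene markers sent from a Unity or Unreal build so they can later be aligned with the biometric stream. The port, message format, and marker names are assumptions for illustration; the engine-side plugins themselves are not shown.

# Minimal sketch of an analytics-side listener that timestamps scene markers
# sent from the Unity or Unreal build. Port and message format are assumed.
import json
import socket
import time
from typing import Iterator

def listen_for_markers(port: int = 9000) -> Iterator[dict]:
    """Yield JSON markers such as {"event": "scene_start", "scene": "lobby"},
    stamped with the receive time on the same clock as the biometric recorder."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        payload, _addr = sock.recvfrom(4096)
        marker = json.loads(payload)
        marker["received_at"] = time.time()  # shared clock for later alignment
        yield marker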

We can see changes in arousal and heart rate, and we can read subjective positivity or negativity in the tiny electrical impulses of facial muscle contractions and in frontal hemispheric brainwave activity. Brands that invest in emerging technology now can have precise and sensitive methods for measuring stumbling blocks to successful activation, without depending on self-reports, cognitive effort, or users’ willingness to give an honest answer.
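
One concrete example of the frontal brainwave cue is the widely used frontal alpha asymmetry index, sketched below under the assumption that valence is read from the log ratio of alpha power at F4 versus F3. The channel names and sampling rate are illustrative; the production pipeline may compute or weight this cue differently.

# Sketch of a frontal alpha asymmetry valence index (an assumption, not
# necessarily the exact index used by the platform).
import numpy as np
from scipy.signal import welch

def alpha_power(signal: np.ndarray, fs: float) -> float:
    """Integrate power spectral density over the 8-13 Hz alpha band."""
    freqs, psd = welch(signal, fs=fs, nperseg=int(fs * 2))
    band = (freqs >= 8) & (freqs <= 13)
    return float(np.trapz(psd[band], freqs[band]))

def frontal_alpha_asymmetry(f3: np.ndarray, f4: np.ndarray, fs: float = 256.0) -> float:
    """Positive values indicate greater relative left-frontal activity,
    commonly read as approach / positive valence."""
    return float(np.log(alpha_power(f4, fs)) - np.log(alpha_power(f3, fs)))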

Outcome

The long-term outcome goes beyond biometric and behavioral measurement to understanding the emotional state of users. We have started development on an integration with an ML back-end that will allow us to rapidly identify users’ emotional states through AI processing of the captured biometric and behavioral data. The platform is also used to optimize the development of VR experiences for a range of clients, who are finding opportunities to enhance experiences by eliminating sources of frustration and confusion and by adding emotionally satisfying rewards at points where engagement would begin to flag. We have been hired by a variety of clients in entertainment, retail, and healthcare to conduct project-based evaluations and optimizations. Our parent company is investing strongly in the development of a scalable version of Isobar’s system that leverages HMDs with integrated biometrics (EMG, EDA, PPG), with the option of adding EEG.
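
As a simple illustration of how the per-second emotion labels could feed that optimization loop, the hypothetical sketch below scans a session for sustained runs of frustration or confusion and returns the time ranges a content team should revisit. The label names and duration threshold are assumptions.

# Hypothetical sketch of the optimization step: flag sustained negative
# response so those moments in the experience can be reworked.
from typing import List, Tuple

NEGATIVE_LABELS = {"frustration", "confusion"}

def problem_segments(labels: List[str], min_duration_s: int = 5) -> List[Tuple[int, int]]:
    """Return (start_second, end_second) spans of sustained negative response."""
    segments, start = [], None
    for t, label in enumerate(labels):
        if label in NEGATIVE_LABELS:
            start = t if start is None else start
        else:
            if start is not None and t - start >= min_duration_s:
                segments.append((start, t))
            start = None
    if start is not None and len(labels) - start >= min_duration_s:
        segments.append((start, len(labels)))
    return segments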