ISOBAR, New York / 2018
Campaign Description
We noticed consumers were waiting to buy VR headsets because, first, they were expensive and, second, VR content was scarce and of relatively low quality. This presented a giant “chicken and egg” problem for the industry: brands were unwilling to invest because there weren’t enough consumers, and consumers were unwilling to commit because the content was lacking. This is where we saw our opportunity. The intention of this innovation was to solve the “chicken and egg” problem by allowing brands to measure the ROI of their VR/AR/MR experiences, ensuring that they engage consumers and get the desired results. Minimally, this gives brands an opportunity to validate their investments in extended reality. The bigger opportunity, however, is to use this innovation to iteratively optimize extended reality content so that it elicits the desired emotional responses at the right points, and for the right durations and intensities, within the narrative arc.
Execution
The platform combines Isobar’s psychophysiological testing capabilities, its proprietary motivational/emotional measurement tool, MindSight®, and its behavioral tracking system into a holistic, objective system that measures human emotional response to extended reality experiences. These tools allow us to measure emotional states in real time, using two measures of emotional valence: electroencephalography and facial electromyography; two measures of emotional arousal: electrodermal activity and heart rate variability; and two measures of attentional focus: eye tracking and behavioral tracking. This data is filtered, processed, and run through ML algorithms that classify emotions every second.
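As an illustration of the per-second classification step described above, the sketch below shows how windowed biometric features could feed an emotion label. This is not Isobar’s actual pipeline: the sampling rate, the choice of EMG and EDA features, and the threshold rules are all illustrative stand-ins for the trained ML models the text describes.

```python
import numpy as np

FS = 256  # assumed sensor sampling rate in Hz (illustrative)

def epoch(signal, fs=FS):
    """Split a 1-D signal into non-overlapping 1-second windows."""
    n = len(signal) // fs
    return signal[:n * fs].reshape(n, fs)

def classify_emotions(emg_zyg, eda, fs=FS):
    """Toy per-second classifier.

    Mean zygomatic-EMG amplitude stands in for a valence measure,
    and the phasic range of the EDA window stands in for an arousal
    measure; a real system would use trained models over all six
    signal streams, not fixed thresholds.
    """
    labels = []
    for emg_win, eda_win in zip(epoch(emg_zyg, fs), epoch(eda, fs)):
        valence = "positive" if np.abs(emg_win).mean() > 0.5 else "negative"
        arousal = "high" if np.ptp(eda_win) > 0.1 else "low"
        labels.append((valence, arousal))
    return labels
```

The one-label-per-second structure mirrors the cadence stated above; everything inside the loop is a placeholder for learned feature extraction and classification.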
The platform is designed to work across various types of interactive and passive content. Because so much is still unknown about the value of VR/AR experiences, we have focused much of our ongoing R&D and accelerated development on integration with both Unity and Unreal Engine.
We can see changes in arousal and heart rate, and can infer subjective positivity or negativity from the tiny electrical impulses of facial muscle contractions and from frontal hemispheric brainwave activity. Brands that invest in emerging technology now can have precise and sensitive methods for measuring stumbling blocks to successful activation without depending on self-reports, cognitive effort, or users’ willingness to give an honest answer.
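One standard way to derive a valence index from frontal hemispheric brainwave activity is frontal alpha asymmetry: the difference in log alpha-band (8–13 Hz) power between right and left frontal electrodes (commonly F4 and F3). The sketch below assumes raw single-channel arrays and a nominal sampling rate; it illustrates the general technique, not the platform’s actual signal processing.

```python
import numpy as np

FS = 256  # assumed EEG sampling rate in Hz (illustrative)

def bandpower(x, fs, lo, hi):
    """Power in the [lo, hi) Hz band via a simple FFT periodogram."""
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    band = (freqs >= lo) & (freqs < hi)
    return psd[band].sum()

def frontal_alpha_asymmetry(f3, f4, fs=FS):
    """ln(right alpha power) - ln(left alpha power) over 8-13 Hz.

    Because alpha power is inversely related to cortical activation,
    a positive value (more right alpha) indicates relatively greater
    left-frontal activity, conventionally read as approach-oriented,
    positive valence.
    """
    alpha_left = bandpower(f3, fs, 8, 13)
    alpha_right = bandpower(f4, fs, 8, 13)
    return np.log(alpha_right) - np.log(alpha_left)
```

In practice a production system would use artifact rejection and a windowed estimator such as Welch’s method rather than a single raw periodogram.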
Outcome
The long-term outcome goes beyond the biometric and behavioral measurement that lets us understand the emotional state of users. We have started development on an integrated ML back-end that will allow us to rapidly identify users’ emotional states through AI processing of the captured biometric and behavioral data. The platform is also used to optimize the development of VR experiences for a range of clients, who are finding opportunities to enhance experiences by eliminating sources of frustration and confusion, and by adding emotionally satisfying rewards at points where engagement would begin to flag. We have been hired by a variety of clients in entertainment, retail, and healthcare to conduct project-based evaluations and optimizations. Our parent company is investing strongly in the development of a scalable version of Isobar’s system that leverages HMDs with integrated biometrics (EMG, EDA, PPG), with the option of adding EEG.
Relevancy
As emerging technologies continue to infiltrate our day-to-day lives, it is essential that brand marketers have the ability to measure the ROI of these new experiences. Our Mixed Reality (VR/AR) Measurement and Analytics platform allows for just that. Without the ability to know if a particular experience is engaging and effective, brands won’t have the motivation or desire to invest in innovative technologies. This platform is essential to the eventual creation of better and more engaging content and as a means to quantify the investment in mixed reality content experiences.
Solution
February 2017: Integration of EEG, EDA, PPG, and EMG sensors with HTC Vive HMD
March 2017: Integration of Pupil Labs sensors with HTC Vive HMD with game capture; latency testing; assemble 360 video reel to generate discrete emotions from a neutral starting point
April 2017: Establish UX testing protocol
May 2017: Continue testing to refine protocol, including saving outputs and video capture, definition of key metrics; validation using Mountain Dew Experience and Wyndham Experience
Week of June 12, 2017: System development announced at IIEX (Atlanta)
Week of September 11, 2017: YuMe/RLTY CHK (Kiss or Kill) VR experience testing (Boston)
Week of September 25, 2017: Unity/Lionsgate Films (Jigsaw) VR experience testing (Boston)
October 24, 2017: Kiss or Kill testing results released (San Francisco)
November 15, 2017: Jigsaw testing results released at Ad Age Next (NYC) and VRevolution (NYC)
Synopsis
In a collaboration with the MIT Media Lab, Isobar has developed a technique for capturing and analyzing in-the-moment behavioral and biometric indicators of the emotional states created by virtual, augmented, and mixed reality experiences. Suddenly, brands have precise, sensitive methods that can be used to evaluate and optimize extended reality experiences. Companies working with us can know consumers better than consumers know themselves. Isobar married its knowledge of emotional analytics to research developed with the Media Lab’s Fluid Interfaces Group that captures users’ real-time behavior in room-scale VR. The result is a new platform that gives marketers deep insight into the effectiveness of VR experiences.