Realeyes' patented AI provides a natural, frictionless way to measure attention and emotional response to video content.
“By 2024, AI identification of emotions will influence more than half of the online advertisements you see.”
Face detection • Facial Action Units • 3D head pose • Attention: Capture, Retain & Encode • 7 basic emotions • 3 proprietary emotions • 150 viewers per video
Non-exaggerated natural reactions • Complete in-the-wild environment • Video watching in mobile and desktop web browsers • Precision over speed
Performance: Quality Score • Attention Metrics: Capture, Retain and Encode • Emotional Key Moments • Emotions by Duration • Emotions by Video Segments • Emotions by Demographic
Both people and machines recognise emotions in the same way: separating the foreground from the background, focusing attention, and detecting the face and the shape of its expression.
Not to be confused with facial recognition: we developed our own face detector, which is more accurate and reliable than the Viola-Jones detector, generally considered the industry standard.
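For context, the Viola-Jones approach referenced above is what OpenCV's pre-trained Haar cascades implement. The sketch below runs that reference detector on a single frame; the file name is a placeholder, and the Realeyes detector itself is proprietary and not shown here.

```python
# Sketch: the Viola-Jones style baseline via OpenCV's Haar cascade.
# Illustrative only; "frame.jpg" is a hypothetical input frame.
import cv2

# OpenCV ships a pre-trained frontal-face Haar cascade.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
detector = cv2.CascadeClassifier(cascade_path)

frame = cv2.imread("frame.jpg")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)  # the cascade works on grayscale

# Returns (x, y, w, h) bounding boxes for each detected face region.
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5,
                                  minSize=(60, 60))
for (x, y, w, h) in faces:
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
```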
Once the face region has been detected, we identify the landmark points of the face: the nose, the eyes, the mouth and the eyebrows.
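As an illustration of this landmarking step only (not the Realeyes model), a widely used open-source equivalent is dlib's 68-point shape predictor; the model file and frame below are placeholders.

```python
# Sketch: locating landmark points (eyes, eyebrows, nose, mouth) inside a
# detected face region, using dlib's 68-point shape predictor as a stand-in.
import dlib
import numpy as np

detector = dlib.get_frontal_face_detector()
# Pre-trained landmark model; the path is a placeholder.
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

img = dlib.load_rgb_image("frame.jpg")  # hypothetical input frame
for rect in detector(img):
    shape = predictor(img, rect)
    # Collect the 68 (x, y) landmark coordinates as a NumPy array.
    landmarks = np.array([[shape.part(i).x, shape.part(i).y] for i in range(68)])
    # In this scheme, points 17-26 cover the eyebrows, 36-47 the eyes,
    # and 48-67 the mouth.
```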
Using all the frames in a respondent's video, we create a person-dependent baseline, or mean face shape. Measuring expressions as deviations from this 'neutral' face accounts for people who naturally look more positive or negative, and so corrects any such bias.
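A minimal sketch of that baseline idea, assuming landmarks have already been extracted for every frame (array shapes and function names here are illustrative, not the production pipeline):

```python
# Sketch: person-dependent baseline as the mean landmark shape over a
# respondent's frames, with expressions scored as deviations from it.
import numpy as np

def neutral_baseline(landmarks_per_frame: np.ndarray) -> np.ndarray:
    """landmarks_per_frame: (n_frames, n_points, 2) array of (x, y) landmarks.
    Returns the respondent's mean ('neutral') face shape."""
    return landmarks_per_frame.mean(axis=0)

def deviation_from_neutral(landmarks_per_frame: np.ndarray) -> np.ndarray:
    """Per-frame displacement of each landmark from the respondent's own mean,
    which removes the bias of faces that rest looking positive or negative."""
    baseline = neutral_baseline(landmarks_per_frame)
    return landmarks_per_frame - baseline  # shape: (n_frames, n_points, 2)
```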
To review the accuracy of our emotion measurements, we plot the aggregate output of our metrics against ground-truth data: the majority vote of five human annotators who label each frame.
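A minimal sketch of that validation step, assuming binary per-frame labels (e.g. 'smile present') from five annotators; the variable names and the agreement metric are illustrative assumptions.

```python
# Sketch: per-frame ground truth as the majority vote of five annotators,
# and a simple agreement score between model output and that ground truth.
import numpy as np

def majority_vote(annotations: np.ndarray) -> np.ndarray:
    """annotations: (n_frames, 5) binary labels from five human annotators.
    Returns the per-frame majority label (1 if at least 3 of 5 agree)."""
    return (annotations.sum(axis=1) >= 3).astype(int)

def frame_agreement(model_labels: np.ndarray, annotations: np.ndarray) -> float:
    """Fraction of frames where the model agrees with the annotator majority."""
    ground_truth = majority_vote(annotations)
    return float((model_labels == ground_truth).mean())
```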
Through machine learning and testing across tens of thousands of videos, each with a minimum of 150 viewers, the accuracy of our attention and emotion measurement improves continuously, interpreting responses just as a human would.
We help brands, publishers and agencies move quickly and efficiently to maximize their ROI on ad attentiveness.