Product Innovation

Create better products and software that is more human, personalized and empathetic

Unleash the power of Emotion AI

Embed real-time, in-context sensing and analytics into devices, mobile apps, streaming and interactive experiences

LG ‘OLED TV’
The twinfluencer experiment

To compare the experience of watching programmes on its new 2019 OLED TV with that of one of its older models, LG and Realeyes ran a fun experiment involving twins and Game of Thrones at the tech giant’s testing facility in Weybridge.

Using webcams mounted on top of the TVs, our AI platform analysed the twins’ facial expressions, body language and head movements in real time to see which twin was more entertained.

The result? The twin watching on the 2019 model was more attentive and his emotions were more intense. 

Read Blog
Emotion Mirror
Reflecting the emotion at Cannes Lions festival

Mirror, mirror on the wall, who is the happiest of them all? That’s a question answered by Realeyes’ #EmotionMirror, which we first exhibited at Cannes Lions in 2014.

The ‘mirror’, created using large screens and front-facing cameras, gave festival-goers a perfect demonstration of how facial coding works, matching facial landmarks to people’s expressions.

Read Blog
TUI
Choosing Vacation by Emotion

The futuristic ‘Destination U’ prototype is a first-of-its-kind way for holidaymakers to choose a trip that matches their emotional needs.

As video plays to spark the imagination, facial coding and emotion measurement tap into the customer’s subconscious.

An algorithm computes every subtle facial response to a rapid series of evocative moving images of destinations and experiences, and uses that data to calculate a ‘perfect holiday’ prescription.
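In outline, that scoring step amounts to aggregating a viewer’s per-clip emotional responses and ranking destinations by them. The sketch below is purely illustrative — the data structures, function names and intensity values are assumptions for demonstration, not Realeyes’ or TUI’s actual implementation.

```python
# Illustrative sketch: aggregate per-clip facial responses into a
# destination ranking. All names and values here are hypothetical.

from collections import defaultdict

def rank_destinations(responses):
    """responses: list of (destination, emotion_intensity) pairs,
    one per evocative clip shown to the viewer.
    Returns destinations sorted by mean response, strongest first."""
    per_destination = defaultdict(list)
    for destination, intensity in responses:
        per_destination[destination].append(intensity)
    scores = {d: sum(v) / len(v) for d, v in per_destination.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Two clips per destination; the viewer responded more strongly to Crete.
viewer_responses = [
    ("Crete", 0.82), ("Lapland", 0.35),
    ("Crete", 0.74), ("Lapland", 0.41),
]
print(rank_destinations(viewer_responses))  # ['Crete', 'Lapland']
```

The top-ranked destination would then seed the ‘perfect holiday’ prescription.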

Case Study
Previous
Next

Enabling technology to adapt to human behavior

Our technology works with any device camera or webcam, and the SDK is designed to analyse spontaneous facial expressions in response to experiences.

Computer vision identifies key areas of the face, shown as landmarks on the eyes, eyebrows, nose, mouth and so on. We’ve taught machines to map these landmark positions to different classifications of emotion, and to recognise high and low attention levels from the position of the face.

Our patented technology has been trained and tested using our emotion data repository of millions of faces from across the world.
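To make the landmark-to-attention idea concrete, here is a toy sketch: a crude head-yaw proxy computed from three 2D landmarks, thresholded into a high/low attention label. The heuristic and threshold are invented for illustration and bear no relation to Realeyes’ trained models.

```python
# Toy illustration of mapping landmark positions to an attention label.
# The yaw proxy and threshold are hypothetical, not the real model.

import math

def head_yaw(nose, left_eye, right_eye):
    """Crude yaw proxy from 2D landmarks: how far the nose x-coordinate
    sits from the midpoint between the eyes, normalised by eye distance."""
    mid_x = (left_eye[0] + right_eye[0]) / 2
    eye_dist = math.dist(left_eye, right_eye)
    return (nose[0] - mid_x) / eye_dist

def classify_attention(nose, left_eye, right_eye, threshold=0.25):
    """High attention when the face is roughly frontal (small |yaw|)."""
    yaw = head_yaw(nose, left_eye, right_eye)
    return "high" if abs(yaw) < threshold else "low"

# Frontal face: nose centred between the eyes.
print(classify_attention(nose=(100, 120), left_eye=(80, 100), right_eye=(120, 100)))  # high
# Turned face: nose shifted well toward one eye.
print(classify_attention(nose=(88, 120), left_eye=(80, 100), right_eye=(120, 100)))   # low
```

In practice a trained model uses many more landmarks per face, but the principle — geometry in, attention label out — is the same.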

Software Development Kit

Our SDK enables you to make your technology more human by embedding real-time emotion sensing and attention measurement into your customer experience.

Our SDK supports major platforms including iOS, Android, Windows, Linux and macOS, and is designed for developer ease of use: emotion-enabling your application takes just a few hours to set up.
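An integration of this kind typically follows a push-frames-in, get-events-out pattern. The sketch below is hypothetical — `EmotionSensor` and its methods are invented stand-ins to show the shape of the integration, not the real SDK API; consult the SDK documentation for actual class and method names.

```python
# Hypothetical sketch of embedding per-frame emotion sensing via a
# callback. EmotionSensor is an invented stand-in, not the real SDK.

class EmotionSensor:
    """Stand-in for an SDK analyser: push frames in, get events out."""
    def __init__(self, on_result):
        self.on_result = on_result

    def process_frame(self, frame):
        # A real SDK would run face detection and emotion models here;
        # this stub just echoes precomputed scores attached to the frame.
        self.on_result({"emotions": frame["scores"],
                        "attention": frame["attention"]})

def handle_result(result):
    # App-side hook: adapt the experience when attention drops.
    if result["attention"] < 0.5:
        print("viewer distracted: pause or adapt content")
    else:
        top_emotion = max(result["emotions"], key=result["emotions"].get)
        print("viewer engaged:", top_emotion)

sensor = EmotionSensor(on_result=handle_result)
sensor.process_frame({"scores": {"happy": 0.7, "surprise": 0.2},
                      "attention": 0.9})   # prints: viewer engaged: happy
sensor.process_frame({"scores": {"happy": 0.1, "surprise": 0.1},
                      "attention": 0.3})   # prints: viewer distracted: ...
```

The callback keeps the app’s adaptation logic separate from the sensing pipeline, which is what makes a few-hours integration plausible.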

Leading Computer Vision and Emotion AI for Technology and Platform Developers

3D modelling

allows for head movement

Tackles ‘in-the-wild’ occlusions

(glasses, beards, etc.)

7 emotion classes

Happy, surprise, confusion, contempt, disgust, empathy, fear

Cultural display rules

Worldwide annotations

Attention tracking

Most advanced on the market

3 proprietary metrics

Engagement, Negative, Valence

World’s richest emotion database

0 m annotations
0 yrs of footage
0 m views
0 k+ videos tested
0 + countries
0 bn data points

Get in touch