Create better products and software that are more human, personalized and empathetic
Embed real-time, in-context sensing and analytics into devices, mobile apps, streaming and interactive experiences
To compare the experience of watching programmes on its new 2019 OLED TV with that of one of its older models, LG teamed up with Realeyes for a fun experiment involving twins and Game of Thrones at the tech giant’s testing facility in Weybridge.
Using webcams mounted on top of the TVs, our AI platform analysed their facial expressions, body language and head movements in real time to see which twin was more entertained.
The result? The twin watching on the 2019 model was more attentive and his emotions were more intense.
Mirror, mirror on the wall, who is the happiest of them all? That’s a question answered by Realeyes’ #EmotionMirror, which we first exhibited at Cannes Lions in 2014.
The ‘mirror’, created using large screens and front-facing cameras, provided festival-goers with a perfect demonstration of how facial coding works, matching facial landmarks to people’s expressions.
The futuristic ‘Destination U’ prototype is a first-of-its-kind, innovative way for holidaymakers to choose a trip that matches their emotional needs.
As video plays to trigger imaginations, facial coding and emotion measurement tap into the customer’s subconscious.
An algorithm computes every subtle facial response to a rapid series of evocative moving images of destinations and experiences, and uses that data to calculate a ‘perfect holiday’ prescription.
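The recommendation step described above can be sketched in a few lines. This is an illustrative toy only: the destination names and scores are invented, and the real algorithm analyses every subtle facial response rather than a simple average of positive-emotion scores.

```python
# Toy sketch of the 'Destination U' recommendation step: aggregate
# per-frame emotion scores for each destination's clips, then pick
# the destination that evoked the strongest average response.
# All names and numbers here are invented for illustration.

def recommend_destination(responses):
    """responses: dict mapping destination name -> list of per-frame
    positive-emotion scores in [0, 1]. Returns the destination with
    the highest average score."""
    averages = {
        dest: sum(scores) / len(scores)
        for dest, scores in responses.items()
        if scores  # ignore destinations with no recorded frames
    }
    return max(averages, key=averages.get)

responses = {
    "beach": [0.2, 0.4, 0.3],
    "mountains": [0.7, 0.8, 0.9],
    "city": [0.5, 0.4, 0.6],
}
print(recommend_destination(responses))  # mountains
```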
Our technology works with any device camera or webcam, and the SDK has been designed to analyse spontaneous facial expressions in response to experiences.
Computer vision identifies key areas of the face, shown as landmarks on the eyes, eyebrows, nose, mouth etc. We’ve taught machines to map these landmark positions to different classifications of emotions and recognise high and low attention levels from the position of the face.
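The idea of mapping landmark positions to expression classes can be illustrated with a deliberately simple rule. This is not the trained model described above, which is learned from millions of annotated faces; the landmark names and coordinates below are invented for the sketch.

```python
# Illustrative sketch: classify a smile from mouth landmarks.
# Coordinates are (x, y) in image space, with y increasing downward,
# so lifted mouth corners have a SMALLER y than the mouth centre.
# A real facial-coding model learns this mapping from labelled data.

def classify_smile(landmarks):
    """landmarks: dict of landmark name -> (x, y) pixel position.
    Returns 'happy' if the mouth corners sit above the mouth centre,
    otherwise 'neutral'."""
    left = landmarks["mouth_left"]
    right = landmarks["mouth_right"]
    centre = landmarks["mouth_centre"]
    corner_y = (left[1] + right[1]) / 2  # average corner height
    return "happy" if corner_y < centre[1] else "neutral"

smiling = {
    "mouth_left": (30, 58),
    "mouth_right": (70, 58),
    "mouth_centre": (50, 62),
}
print(classify_smile(smiling))  # happy
```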
Our patented technology has been trained and tested using our emotion data repository of millions of faces from across the world.
Our SDK can make your technology more human by embedding real-time emotion sensing and attention measurement into your customer experience.
Our SDK supports major platforms including iOS, Android, Windows, Linux and macOS, and is designed for developer ease of use, taking just a few hours to set up and emotion-enable your application.
Key SDK features:
- Robust tracking that allows for head movement and facial occlusions (glasses, beards etc.)
- Seven emotion classifications: happy, surprise, confusion, contempt, disgust, empathy and fear
- Trained on annotations gathered worldwide
- Additional metrics: engagement, negative and valence
- Most advanced on the market
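The typical integration pattern is a loop that feeds camera frames to an analyser and reads back per-frame estimates. The class and method names below are invented stand-ins for illustration, not the actual Realeyes SDK API, and the stub returns fixed values instead of running a real model.

```python
# Hypothetical integration sketch (NOT the real Realeyes SDK API):
# feed frames to an analyser session, collect emotion/attention
# estimates per frame.

class EmotionAnalyser:
    """Stand-in for an emotion-sensing SDK session."""

    def process_frame(self, frame):
        # A real SDK would run face detection, landmark tracking and
        # emotion classification here; this stub returns fixed values.
        return {"attention": 0.9, "happy": 0.4}

def analyse_stream(frames):
    """Run every frame through one analyser session and return the
    per-frame results in order."""
    analyser = EmotionAnalyser()
    return [analyser.process_frame(frame) for frame in frames]

results = analyse_stream(["frame0", "frame1"])
print(results[0]["attention"])  # 0.9
```

In a real application the frames would come from a device camera or webcam capture loop rather than placeholder strings.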