
SEWA Project

Creating an ad recommendation engine

Identifying Subtle Behaviors

Funded by the EU’s Horizon 2020 programme, this project focuses on automatic sentiment analysis ‘in the wild’, quantifying the correlation between behaviours and emotions.

The project explores behavioural indicators that were previously too subtle for emotion measurement – hand gestures involving the face, such as holding the chin, twirling hair or biting nails, are all important behavioural cues.

This project has been covered by the Wall Street Journal.

Facial, Vocal and Verbal Analysis

By partnering with Imperial College and the University of Passau, the academic leaders in the fields of sentiment detection from computer vision and audio analysis, we aim to develop computers’ abilities to analyse and understand people’s emotions and behaviours using facial, vocal and verbal analysis.

We want to develop the ability to automatically detect more complex behavioural and affective states than the six basic emotions – whether a person likes or dislikes what they’re seeing, for instance, or whether they’re bored.

A Human Element to Programmatic

We’re leveraging the technologies developed within this project to create predictive models that will lead to adding a human element to programmatic decision-making – an ‘Ad Recommendation Engine’ that matches content to audiences and audiences to content using emotional profiles.
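The matching idea can be sketched with a simple similarity model: represent each audience and each piece of content as a vector of emotional scores, then rank content by how closely its profile matches the audience’s. This is purely an illustrative sketch – the function names, emotion dimensions and scores below are assumptions for demonstration, not the actual engine.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length score vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def recommend(audience_profile, content_profiles):
    """Rank content names by similarity of their emotional profile
    to the audience's profile (highest first)."""
    ranked = sorted(
        content_profiles.items(),
        key=lambda item: cosine_similarity(audience_profile, item[1]),
        reverse=True,
    )
    return [name for name, _ in ranked]

# Hypothetical profiles over (happiness, surprise, boredom).
audience = [0.8, 0.6, 0.1]
ads = {
    "upbeat_ad": [0.9, 0.5, 0.05],
    "dramatic_ad": [0.2, 0.9, 0.1],
    "plain_ad": [0.1, 0.1, 0.9],
}
print(recommend(audience, ads))  # → ['upbeat_ad', 'dramatic_ad', 'plain_ad']
```

A production engine would learn these profiles from measured emotional responses rather than hand-assigned scores, but the ranking step works the same way.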

Maja Talks to Charlie Rose

Realeyes Scientific Advisor Maja Pantic, Professor of Affective Computing at Imperial College, was interviewed by Charlie Rose for a CBS 60 Minutes report on Artificial Intelligence.

As Maja explains, enabling machines to better understand people adds a human element to data-driven decision-making, ultimately ensuring that computers are better able to make the right decisions.

Equally, emotion-enabled technology could change the way we interact with the technology around us – be it our computers, our phones, the music we listen to or the videos we’re watching. Eventually, computers will be better than we are at reading emotions – able to detect nuances that humans might miss.

Speech X-Ray Project

We were also awarded an additional grant for the ‘SpeechXRay project’, which focuses on building the next-generation audio-visual user recognition platform for authentication purposes, pushing the boundaries of what technology can do for security, privacy and usability.

We’ll be using our facial coding technology to detect when someone may be under duress, or when facial movements like chewing or drinking are getting in the way of authentication. The applications of automated facial coding are many and varied, extending well beyond advertising – from health and education to gaming and security. We’re proud to be part of a strong academic community that is constantly working to push the boundaries of what technology can do to improve people’s lives.

Discover More

Our pioneering emotionAI provides best-in-class reporting to power informed decisions. See how we can help companies like yours.