The ability to feel emotions is considered one of the few things that separate man from machine, but a new strand of AI aims to teach robots how to sense whether we are happy or sad

Robot heart (Credit: Sean McMenemy/Flickr)

Emotional intelligence is a crucial aspect of interpersonal interaction, yet the ability to detect the many subtle variations of human emotion is something the vast majority of AI systems currently lack.

But this could change with the advent of emotion AI – a new development in artificial intelligence that companies hope will allow personal assistants and robots to have more human-like interactions.

Business insight firm Gartner predicts that by 2022, 10% of personal devices will have emotion AI capabilities, either on-device or via cloud services, up from less than 1% in 2018.

The company’s analyst Annette Zimmermann, speaking at the Gartner Data & Analytics Summit in London today (6 March), said: “Emotion AI will be an integral part of machine learning in the next few years – the reason being that we want to interact with machines that we like.

“In the future, we will be interacting with smart machines much more than we do today so we need to train these machines with emotions so they can understand humans better, and make interactions more comfortable and natural.”


What is emotion AI?

Emotion AI – or emotional artificial intelligence – is the process of giving machines the ability to recognise and react to human emotion.

The technology has the ability to tell whether a customer is angry or upset, whether a driver is feeling tired, or whether a patient is feeling lonely or depressed.

The primary methods of detection are computer vision and audio analysis.

But biometric sensors, such as heart rate monitors, can support these technologies too.
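As a loose illustration of how these modalities might be combined, the sketch below shows a simple late-fusion step that averages per-emotion scores from each source; the function name, inputs and weights are all invented for the example and do not reflect any vendor’s actual pipeline.

```python
# A minimal late-fusion sketch (all names and weights are assumptions):
# each modality supplies a score per emotion, and a weighted average
# produces the combined estimate.
def fuse_emotion_scores(vision: dict, voice: dict, biometric: dict) -> dict:
    """Weighted average of per-emotion scores from three modalities."""
    weights = {"vision": 0.5, "voice": 0.35, "biometric": 0.15}  # illustrative only
    return {
        emotion: weights["vision"] * vision[emotion]
                 + weights["voice"] * voice[emotion]
                 + weights["biometric"] * biometric.get(emotion, 0.0)
        for emotion in vision
    }

# Example: a driver flagged as tired mainly by voice and heart-rate data
print(fuse_emotion_scores(
    {"tired": 0.3, "neutral": 0.7},
    {"tired": 0.8, "neutral": 0.2},
    {"tired": 0.9, "neutral": 0.1},
))
```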

A convolutional neural network, a method of machine learning used to analyse visual data, is typically trained on millions of images of people from different countries – under different lighting and from different angles.

The program identifies “landmarks” in the face and uses characteristic expressions to classify the emotion being displayed.
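As a concrete, heavily simplified sketch, the Keras model below classifies 48×48 greyscale face crops into seven emotions, the layout of the public FER2013 dataset; it learns features end to end rather than detecting explicit landmarks, and vendor models are far larger and trained on millions of proprietary images.

```python
# A minimal convolutional emotion classifier, assuming FER2013-style
# inputs (48x48 greyscale faces, seven emotion labels).
from tensorflow.keras import layers, models

NUM_EMOTIONS = 7  # e.g. angry, disgust, fear, happy, sad, surprise, neutral

model = models.Sequential([
    layers.Input(shape=(48, 48, 1)),           # one greyscale face crop
    layers.Conv2D(32, 3, activation="relu"),   # low-level edges and corners
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),   # mid-level facial features
    layers.MaxPooling2D(),
    layers.Conv2D(128, 3, activation="relu"),  # expression-level patterns
    layers.GlobalAveragePooling2D(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(NUM_EMOTIONS, activation="softmax"),  # one probability per emotion
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# Training would then call model.fit(...) on labelled face crops.
```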

Affectiva is one of the companies pioneering emotion AI technology (Credit: Affectiva)

Companies currently creating emotion AI technology claim they can detect between six and ten human emotions.

Voice detection systems assess speech rate, pitch, intensity and voice quality, which are key emotion markers.
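As a rough sketch, the first three of those markers can be approximated from a recording with the open-source librosa library; voice-quality measures such as jitter and shimmer need more specialised tools, the file name below is a placeholder, and production systems feed far richer feature sets into trained classifiers.

```python
# Approximating speech rate, pitch and intensity with librosa;
# "call_recording.wav" is a placeholder file name.
import librosa
import numpy as np

y, sr = librosa.load("call_recording.wav", sr=16000)

# Pitch: fundamental frequency estimated with the pYIN algorithm
f0, voiced_flag, voiced_probs = librosa.pyin(y, fmin=65.0, fmax=400.0, sr=sr)
mean_pitch = np.nanmean(f0)  # Hz, ignoring unvoiced frames

# Intensity: root-mean-square energy averaged over frames
mean_intensity = float(librosa.feature.rms(y=y).mean())

# Speech rate proxy: onset (syllable-like event) count per second
onsets = librosa.onset.onset_detect(y=y, sr=sr)
speech_rate = len(onsets) / (len(y) / sr)

print(f"pitch={mean_pitch:.1f} Hz, intensity={mean_intensity:.4f}, "
      f"rate={speech_rate:.2f} events/s")
```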

Ms Zimmermann said: “What’s really fascinating is that most of the technology vendors I’ve spoken to are able to detect what a human is feeling, no matter which language they are speaking in.”


Uses of emotion AI

Several companies are already investigating the use cases for emotion AI.

One area that stands to benefit from the use of the technology is the car insurance sector, which paid out £6.4bn to UK claimants in the first half of 2018.

A study by insurance company Zurich found that 20% of UK adults admitted lying to their insurance company.

Intelligent Voice, a company whose audio recognition software performs emotion AI analysis, is working with insurers and banks in the UK and US to detect fraud and conduct credibility analyses from the sound of people’s voices.

From the tone of a person’s voice, the technology is able to tell whether they are making up a story or describing something that really happened, a capability that could represent a multi-billion pound opportunity for the insurance industry.

Emotion AI is capable of detecting whether someone is happy or angry by analysing the sound of their voice

German software company audEERING is developing similar audio intelligence software for call centres.

An intelligent routing system uses emotion AI to detect angry customers and directs them to call centre agents trained to deal with those sorts of calls.

It claims this helps with both customer and employee satisfaction by spreading the load of difficult calls between call centre staff.
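audEERING’s actual pipeline is not public, so the toy Python sketch below only conveys the routing idea: an assumed anger score gates which pool of agents a call can reach, and the least-busy agent is picked so difficult calls are spread around.

```python
# A toy emotion-based routing sketch; the threshold, Agent fields and
# anger score are all assumptions, not audEERING's real system.
from dataclasses import dataclass

ANGER_THRESHOLD = 0.7  # assumed cut-off, tuned in practice

@dataclass
class Agent:
    name: str
    trained_for_difficult_calls: bool
    queue_length: int = 0  # calls currently waiting for this agent

def route_call(anger_score: float, agents: list[Agent]) -> Agent:
    """Send likely-angry callers to specially trained agents,
    spreading difficult calls across the trained pool."""
    pool = agents
    if anger_score >= ANGER_THRESHOLD:
        trained = [a for a in agents if a.trained_for_difficult_calls]
        pool = trained or agents  # fall back if nobody trained is available
    agent = min(pool, key=lambda a: a.queue_length)  # least-busy agent
    agent.queue_length += 1
    return agent

agents = [Agent("Asha", True), Agent("Ben", False), Agent("Cara", True)]
print(route_call(0.85, agents).name)  # routed to a trained agent
```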

With more robots making their way into the healthcare industry, the ability for machines to react to social cues and human emotion becomes increasingly important.

Health-tech company Catalia Health developed a robot called Mabu for this exact reason.

Patients with long-term chronic conditions are often treated from home and it is not always possible to send a nurse out on a daily basis.

Mabu conducts daily conversations with the patient, checks how they are feeling and regularly asks whether they have completed their treatment programme.

By having a greater understanding of the patient’s situation, the robot is able to interact more naturally with people.

Other applications include New Zealand-based company Soul Machines’ virtual banking assistant, which helps customers with spending decisions, and, in recruitment, the use of emotion AI to screen video interviews and help HR teams make faster decisions about hiring new staff.


Emotion AI ethical considerations

People remain very sceptical about artificial intelligence, and the power of emotion AI in particular.

A poll of 4,000 people by Gartner found that 48% of respondents were uncomfortable with AI being used to detect emotion in people’s voices, while 52% did not like the idea of the technology being used to analyse facial expressions to understand how we feel.

Annette Zimmermann, Gartner analyst, speaking at the Gartner Data & Analytics Summit (Credit: Sam Forsdick/Compelo)

Ms Zimmermann said: “The perception of emotion AI is definitely an issue at the moment and I think this will require a lot of public education and transparency from companies to resolve.

“Emotional data is so personal that many people may not want to be part of that, so it is important for people to be told when this technology is being used and why.”