
Emotion artificial intelligence becomes a reality

By Simnikiwe Mzekandaba, IT in government editor
Tunisia, 17 Nov 2017
May Amr, senior data engineer and director of Cairo operations at Affectiva.

Devices will in future be able to read and sense emotions and respond the same way an emotionally intelligent friend would.

This is according to May Amr, senior data engineer and director of Cairo operations at Affectiva, speaking at the International Telecommunication Union's 2017 World Telecommunication/ICT Indicators Symposium in Tunisia this week.

Amr participated in a panel discussion on measuring emerging ICT trends and the intended vision for these new technologies, and said her organisation imagines that in the next five years devices will be emotionally aware.

"We are living in a world surrounded by hyper-connected devices that have massive cognitive abilities. Most of these devices are very smart at making decisions but these decisions are transactional and linear. The devices have high IQ but no EQ."

According to Amr, her company's focus on emotion artificial intelligence (AI) stems from the fact that emotions influence every aspect of human life, from health and wellbeing, to communicating with others, to doing business, to every decision a person makes.

Imagine a world where doctors can objectively measure a patient's emotional state the same way they do their medical state, she suggested.

"The merger of IQ and EQ in tech nowadays is more inevitable. Emotion AI has the potential to democratise access to services like education and healthcare, closing the gap of the socio-economic state.

"Emotion AI will not just transform how we connect with the devices, but how we connect and communicate with one another."

Amr said the Affectiva team is building technologies that can respond to emotions and is already analysing people's facial expressions. To date, the organisation has amassed 60 billion emotion data points. These were collected from six million faces analysed in 87 countries across the globe, she told the audience.

"The starting point was the face. The face happens to be one of the most powerful signals that we all use in our daily communication ? everything from frustration, confusion, curiosity and other different emotions. It contributes 55%, and 38% is how you say the words. Only 7% are the actual words."

Although teaching a machine all the different facial action units is not easy, it can be done. "A machine is like a student; it needs to be provided with examples so that it can learn about the subject it needs to take in information about."

If a machine is given enough examples, it will learn to detect whether a face is smiling, smirking or doing any different action, Amr noted.
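By way of illustration only, the kind of learning from examples Amr describes can be sketched as a simple supervised classifier. The facial action unit features, labels and classifier below are hypothetical and are not Affectiva's actual data or method.

    # Hypothetical sketch: a "student" model learns facial expressions from labelled examples.
    # Feature values and labels are invented for illustration.
    from sklearn.linear_model import LogisticRegression

    # Each example: made-up facial action unit intensities,
    # e.g. [lip corner pull, asymmetry of the pull, brow raise]
    examples = [
        [0.9, 0.1, 0.2],  # symmetric lip corner pull -> labelled "smile"
        [0.8, 0.0, 0.3],
        [0.5, 0.7, 0.1],  # one-sided lip corner pull -> labelled "smirk"
        [0.4, 0.8, 0.2],
    ]
    labels = ["smile", "smile", "smirk", "smirk"]

    # The model learns a mapping from facial measurements to expression labels
    model = LogisticRegression().fit(examples, labels)

    # Given a new face's measurements, it predicts which expression is being shown
    print(model.predict([[0.85, 0.05, 0.25]]))  # expected: ["smile"]

Given enough such labelled examples, the classifier generalises to faces it has not seen before, which is the principle behind the facial expression analysis Amr outlined.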
