Machines Know How You Feel Through a Webcam

Here’s why that’s making our tech interactions more genuine and reciprocal. (Edited by Lulu Cao, Marketing Associate at Plug and Play)

Emotions play critical roles in our lives.

We experience a combination of emotions, from happy to sad, positive to negative, every second of the day. Emotions influence our thinking, behavior, work, and relationships, and they shape our creativity, health, and well-being.

Recognizing and using emotions productively relies on Emotional Intelligence (EQ). A high EQ helps us communicate more effectively, connect more deeply with others, work more productively, overcome our struggles, and, overall, live better lives.

Imagine being able to tell whether your presentation was engaging, whether the jokes you prepared landed, or whether your children liked the lunch you made for them.

Today, we are surrounded by hyper-connected smart devices and the internet all the time, and our interactions with digital devices have become more interactive and conversational. We can hail an Uber through a messaging app or buy groceries by talking to an Amazon Echo.

Connecting with users increasingly depends on how naturally these technologies converse with them. To adjust their operations accordingly, hyper-connected systems need a better understanding of human emotion. Our IoT (Internet of Things) team strives to advance our connected world through innovation, and Emotion Research Lab, a startup in our accelerator program, is bringing "Artificial Empathy" to the Human IoT. Together, our mission is to measure emotional experience in real environments.


How does Emotion Recognition Technology work?

Emotion Research Lab needs only a webcam, a PC, and an internet connection to capture a human face and its micro-expressions, which are based on muscle movements. A machine-learning system then categorizes what it sees along key metrics: activation, emotional pattern, valence, engagement, and enjoyment. The deep-learning approach lets the team swiftly revise the algorithms for high precision. We put the technology to the test with six participants, one chef, and six dishes in front of a webcam.
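To make the metrics concrete, here is a minimal sketch of how per-frame classifier outputs might be condensed into two of the summary metrics named above, valence and engagement. The emotion label set, the valence weights, and the neutral-based engagement heuristic are all illustrative assumptions, not Emotion Research Lab's actual model.

```python
# Hypothetical mapping from one frame's emotion probabilities to summary
# metrics. The labels and weights below are assumptions for illustration.

# Illustrative valence weights: positive emotions near +1, negative near -1.
VALENCE_WEIGHTS = {
    "happiness": 1.0, "surprise": 0.3, "anger": -1.0,
    "disgust": -1.0, "fear": -0.8, "sadness": -0.8, "neutral": 0.0,
}

def frame_metrics(probs):
    """Compute (valence, engagement) for one frame.

    probs: dict mapping an emotion label to its probability (sums to ~1),
    as a hypothetical facial-expression classifier might output.
    """
    # Valence: probability-weighted average of the per-emotion weights.
    valence = sum(VALENCE_WEIGHTS[label] * p for label, p in probs.items())
    # Engagement heuristic: how much probability mass sits on any
    # non-neutral expression at all.
    engagement = 1.0 - probs.get("neutral", 0.0)
    return valence, engagement

# Example frame: a mostly happy, slightly surprised face.
frame = {"happiness": 0.6, "surprise": 0.2, "neutral": 0.2}
valence, engagement = frame_metrics(frame)
```

A real system would compute something like this per frame of video and then track the metrics over time to form the "emotional pattern" the article mentions.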


Unveiling dishes prepared by our chef, we saw Emotion Research Lab’s analysis in action. Catching emotions like surprise, happiness, and relief, we were able to see the participants’ true feelings.


After the tasting, the Emotion Research Lab team used facial-analysis algorithms to score the footage for negative and positive emotions (happiness, surprise, anger, disgust, fear, and a full range of secondary emotions). The sushi dish produced the highest overall satisfaction among participants.
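The ranking step described above amounts to aggregating per-participant scores for each dish. A minimal sketch, using invented sample scores rather than the actual footage analysis:

```python
# Hedged sketch of ranking dishes by average emotional response.
# The per-participant valence scores (-1 to +1) below are invented sample
# data; the real scores would come from the facial-analysis step.
from statistics import mean

scores = {
    "sushi":   [0.9, 0.7, 0.8, 0.6, 0.9, 0.7],  # six participants
    "soup":    [0.2, 0.4, 0.1, 0.3, 0.2, 0.5],
    "dessert": [0.6, 0.5, 0.7, 0.4, 0.6, 0.5],
}

# Sort dishes by mean valence, highest (most satisfying) first.
ranking = sorted(scores, key=lambda dish: mean(scores[dish]), reverse=True)
```

With these sample numbers, `ranking[0]` is the dish with the highest average satisfaction, mirroring the sushi result in the tasting.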


Emotion recognition technology can also decode moods, which are more complex combinations of basic emotions, and its artificial empathy lets it analyze dynamic interactions in real time.

For market research, emotion recognition technology can power online platforms that analyze audience reactions and predict how a new product will perform. Eye tracking through the same webcam can follow a customer's visual attention, offering a better understanding of how a marketing stimulus relates to buying decisions.

In shopping environments, emotion recognition technology can monitor a crowd with a simple camera, giving retailers and brands emotion and attention analytics (traffic, gender, age, active viewers) in a single cloud dashboard. For the market research and retail sectors, that means a dramatic decrease in the cost and time required to understand customers compared with traditional market research methodologies.

Emotion A.I. also enables business opportunities in many verticals. In education, it can measure students' attention levels in real time and help improve their performance.

In insurtech, ranking drivers' risk through their emotional patterns can open new ways to improve their benefits. The automotive industry, with its new self-driving vehicles, will find in emotion recognition the best tool for monitoring the driving experience. Emotion recognition technology is also becoming the perfect tool for improving professional communications.

Emotion recognition technology can be used across industries to understand the feelings and reactions of the people it interacts with, ultimately building stronger connections between the technology and its users. In the future, emotion recognition will be integrated into our hyper-connected digital systems, improving our tech interactions and giving technology the ability to empathize in a more connected world. And that future is now.


If you liked what you read, please share it with friends.