Businesses will prioritize building AI technologies that can interpret and respond to human emotions as they look to connect with consumers.
Over the last decade, artificial intelligence has gone from buzzword to a must-have business competence. From retail to healthcare to financial services, AI is penetrating nearly every industry, with advances in deep learning, computer vision, and more paving the way.
Download our full report to find out the top trends poised to reshape industries in 2022.
AI, though, has long struggled to recognize and react to human emotion. In fact, the AI Now Institute at New York University called for a ban on the use of emotion recognition tech “in important decisions that impact people’s lives and access to opportunities” in its 2019 report.
But the attempt to use AI to recognize and respond to emotion, or emotion AI, isn’t a new concept — and in 2021, as political and social pressures continue to push tech companies to account for a wider range of human experiences, emotion AI will become an increasing priority.
The idea is largely associated with American scholar and inventor Rosalind Picard and her early research on the topic — also known as affective computing, or “computing that relates to, arises from, or deliberately influences emotions.” Today, the $87B global market for affective computing has far-reaching potential, and interest in the space has been gradually building.
Machines employing emotional artificial intelligence attempt to interpret human emotion from text, voice patterns, facial expressions, and other non-verbal cues — and in many cases, simulate those emotions in response. In tapping into unspoken behaviors and reactions, businesses can leverage this “emotional data” to increase their gains and better cater to customers.
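To make the text side of this concrete, here is a deliberately minimal sketch of emotion scoring: a hand-built lexicon maps words to emotion categories, and a message is scored by counting matches. Production emotion AI relies on trained models over voice, video, and text, so the lexicon and categories below are purely illustrative assumptions.

```python
# Toy illustration of text-based emotion scoring.
# Real emotion AI uses trained models; this hand-built lexicon is hypothetical.
from collections import Counter

EMOTION_LEXICON = {
    "happy": "joy", "great": "joy", "love": "joy",
    "angry": "anger", "furious": "anger",
    "sad": "sadness", "disappointed": "sadness",
}

def score_emotions(text: str) -> Counter:
    """Count lexicon hits per emotion category in a message."""
    words = text.lower().split()
    return Counter(EMOTION_LEXICON[w] for w in words if w in EMOTION_LEXICON)

print(score_emotions("I love this, it works great").most_common(1))
# -> [('joy', 2)]
```

Even this crude version hints at why the data is valuable to advertisers: aggregated over many consumer responses, the per-category counts become an "emotional signal" that can be correlated with outcomes like sales.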
Emotion AI company Affectiva, for example, has found that advertisers are becoming increasingly effective at drawing out emotional responses from consumers:
“From our work with 70% of the world’s largest advertisers and 28% of the Fortune Global 500 companies, we’ve found that emotionally resonant ads improve sales results.” — Graham Page, Global Managing Director of Media Analytics at Affectiva
The tech could also change how companies interact with customers. Companies like Behavioral Signals and Cogito use emotion AI to analyze elements of speech, like tone and vocal emphasis, to best match service agents and customers across industries.
Widespread adoption of the tech could also feed into industries like medicine.
For example, researchers are using deep learning techniques to capture facial expressions of pain to help detect discomfort, an approach that is especially useful when patients cannot communicate verbally. Others are leveraging AI emotion detection software to measure levels of joy or negative emotion in facial palsy patients, pre- and post-surgery.
In another example, Amazon’s health and wellness tracker Halo integrates voice analysis and machine learning to analyze how positive or energetic users sound based on emotions like happiness, sadness, or excitement in their voice.
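Voice-based systems like this typically start from low-level acoustic features. As a rough sketch of that first step, the snippet below computes the root-mean-square energy of an audio frame and applies a crude "energetic vs. calm" threshold. The threshold, labels, and synthetic waveforms are invented for illustration; products like Halo use far more sophisticated, trained models.

```python
# Toy sketch of acoustic feature extraction for vocal-tone analysis.
# Real systems combine many features with trained models; this
# energy-based heuristic and its threshold are purely illustrative.
import math

def rms_energy(samples: list[float]) -> float:
    """Root-mean-square energy of an audio frame."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def label_tone(samples: list[float], threshold: float = 0.3) -> str:
    """Crude 'energetic' vs. 'calm' label based on frame energy."""
    return "energetic" if rms_energy(samples) > threshold else "calm"

# Synthetic frames: a loud 440 Hz tone and a quiet 220 Hz tone at 8 kHz.
loud_frame = [0.8 * math.sin(2 * math.pi * 440 * t / 8000) for t in range(800)]
quiet_frame = [0.05 * math.sin(2 * math.pi * 220 * t / 8000) for t in range(800)]
print(label_tone(loud_frame), label_tone(quiet_frame))  # energetic calm
```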
While still in the early stages of development, emotion AI tech for the automotive industry also has tremendous upside potential. Computer vision is already being leveraged for driver monitoring, where systems are being built to help identify driver fatigue, for example. Now, some automakers’ increased priority on assessing emotion, from stress to anger, could add another layer of insight to improve road safety and occupant comfort.
Hyundai, for example, is developing Emotion Adaptive Vehicle Control (EAVC) technology in partnership with MIT that can optimize the environment of a vehicle based on passengers’ emotional states. The automaker recently unveiled a concept car designed to transport children in hospitals that uses AI-based tech to monitor facial expressions, heart rate, and respiratory rate, along with other factors such as car acceleration, to adjust vehicle systems like lighting, climate, and music.
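At its simplest, a system like this can be framed as a mapping from an inferred emotional state to a set of cabin settings. The rule-based sketch below is purely hypothetical: Hyundai's actual EAVC logic is not public, and the states, settings, and values here are invented to show the shape of the idea.

```python
# Hypothetical sketch of emotion-adaptive cabin control.
# The actual EAVC logic is proprietary; these states and settings are invented.
from dataclasses import dataclass

@dataclass
class CabinSettings:
    lighting: str
    temperature_c: float
    music: str

ADAPTATION_RULES = {
    "stressed": CabinSettings(lighting="soft blue", temperature_c=21.0, music="calm"),
    "drowsy":   CabinSettings(lighting="bright white", temperature_c=19.0, music="upbeat"),
    "content":  CabinSettings(lighting="warm", temperature_c=22.0, music="ambient"),
}

def adapt_cabin(emotional_state: str) -> CabinSettings:
    """Fall back to neutral 'content' settings for unrecognized states."""
    return ADAPTATION_RULES.get(emotional_state, ADAPTATION_RULES["content"])

print(adapt_cabin("stressed").music)  # calm
```

The hard part in practice is not this mapping but the upstream inference, reliably estimating a passenger's state from noisy facial, heart-rate, and respiratory signals, and handling several passengers with conflicting states at once.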
Affectiva has also been working on an in-cabin sensing solution, called Automotive AI, since 2018. The company has partnered with car manufacturers like BMW and Porsche, and it has numerous patent grants related to the assessment of emotion. Its most recently granted patent, titled “Image analysis for emotional metric evaluation,” looks to analyze “emotional context” from facial images — and could see applications across the in-vehicle experience.
Emotion AI has the potential to change the way we operate across industries. As with any AI, though, privacy and transparency concerns, along with the risk of bias and other ethical considerations, play a major role in its development.
For example, Amazon’s AI-powered recruiting tool reportedly penalized resumes that included the word “women’s.” Google’s algorithm for detecting hate speech on social media was found to disproportionately flag Black users’ tweets before it was corrected. The room for error when it comes to something as subjective as one’s emotional state is large.
But emotion AI — if used with caution — could benefit both businesses and consumers alike. Recently, IBM (in partnership with Airbus and the German Aerospace Center) relaunched an “AI-powered astronaut assistant” named CIMON. Beyond scientific assistance, CIMON is expected to act as an “empathetic companion” while in space — which could lead to outsized mental health benefits for astronauts who are on already stressful, and potentially lonely, missions.
Moving forward, companies developing emotion AI tech will need to navigate the complexities of handling emotional data, which is far more sensitive and intangible than other forms of personal data, especially when accounting for multiple reactions at once (as in a car full of passengers, for example). They will also need to account for cultural differences in emotional expression and the potential for bias in their algorithms.

If you aren’t already a client, sign up for a free trial to learn more about our platform.