The rise of AI in business has produced an unexpected trend: companies are turning to AI to help their bots understand human emotion. This emerging field, known as “emotion AI,” is poised for growth, according to PitchBook’s Enterprise SaaS Emerging Tech Research report.
The reasoning behind the trend is simple: if businesses are deploying AI assistants and chatbots as frontline salespeople and customer service representatives, those systems need to grasp the nuances of human emotion. Emotion AI goes beyond sentiment analysis, which attempts to infer emotion from text alone. Instead, it combines visual, audio, and other sensor inputs with machine learning and psychology to detect human emotion during an interaction.
Major AI cloud providers already offer developers access to emotion AI capabilities, including the Emotion API in Microsoft Azure Cognitive Services and Amazon Web Services’ Rekognition service, which developers can use to build emotion detection into their own AI systems.
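To make this concrete: Rekognition’s `detect_faces` operation, when called with `Attributes=["ALL"]`, returns a list of emotion labels with confidence scores for each detected face. The sketch below shows the shape of that response and how an application might pick the dominant emotion. The sample response dict is hypothetical (hand-written in Rekognition’s documented format so the example runs offline); a real call would require AWS credentials and the `boto3` SDK.

```python
# Sketch: extracting the dominant emotion from an Amazon Rekognition
# detect_faces response. A real call would look roughly like:
#   boto3.client("rekognition").detect_faces(
#       Image={"Bytes": image_bytes}, Attributes=["ALL"])
# Here we use a hypothetical sample response instead of a live API call.

def dominant_emotion(face_details):
    """Return the (emotion, confidence) pair with the highest confidence
    across all detected faces, or None if no faces were found."""
    best = None
    for face in face_details:
        for emotion in face.get("Emotions", []):
            if best is None or emotion["Confidence"] > best["Confidence"]:
                best = emotion
    return (best["Type"], best["Confidence"]) if best else None

# Hypothetical response in Rekognition's documented shape.
sample_response = {
    "FaceDetails": [
        {
            "Emotions": [
                {"Type": "CALM", "Confidence": 62.1},
                {"Type": "HAPPY", "Confidence": 29.4},
                {"Type": "CONFUSED", "Confidence": 4.9},
            ]
        }
    ]
}

print(dominant_emotion(sample_response["FaceDetails"]))  # ('CALM', 62.1)
```

Note that the confidence scores describe how sure the model is of its label, not how strongly the person feels the emotion, which is part of why the research doubts discussed below matter.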
While emotion AI has been around for some time, the sudden increase in the use of bots in the workforce has given it more relevance in the business world than ever before. Emotion AI promises to enable more human-like interpretations and responses in AI systems, making them more effective in their roles.
The hardware side of emotion AI relies on cameras and microphones, which can be integrated into devices like laptops and phones or located in physical spaces. Wearable hardware is also expected to provide another avenue for employing emotion AI.
To meet the growing demand, a number of startups have launched in the space, including Uniphore, MorphCast, Voicesense, Superceed, Siena AI, audEERING, and Opsis, and have raised substantial funding to build out the technology.
However, emotion AI faces real challenges. Research published in 2019 cast doubt on whether facial movements reliably reveal a person’s inner emotional state. The assumption that AI can detect human feelings by mimicking how humans read faces, body language, and tone of voice may be misguided.
Regulation poses another hurdle. The European Union’s AI Act restricts the use of emotion detection systems in certain contexts, such as education, and state laws like Illinois’ Biometric Information Privacy Act (BIPA) prohibit collecting biometric readings without consent.
Overall, the rise of emotion AI offers a glimpse of the AI-filled workplace to come, with bots attempting to read and respond to human emotion in customer service, sales, and HR roles. Whether they will actually succeed at tasks that require emotional understanding remains to be seen; the future may instead be an office full of AI bots about as emotionally perceptive as Siri circa 2023. The question then becomes which is worse: an AI bot guessing at everyone’s feelings in real time during meetings, or one with no emotional understanding at all.