Brands latching onto consumer behaviour is nothing new, be it asking customers for input on a product launch, as McDonald’s did with its recent signature collection, or using AR/VR to pull customers into immersive ads, as Coca-Cola did with its latest spot turning a crowd into bubbles inside a Coca-Cola bottle. Now it is no longer just about the experience but about tapping into consumer emotion through emotional AI. From what is understood, this technology is designed to interpret human emotions by analysing indicators such as facial expressions, voice tone, and even physiological responses. By harnessing these insights, brands can tailor their marketing strategies to resonate more deeply with consumers. Sounds like a dream come true, right? All the information a brand needs will be at its fingertips. “Emotional AI has the potential to understand you on a profound level through various cues, including your facial expressions, tone of voice, and even the pressure of your touch on a device. It’s like talking to a machine that has learned from millions of human interactions and can generate solutions based on vast amounts of data, perhaps even knowing you better than you know yourself,” Harish Bijoor, business and brand-strategy expert and founder, Harish Bijoor Consults Inc, told BrandWagon Online.

As machines learn more about you, they grow ever better at understanding customers and individuals like you. “By analysing customer emotions, marketers can create personalised content that enhances conversion rates and engagement. In India, AI is already being used in customer service chatbots and personalised product recommendations. AI’s predictive capabilities enable more tailored B2B marketing strategies. When combined with emotional intelligence, Emotional AI can develop campaigns that emotionally resonate with customers, ultimately boosting revenue and loyalty,” Amit Sanyal, EVP and COO – MarTech Solutions, Comviva, said. But a machine that may come to know you better than anyone else, and that sits at the centre of your emotional life, also raises questions about consumer privacy. As the trend gains traction, a pressing question follows: could emotional AI breach the privacy wall?

Emotional AI operates on sophisticated algorithms that process data from video footage, audio recordings, and other sources. For example, by analysing facial expressions through computer vision, brands can gauge a customer’s emotional state in real time. “It helps marketers refine campaigns by tailoring messages that resonate on a deeper emotional level. For instance, emotional AI can detect frustration in a customer service interaction or excitement during a product demonstration, enabling brands to respond and adjust experiences accordingly,” Prateek N Kumar, founder and CEO, NeoNiche, said. According to a report by Capgemini, 67% of consumers are positively anticipating generative AI’s ability to offer customised fashion and home décor recommendations. The same report revealed that 73% of consumers globally say they trust content created by generative AI.
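
On the engineering side, such a pipeline usually amounts to face detection followed by emotion classification on the cropped face. The Python sketch below illustrates the shape of that loop; OpenCV’s bundled Haar cascade is real, but the `emotion_model` classifier and the 48×48 input size are assumptions standing in for a trained model, not any vendor’s actual system.

```python
# A minimal sketch of the pipeline described above: detect a face in a video
# frame with OpenCV, then pass the crop to an emotion classifier. The
# classifier ("emotion_model") is a hypothetical placeholder; real systems
# would load a model trained on labelled face images.
import cv2

EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

# OpenCV ships a pre-trained Haar cascade for frontal-face detection.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def dominant_emotion(frame, emotion_model):
    """Return the most likely emotion for the first face found in a frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None  # no face in the frame, nothing to score
    x, y, w, h = faces[0]
    face = cv2.resize(gray[y:y + h, x:x + w], (48, 48))  # assumed input size
    scores = emotion_model.predict(face)  # hypothetical: per-emotion probabilities
    return EMOTIONS[scores.argmax()]
```

In practice, a system like this would score every few frames and aggregate the labels over time rather than trusting any single frame.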

Brands like Coca-Cola and Unilever have started to integrate this technology into their marketing strategies, leveraging emotional insights to optimise their campaigns. For instance, Coca-Cola utilised emotional AI through its vending machines, which analysed consumer reactions. This approach allowed the brand to tailor its messaging, ensuring it resonated with target audiences. The technology’s ability to analyse emotions is positioned as a means to foster greater consumer engagement, but it also opens up a Pandora’s box of ethical dilemmas.

Are brands using it?

Numerous sectors are currently exploring the potential of emotional AI, particularly those focused on customer engagement, and the technology is increasingly transforming industries by enhancing personalised experiences. In retail and e-commerce, it detects customers’ moods to adjust product recommendations, while in entertainment, companies use it to gauge audience reactions to films and ads. “In healthcare, Emotional AI tracks patients’ emotional states for better mental health support. The automotive sector integrates AI systems to detect driver fatigue and promote safety, and in education, it analyses student engagement to refine learning experiences,” Delphin Varghese, co-founder and chief revenue officer, Adcounty Media, added. It also enhances client support and communication, improves hiring processes in HR, and identifies fraudulent behaviour in finance. Despite these benefits, concerns about privacy and ethics remain prevalent across all applications. Moreover, 73% of customers expect better personalisation as technology advances, according to a survey by Salesforce. As consumers yearn for more, is emotional AI the next big step?

However, the question of consumer awareness and comfort remains. 68% of customers around the world said they are either somewhat or very concerned about their online privacy, according to a report by the International Association of Privacy Professionals (IAPP). The statistic highlights a significant gap between brands’ eagerness to employ this technology and consumers’ concerns about privacy. As companies continue to implement emotional AI, understanding consumer sentiment regarding the technology is critical to maintaining trust. “From a privacy perspective, this operates in a very grey area, even more so than AI that analyses consumer profiles based on their digital behaviour. This is primarily because emotional AI requires access to webcams and (or) microphones to function. Now, this technology can be applied in various contexts. Some brands are using emotional AI to assist with customer service. They analyse the current emotion of the customer and map them with the right support agent, along with notes,” Sri Hari Cuddapah, chief business officer, GenY Medium, commented. One insurance company that has used emotional AI reports good improvement in its customer satisfaction numbers. Another example is the Indian tutoring startup Vedantu, which used facial analysis data to gauge student engagement during pre-recorded lessons and found a 92% correlation between the AI data and student feedback, helping it optimise its content. A few brands are using it to analyse responses to advertisements. There are thus multiple applications where the technology could greatly improve consumer engagement. That said, since it relies primarily on a webcam and microphone, a great deal of personal information could end up being stored, and this by itself is a strong threat to privacy, he added.
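
As a rough illustration of the routing Cuddapah describes, a contact-centre integration might map a detected emotion to an agent queue and attach a briefing note. Everything in this sketch, from the queue names to the `route_call` helper, is hypothetical, not a real platform’s API.

```python
# Hypothetical sketch of emotion-based call routing: map a detected emotion
# to an agent pool and attach a note for the agent who picks up.
ROUTING_TABLE = {
    "angry":   ("senior_agents", "Caller sounds frustrated; de-escalate first."),
    "sad":     ("empathy_team",  "Caller may be distressed; slow the pace."),
    "happy":   ("general_pool",  "Positive mood detected."),
    "neutral": ("general_pool",  "No strong emotional signal."),
}

def route_call(caller_id: str, detected_emotion: str) -> dict:
    """Pick an agent queue and a briefing note based on the detected emotion."""
    queue, note = ROUTING_TABLE.get(
        detected_emotion, ("general_pool", "No signal; route normally.")
    )
    return {"caller": caller_id, "queue": queue, "agent_note": note}

print(route_call("caller-42", "angry"))
# {'caller': 'caller-42', 'queue': 'senior_agents', 'agent_note': '...'}
```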

Do the people know?

Consumer awareness of emotional AI is unevenly distributed. Many people are unaware of its existence or functionality, while those who are informed often feel uncomfortable with emotional tracking. “Younger generations tend to be more accepting of this technology, though acceptance varies significantly by region and culture. Many consumers express concerns about privacy invasion, particularly regarding how their emotions are recorded and monitored. While some are optimistic about personalised services, others prioritise privacy,” Varghese added. Trust in brands regarding emotional data is generally high, yet individuals often struggle with how to consent to, or revoke consent for, emotional tracking. Increased media coverage has sparked public discourse about the ethics of emotional AI and the applicability of legal frameworks such as GDPR to emotional data. Overall, public education on the topic remains fragmented, with insufficient transparency about practices involving emotional AI. “Most consumers aren’t fully aware that this level of emotional tracking is happening, and when they do find out, there’s often discomfort. People are protective of their emotions, and rightly so. There’s a difference between data about buying habits and data about someone’s emotional state. The moment people feel like they’re being psychologically ‘read’ by machines, there’s an understandable backlash,” Ramya Ramachandran, founder and CEO, Whoppl, highlighted.

The potential exploitation of emotional data poses significant risks, as it can expose individuals’ inner feelings and negatively impact personal relationships and decisions. Businesses using AI to identify customer emotions may see a 25% increase in operational efficiency by 2025, a report by Gartner highlighted. “To mitigate concerns, brands must be transparent about their use of emotional AI, granting users control over their data. This will help alleviate fears of invasion of privacy while ensuring emotional tracking enhances experiences rather than manipulates consumer behaviour without consent,” Siva Balakrishnan, CEO and founder, Vserve, added.

Where do you draw the line?

As emotional AI becomes more prevalent, the ethical implications of its use are increasingly under scrutiny. What happens when brands can read and react to our emotions? Is this an overstepping of boundaries, or is it simply an evolution of consumer engagement? Privacy advocates warn that the ability to analyse emotions could lead to unintentional or intentional breaches of privacy. “Without informed consent, use of AI in such a manner would be a breach of privacy of the consumer and subject them to easy manipulation. Breach of privacy is a significant concern in the development and use of emotional AI. Users must be informed about the nature of emotional AI and how their data could be exploited to manipulate their thoughts and decisions. Current research on behavioural dynamics in eAI is inadequate, highlighting the need for deeper understanding,” Ashwini Kumar, founder, My Legal Expert – AI-powered Litigation Solutions, said. The potential for misuse is particularly concerning given the current gaps in regulations surrounding emotional data collection.

GDPR creates only a minor barrier for entities using any form of AI, Kumar added. The requirement for explicit informed consent to collect personal data under GDPR is no longer an effective safeguard: the collection of biometric data such as facial expressions, voice, and heart rate, even when consented to, does not mean the consumer is fully informed about the consequences of that collection. GDPR can, in its present form, protect against the breach of data collected by emotional AI, but consumers’ concerns cannot be limited to data protection alone; regulation also has to balance the use and exploitation of such data. Furthermore, from what is understood, the DPDP Act touches on aspects relevant to emotional AI, though not specifically. It mandates informed consent for data collection, requiring brands to get explicit permission before tracking emotions through facial expressions or voice tones. The law also emphasises data minimisation and purpose limitation, ensuring companies collect only necessary data and use it for the stated purpose. Consumers can withdraw consent and request deletion of personal data. While these provisions offer some protection, the law does not fully address the unique privacy concerns emotional AI presents. “The proposed draft for AI governance laws together with DPDP will regulate AI applications, usage, and its limits in India. We may see a provision regulating emotional AI in the proposed laws,” Kumar explained.
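
To make those consent provisions concrete, a data pipeline honouring them might gate every collection behind a recorded, purpose-bound grant and treat withdrawal as deletion. The sketch below is a minimal illustration of that pattern under assumed names, not a compliance implementation.

```python
# Hypothetical sketch of a consent gate in the spirit of the DPDP provisions
# described above: record explicit, purpose-bound consent before any emotion
# tracking, and honour withdrawal by deleting the associated data.
from datetime import datetime, timezone

class ConsentRegistry:
    def __init__(self):
        self._grants = {}  # user_id -> set of consented purposes
        self._data = {}    # user_id -> collected emotion records

    def grant(self, user_id: str, purpose: str) -> None:
        """Record explicit consent for one stated purpose (purpose limitation)."""
        self._grants.setdefault(user_id, set()).add(purpose)

    def may_collect(self, user_id: str, purpose: str) -> bool:
        return purpose in self._grants.get(user_id, set())

    def collect(self, user_id: str, purpose: str, emotion: str) -> None:
        """Store an emotion record only if consent exists for this purpose."""
        if not self.may_collect(user_id, purpose):
            raise PermissionError("No consent on record for this purpose.")
        self._data.setdefault(user_id, []).append(
            {"purpose": purpose, "emotion": emotion,
             "at": datetime.now(timezone.utc).isoformat()}
        )

    def withdraw(self, user_id: str) -> None:
        """Withdrawal deletes both the consent and the data already collected."""
        self._grants.pop(user_id, None)
        self._data.pop(user_id, None)
```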

Adding to the gravity of the situation, a report by the International Association of Privacy Professionals reveals that only 20% of privacy professionals say they are confident in their organisation’s compliance with privacy law. Moreover, as emotional AI evolves, so does the potential for its misuse. The technology could be weaponised to manipulate consumer behaviour, creating a cycle of emotional exploitation. “Imagine an AI recognising that a user is feeling anxious and using that as an opportunity to sell them a product marketed as a solution to that anxiety. This kind of emotional nudging, without full transparency or consent, crosses ethical lines and could degrade consumer trust,” Ramachandran added.
