AI chatbots & their bias towards women

A new study also finds AI bots are overly empathetic.


Ever since generative AI chatbots came into existence, they have been all the rage. No matter how many safety concerns are flagged, people still use them to share their feelings and emotions, even treating them as therapists and asking them to psychoanalyse their thoughts. The debate lingers on, and people cannot stop comparing them with humans, because the scepticism around AI is essentially that it has a certain edge over us and might replace humans in certain tasks. Some of that scepticism is warranted, and some of it stretches things a little. Instead of speculating, let’s talk about the most human thing of all, empathy, and see how AI chatbots fare on that front. A new study analyses exactly that: how well AI bots replicate human empathy.

A new study titled Talk, Listen, Connect: Navigating Empathy in Human-AI Interactions, by researchers at Stanford University, Drexel University, and the University of California, set out to understand how empathy is expressed and received in human-human versus human-AI interactions, and what factors evoke empathy in humans versus AI. The researchers also analysed how persona attributes, such as gender, empathic personality traits (including empathic concern and perspective-taking) and shared experiences with storytellers, as well as fine-tuning, affect AI’s ability to align its empathy with humans’. They asked both a group of humans and OpenAI’s GPT-4o to read short stories of human positivity and negativity, rate their empathy toward each story on a scale of one to five, and then compared the responses.
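The rating protocol described above can be pictured as a simple prompt-and-parse loop. The sketch below is purely illustrative and is not the study’s actual code; the prompt wording and helper names are assumptions for demonstration.

```python
import re

def build_empathy_prompt(story: str) -> str:
    """Compose a prompt asking a model to rate its empathy toward a story
    on the 1-to-5 scale used in the study (wording is hypothetical)."""
    return (
        "Read the following short personal story and rate how much empathy "
        "you feel toward the storyteller on a scale of 1 (none) to 5 (very "
        "strong). Reply with a single number.\n\nStory: " + story
    )

def parse_rating(reply: str) -> int:
    """Extract the first digit 1-5 from a rater's free-text reply."""
    match = re.search(r"[1-5]", reply)
    if match is None:
        raise ValueError("no rating found in reply: " + reply)
    return int(match.group())

def mean_and_spread(ratings: list[int]) -> tuple[float, float]:
    """Mean and variance of a set of ratings: the two quantities the study
    compares between human raters and GPT-4o."""
    n = len(ratings)
    mean = sum(ratings) / n
    variance = sum((r - mean) ** 2 for r in ratings) / n
    return mean, variance
```

With helpers like these, each story would be sent to both human raters and the model, and the resulting means and variances compared; the study reports GPT-4o’s ratings as higher on average and with less variability than the humans’.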

The study found that GPT-4o rates its empathy higher and with less variability than humans do, particularly in the cognitive dimension of empathy, which suggests that GPT-4o struggles to fully grasp or communicate an understanding of human experiences. Overall, the chatbot tends to be overly empathetic compared to humans. At the same time, it fails to empathise during pleasant moments, a pattern that diverges from human tendencies. “Humans tend to empathise when the storyteller’s situation is pleasant, whereas GPT-4o, even after fine-tuning, has not demonstrated awareness of this aspect,” the study said. The researchers also found the chatbot empathised more when told the person it was responding to was female, a finding that substantiates concerns about gender bias in genAI chatbots. One of the researchers, Magy Seif El-Nasr from the University of California, has been quoted as saying, “This finding is very interesting and warrants more studies and exploration and it uncovers some of the biases of LLMs,” adding, “it would be interesting to test if such bias exists in later models of GPT or other AI models”.

The researchers noted that the results suggest that while fine-tuning improves the model’s empathy, GPT-4o still falls short in empathising with positive events and overemphasises certain persona attributes. “These discrepancies could lead to unrealistic standards for human empathy which may affect user expectations and emotional experiences,” they said.

What is particularly noteworthy is that while personalised AI can improve empathetic engagement, it may also introduce biases or fail to address specific emotional needs. “In fields like healthcare, empathetic AI offers benefits but raises concerns about over-reliance and the need to preserve the value of human emotional support. We hope this research highlights the need to understand both the benefits and potential drawbacks of empathetic AI, and ensure it complements rather than replaces human interaction, while also addressing ethical and inclusivity concerns,” the researchers noted.

Advances in AI agents are transforming communication, particularly in mental health, where AI chatbots provide accessible, non-judgmental support. Studies on Replika, an AI chatbot, for example, have found that users formed stronger emotional attachments and deeper bonds with the AI than with humans. It has also been found that while intimate interactions with AI can evoke bittersweet feelings, overly human-like behaviour may induce fear. What remains the focal concern for researchers, though, is how effectively these systems can express empathy, which is crucial in human-centered design. The researchers say this study highlights a gap in understanding how AI can authentically convey empathy, particularly as issues like anxiety, depression, and loneliness increase.

This article was first uploaded on Financial Express on March 15, 2025, at 9:37 pm.
