A joint study by OpenAI and the MIT Media Lab has raised concerns about the psychological effects of prolonged ChatGPT use, suggesting that frequent interactions with the AI chatbot may contribute to increased loneliness and emotional dependence.
The study examined over 40 million ChatGPT interactions and surveyed 4,000 users to assess the emotional and behavioural impact of AI conversations. Additionally, researchers conducted a randomised controlled trial with 1,000 participants who used ChatGPT for at least five minutes a day over four weeks. The findings revealed that users who formed emotional connections with the chatbot or viewed it as a companion were more likely to report feelings of social isolation.
“Higher daily usage—regardless of conversation type—was strongly correlated with increased loneliness, dependence, and reduced socialisation,” the report stated. A deeper analysis of ChatGPT’s Advanced Voice Mode, which enables real-time spoken conversations, provided further insights. Users engaged in two distinct interaction styles: a neutral mode, where the AI maintained a formal, detached tone, and an engaging mode, where it expressed emotions. Initially, voice-based conversations appeared to reduce loneliness compared to text-based exchanges. However, at higher usage levels, these benefits diminished, particularly when users interacted with the neutral-mode chatbot.
The study’s findings come amid growing concerns about AI’s role in human relationships. While ChatGPT was not explicitly designed for emotional companionship, some users rely on it for support, mirroring trends seen in AI companion platforms like Replika and Character.ai. These companies have faced scrutiny over their potential psychological impact, particularly on vulnerable users.
Gender differences were also observed in the study. Women who engaged with the chatbot for extended periods were slightly less likely to socialise than men who did the same. Furthermore, participants who frequently communicated with the emotionally expressive voice mode reported feeling less isolated than those who interacted with the neutral, robotic-sounding chatbot.
The study underscores the complexity of AI-human interactions, with the researchers calling for further investigation into chatbot design choices and their long-term psychological effects. Experts suggest that future AI development should support users' emotional well-being without fostering dependence or displacing human relationships. With OpenAI's recent release of GPT-4.5, which the company says features enhanced emotional intelligence, the discussion around responsible AI use is more relevant than ever.