At a time when young people are hooked on online dating apps, artificial intelligence (AI) has, interestingly, taken the plunge to play cupid. So how does this work? Typically, a chatbot acting as the user’s avatar converses with the match. AI-based romance chatbots are believed to offer companionship and emotional support in a world where loneliness and social isolation are prevalent. People are drawn to AI romance for its convenience, its anonymity, and the illusion of understanding and empathy it provides. However, while these chatbots may initially seem like harmless fun, they can become a trap of emotional dependency and unrealistic expectations. “The impersonal nature of AI interactions can exacerbate feelings of loneliness and disconnect in users, ultimately affecting their mental health. Constant engagement with AI romance chatbots can also distort one’s perception of genuine human relationships, leading to difficulties in forming and maintaining meaningful connections offline,” Rajat Goel, co-founder, Emoneeds, an online psychiatric counselling and therapy platform, told FE-TransformX.
The era of AI romance
As of December 2023, ASAPP, an AI-based customer experience management platform, was the most funded chatbot and conversational AI company worldwide, with about $380 million raised, as per insights from Statista, a market research platform. Industry experts believe such funding momentum could also fuel a rise in AI-based romance scams. Romance scams have already seen an increase of about 80% year-on-year (YoY), as per insights from the US Federal Trade Commission (FTC), which reported that $1.3 billion was lost to romance scams in 2021, with a median loss of about $2,400. AI-powered chatbots offer companionship, or the allure of it, but the privacy risks attached to them are a matter of concern.
Case in point: an analysis of 11 romantic chatbot applications released on February 14, 2024, by the Mozilla Foundation, a non-profit organisation, found that almost every application either sells user data for targeted advertising or fails to provide adequate information about it in its privacy policy. “With AI learning from user data, there’s uncertainty about data security and ownership. Also, chatbots programmed to send links or messages that appear to be from legitimate sources can lead users to fake websites designed to steal personal information, login credentials and financial details, among others,” Ravi Mittal, founder and CEO, QuackQuack, a dating platform, explained.
Romancing AI: The love game
Industry experts believe that AI-based romance chatbots often fail to implement security standards such as strong password requirements and other measures to mitigate vulnerabilities in the product itself. This increases the potential for breaches that could expose data and invade user privacy, enabling fraudulent activity. Beyond the data-collection issues lies the unsettling capability of these chatbots to manipulate emotions, which can expose users to the darker aspects of online interactions, such as harassment, violence, and extremist ideas, among others. “In essence, the exploration of AI companionship invites us to tread carefully, creating a fun yet responsible relationship with technology. The key is believed to lie in maintaining awareness, prioritising privacy, and embracing technology with a discerning eye,” Hariom Seth, founder, Tagglabs, an AI-based marketing company, highlighted, adding that considering alternatives, such as human interaction or mental health resources, for practising social skills or overcoming loneliness can add a nuanced perspective to the equation.
Experts believe AI-based romance chatbots are black boxes even to their creators, because they are built on generative algorithms and models that someone else has developed and trained. In addition, technologies such as deepfakes and voice cloning can amplify the impact of AI-based romance chatbots, giving them a more realistic touch. However, “Psychological impacts from the continued usage of these chatbots are yet to be understood. Potential threats can exist for users if there is a pathological dependency on these bots, impacting real-life relationships. The negative consequences of these chatbots might tip the balance, making them more dangerous than useful,” RV Raghu, ISACA India ambassador and director, Versatilist Consulting India Pvt Ltd, an information technology (IT) governance firm, concluded.