Here’s how ChatGPT destroyed influencer’s vacation plans, crying video goes viral

Spanish influencer Mery Caldass’s dream vacation was spoiled after she claims she relied on advice from ChatGPT, leading to a viral crying video from an airport.

‘ChatGPT has destroyed my holiday…' Influencer's viral crying video blames Sam Altman's OpenAI bot for vacation nightmare (Mery Caldass/TikTok)

What happens when you ask AI to plan your vacation? For Spanish influencer Mery Caldass, it turned out to be a nightmare. Caldass says she relied solely on ChatGPT for information about the travel documents needed to enter Puerto Rico, was misled by the chatbot, and ultimately had to abandon her vacation, breaking down in a now-viral video from the airport.

Caldass, who has nearly 100,000 followers, had planned to travel to Puerto Rico with her partner for a romantic vacation. However, she missed her flight after the AI chatbot allegedly gave her incorrect information about visa requirements.

“I asked ChatGPT and he said no,” she said in a video posted on TikTok. While European Union citizens do not need a visa for short stays, travel to the US territory requires an Electronic System for Travel Authorization (ESTA), which the OpenAI chatbot failed to mention to Caldass. Hence, despite having booked flights and accommodation, the influencer and her partner missed out on the trip.

‘AI took revenge’, says Spanish influencer

The incident highlights a growing concern about the dangers of relying on AI chatbots for critical, fact-based information. Caldass admitted her mistake, saying she should have sought more reliable information. She even joked that the AI was taking “revenge” on her because she “sometimes insults him,” adding, “I don’t trust that one anymore.”

This isn’t the first time ChatGPT has misled a user with serious consequences. A couple of weeks ago, another story emerged of a 60-year-old man who was hospitalised with bromism after following ChatGPT’s dietary advice. According to a report in the American College of Physicians Journals, the man had replaced sodium chloride (table salt) with sodium bromide after ChatGPT suggested that chloride can be swapped with bromide, a substitution that applies only in other contexts, such as cleaning. The resulting bromide poisoning caused severe health consequences.

Are AI chatbots unreliable?

AI chatbots are still in their early stages, regardless of the lofty claims their makers make, so it is always advisable to cross-check any information they provide. AI bots often hallucinate and mix up information, producing unreliable results. While they can make life easier, nothing beats a good old-fashioned Google search and other reliable sources for verifying crucial information.


This article was first uploaded on August 19, 2025, at 11:12 am.