‘ChatGPT can land you in jail, as it may…’ says LinkedIn user: Here’s why

Jaiswal concludes with a warning, "Let me make this simple – if you wouldn’t say it in front of a judge, don’t type it into ChatGPT." (Image: Reuters)

After OpenAI CEO Sam Altman explained why it isn’t a good idea to share secrets with ChatGPT or any other AI chatbot, a LinkedIn member has built on that warning, suggesting that sharing secrets with an AI chatbot could even land you in jail. Shreya Jaiswal, Founder at Fawkes Solutions, CA, Marketer, Influencer, Podcaster, and Speaker, sheds light on the lack of privacy in interactions with AI tools like ChatGPT.

Jaiswal says, “ChatGPT can land you in jail. No, seriously. Not even joking.” According to her, the very words of OpenAI CEO Sam Altman confirm this unsettling reality. “Sam Altman – the CEO of OpenAI, literally said that anything you type into ChatGPT can be used as evidence in court. Not just now, even months or years later, if needed. There’s no privacy, no protection, nothing, unlike talking to a real lawyer or therapist who is sworn to client confidentiality,” she writes.

LinkedIn member explains Sam Altman’s warning

In her post, Jaiswal illustrates the gravity of this issue with several hypothetical, yet alarmingly possible, scenarios:

Imagine confiding in ChatGPT, “I cheated on my partner and I feel guilty, is it me or the stars that are misaligned?” Jaiswal warns, “Boom. You’re in court 2 years later fighting an alimony or custody battle. That chat shows up. And your ‘private guilt trip’ just became public proof.”

Innocently asking “How do I save taxes using all the loopholes in the Income Tax Act?” or “How can I use bank loans to become rich like Vijay Mallya?” could backfire spectacularly. “During a tax audit or loan default, this could easily be used as evidence of intent even if you never actually did anything wrong,” she states.

Even planning a career change and seeking advice from AI on it could be dangerous. Jaiswal posts, “I’m thinking of quitting and starting my own company. How can I use my current company to learn for my startup? Now imagine this chat coming up in court when your employer sues you for breach of contract or IP theft. You don’t even need to have done anything. The fact that you thought about it is enough.”

Jaiswal warns about complacency

Jaiswal warns about the widespread complacency surrounding AI tools. “We’ve all gotten way too comfortable with AI. People are treating ChatGPT like a diary. Like a best friend. Like a therapist. Like a co-founder,” she notes. “But here’s the thing, it’s none of those, it’s not on your side, it’s not protecting you. And legally, it doesn’t owe you anything,” she warns.

Jaiswal concludes with a warning, “Let me make this simple – if you wouldn’t say it in front of a judge, don’t type it into ChatGPT.” 

“I’m honestly scared. Not because I have used ChatGPT for something I shouldn’t have. But because we’ve moved too fast, and asked too few questions, and continue to do so in the world of AI,” she added.

While companies like OpenAI say they use chat data to improve their models and for safety monitoring, the OpenAI CEO’s acknowledgment that chat logs could be used as court evidence underscores the risks.

This article was first uploaded on August 2, 2025, at 12:24 am.