If you have a habit of sharing everything with ChatGPT, hold on! Consider this statistic:
Between June 2022 and May 2023, over 100,000 OpenAI ChatGPT account credentials belonging to paid users were compromised and sold on dark web marketplaces. Now imagine your own ChatGPT account falling prey to such a cyberattack, with its chat history holding every credential and personal secret you have shared in previous conversations. What would you do?
That is exactly what cybersecurity experts are warning about. As a society, we have adopted AI chatbots as our digital best friends, to whom we confess all our secrets. Cybersecurity experts, however, strictly advise against it.
Experts warn that, despite being convenient and powerful, these platforms pose significant risks if sensitive data is disclosed, whether unknowingly or intentionally. After all, these chatbots feed on the data you provide for training purposes and, at times, end up keeping a record of it. On top of that, handing over your credentials and personal information makes these AI tools a genuine threat to your privacy.
It doesn’t matter which chatbot is your preferred choice. Whether it is OpenAI’s ChatGPT, Google’s Gemini, Perplexity or Anthropic’s Claude, you are at risk if you have shared your credentials and secrets with the bot.
Security tips shared by AgileBlue, a cybersecurity firm, highlight five critical categories of information that should always be kept confidential when interacting with ChatGPT or similar AI tools. The firm says that once information is entered into these systems, its privacy and security can no longer be guaranteed.
Here are the five types of information you should avoid sharing with ChatGPT or any other AI chatbot.
Personally Identifiable Information (PII)
This covers any data that can uniquely identify an individual: full names, dates of birth, social security numbers, addresses, phone numbers, and email addresses. Sharing such personal information can lead to severe consequences, including identity theft, financial fraud, and unauthorised access to personal accounts.
Financial and Banking Information
Details like credit card numbers, bank account credentials, and other payment information should never be shared with AI systems. Breaches of this data can result in fraudulent transactions, drained accounts, and financial instability. Experts stress using only encrypted, secure channels for financial transactions rather than relying on agentic AI to transact on your behalf.
Passwords and Login Credentials
Passwords, PINs, and security answers are the digital keys to your online identity and should never be shared with ChatGPT. Handing over such credentials creates a significant opening for unauthorised access by malicious actors. Security experts advise enabling two-factor authentication and using a dedicated password manager, especially if you have already shared credentials with your AI chatbot.
Private or Confidential Information
This category covers personal secrets, intimate details and sensitive work-related information. AI systems lack the human context and judgement needed to protect this kind of data, so sharing it carries a high risk of accidental disclosure. For professionals, disclosing confidential business information can lead to breaches of trust, legal issues and harm to an organisation’s competitive edge, and may even amount to a breach of contract that costs you your job.
Proprietary or Intellectual Property
Experts also warn against sharing intellectual property, including patented ideas, copyrighted material, trade secrets, and proprietary code. In short, you shouldn’t share anything that represents significant value. Feeding such material into ChatGPT risks theft, unauthorised use, and potential legal disputes. Protecting these innovations is crucial for individuals and organisations to maintain ownership rights and commercial value.
Top tip: Never share your secrets and credentials with any digital entity. Whether it is a friend or an AI chatbot, refrain from handing over sensitive information to anyone online.
