I test AI for a living and these are 7 things I would never tell ChatGPT – here’s why
Chatbots, from customer support agents to AI assistants like ChatGPT and Gemini, have become part of our daily lives. While they all offer convenience, not all chatbots train the same way, which is why many users don't realize how much sensitive information they may accidentally reveal.
Some risks are obvious, like never sharing your credit card number or bank information, but others are more subtle.
Here's why you should think twice before sharing certain data
In a recent feature about Gemini Canvas, I talked about how the chatbot helped improve the first chapter of my book. Some readers expressed concern that sharing my writing with a chatbot meant it would be used to train the AI.
Fortunately, that's not the case, at least not with the major players. OpenAI and Google clearly state that they don't use user inputs to train their chatbots. However, there's a nuance worth noting: even if your data isn't being used for training, it can still be remembered within your account. That means anyone with access to your account, or in rare cases someone who hacks into it, could theoretically retrieve your input. Highly unlikely, but not impossible.
And then there are the bigger risks. Not all chatbots follow the same data practices. While ChatGPT and Gemini avoid training on user inputs, some human review can still occur to flag abuse or harmful content. Your unfinished novel probably won't raise eyebrows, but threats or dangerous language might.
Other bots, like DeepSeek, do train directly on user data. That means anything you type in could be used to improve future models, and that's a good reason to be cautious.
So, no matter which chatbot you're talking to, here are seven things you should never share, yes, even with the ones that don't train on your input.
1. Personally identifiable information (PII)
Never share your full name, address, Social Security number (or its equivalent), passport or driver's license details. If the chatbot's data is breached, hackers could steal your identity.
If you use a chatbot for something such as updating your resume or job searching, never include that information when prompting the chatbot. Add it at the end, when polishing your resume offline.
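If you want a head start on scrubbing a prompt before pasting it into a chatbot, a simple find-and-replace pass can catch the most common formats. The sketch below is a minimal, illustrative example: the regex patterns are simplified assumptions (US-style SSN and phone formats) and will miss plenty of real-world PII, including names, so treat it as a starting point rather than a reliable filter.

```python
import re

# Simplified, assumed patterns for common US-format PII.
# These will NOT catch every variant (or names at all).
PATTERNS = {
    "[SSN]": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "[PHONE]": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
    "[EMAIL]": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def redact(text: str) -> str:
    """Replace matched PII patterns with placeholder tokens."""
    for placeholder, pattern in PATTERNS.items():
        text = pattern.sub(placeholder, text)
    return text

prompt = ("Polish this resume summary: Jane Doe, "
          "jane.doe@example.com, 555-867-5309, SSN 123-45-6789.")
print(redact(prompt))
```

The placeholders keep the prompt readable for the chatbot ("rewrite this resume" still works fine), and you can paste the real details back in after you copy the result out of the chat window.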
2. Financial information
It seems obvious that sharing credit cards, bank account details, and cryptocurrency private keys is a bad idea. Yet users may unknowingly include this information when summarizing a document from their credit card company or bank.
Perhaps you want tax advice and lean on a chatbot for help, but it's essential to know that you can get your questions answered without sharing this information. Instead, describe a similar scenario without revealing personal data.
3. Passwords
When chatting with a bot on a company website, you should never share your email passwords or two-factor authentication (2FA) codes. Legitimate companies will never ask for this information via chat.
Because well-known bots like ChatGPT remember information to help you, that data could potentially be used by someone who gains access to your account. For example, password-recovery answers like your mother's maiden name or a childhood pet should never enter the chat.
4. Highly sensitive or embarrassing secrets
If you're dying to talk about your participation in illegal activities or want to make a deeply personal confession, don't share it with your favorite chatbot.
Private health issues are also something you should never share, as some chatbots log conversations for training or remember them to better tailor future conversations to your needs.
5. Company information
Product prototypes, confidential meeting notes, and the CEO's travel plans are all types of information that some users might not think twice about, but they should never appear in chatbot conversations.
While it may be tempting to use a chatbot to brainstorm about a secret prototype, these kinds of inputs are never a good idea.
6. Explicit or harmful content
Graphic violence, threats, and hate speech shouldn't be part of your chatbot experience.
Even as a joke ("How do I hide a body?"), some AI systems flag and report such content to authorities.
7. Medical information
Although it's no substitute for a medical professional, some ChatGPT users find the chatbot helpful for identifying symptoms. If you are that type of user, you'll want to screen your prompts to strip out any personal information.
Don't enter any medication prescriptions or include medical charts. Instead, try a prompt such as "What types of exercises build muscle for an anemic woman aged 25-30?" Stay general about yourself within the prompt.
Final thoughts
AI chatbots are incredible tools, but they're not journals, therapists, or secure vaults. While companies like Google and OpenAI have guardrails, it's still wise to be selective about what you share.
Understanding how different bots collect and handle your inputs is the first step in protecting your privacy. When in doubt, keep sensitive, personal, or creative information offline, or at least out of the chat window.
Remember: if you wouldn't want it repeated, reviewed, or resurfaced later, it probably doesn't belong in a chatbot query.