10 things you should 'never' ask or tell AI chatbots

From passwords to medical records, 10 things you should never tell AI chatbots


Passwords or login credentials

A major privacy mistake. If someone gets access, they can take over your accounts in seconds.

Your name, address, or phone number

Chatbots aren’t designed to handle personally identifiable info. Once shared, you can’t control where it ends up or who sees it.

Sensitive financial information

Never include bank account numbers, credit card details, or other money matters in docs or text you upload. AI tools aren’t secure vaults ‒ treat them like a crowded room.

Medical or health data

AI chatbots aren’t compliant with the Health Insurance Portability and Accountability Act (HIPAA), so redact your name and other identifying details if you ask AI for health advice. Your privacy is worth more than quick answers.

Asking for illegal advice

That’s against every bot’s terms of service. You’ll probably get flagged. Plus, you might end up with more trouble than you bargained for.

Hate speech or harmful content

This, too, can get you banned. No chatbot is a free pass to spread negativity or harm others.

Confidential work or business info

Proprietary data, client details and trade secrets are all no-nos.

Security question answers

Sharing them is like opening the front door to all your accounts at once.

Explicit content

Keep it PG. Most chatbots filter explicit material, and submitting it anyway could get you banned.

Other people’s personal info

Uploading someone else’s personal information isn’t just a breach of trust; it can also violate data protection laws. Sharing private details without permission could land you in legal hot water.

The best way to protect yourself is to be careful about what info you offer up.