
Five things you should never reveal to ChatGPT if you want to protect your privacy

People turn to ChatGPT for all kinds of things — couples therapy, help with writing a professional email, turning pictures of their dogs into humans — letting the artificial intelligence platform in on some personal information.

And apparently, there are a few specific things you should never share with the chatbot.

When you type something into a chatbot, “you lose possession of it,” Jennifer King, a fellow at the Stanford Institute for Human-Centered Artificial Intelligence, told the Wall Street Journal.


“Please don’t share any sensitive information in your conversations,” OpenAI writes on its website, while Google urges Gemini users not to “…enter confidential information or any data you wouldn’t want a reviewer to see.”

On that note, here are the five things no one should tell ChatGPT or an AI chatbot.

ChatGPT has a “Temporary Chat” mode, which works much like an internet browser’s incognito mode.

Identity information

Don’t reveal any identifying information to ChatGPT. Your Social Security number, driver’s license and passport numbers, date of birth, address, and phone number should never be shared.

Some chatbots work to redact them, but it’s safer to avoid sharing this information at all.

“We want our AI models to learn about the world, not private individuals, and we actively minimize the collection of personal information,” an OpenAI spokeswoman told WSJ.

Medical results

While the healthcare industry prizes patient confidentiality to protect people’s personal information and guard against discrimination, AI chatbots are not typically covered by those special protections.

If you feel the need to ask ChatGPT to interpret lab work or other medical results, King suggested cropping or editing the document before uploading it, keeping it “just to the test results.”


Financial accounts

Never reveal your bank or investment account numbers. That information could be exposed in a breach and used to monitor or access your funds.

Login information

With the rise of AI agents that can perform useful tasks on your behalf, it may seem as if there are good reasons to give a chatbot your account usernames and passwords. But these agents aren’t vaults and don’t keep credentials secure. Put that information in a password manager instead.

Proprietary corporate information

If you’re using ChatGPT or other chatbots for work — such as for drafting emails or editing documents — you could mistakenly expose client data or non-public trade secrets, WSJ said.

Some companies subscribe to enterprise versions of AI tools or run their own custom AI programs with safeguards against these issues.

If you still want to get personal with the AI chatbot, there are ways to protect your privacy. According to WSJ, your account should be protected with a strong password and multi-factor authentication.

Privacy-conscious users should delete every conversation after it’s over, Jason Clinton, Anthropic’s chief information security officer, told the outlet, adding that companies typically purge “deleted” data permanently after 30 days.
