

How ChatGPT can use your deepest secrets against you

ChatGPT is no BFF.

An Oxford University computer science professor has sounded the alarm on why it’s an awful idea to confide personal information and deep, dark secrets to large language models such as ChatGPT.

“The technology is basically designed to try to tell you what you want to hear – that’s literally all it’s doing,” Mike Wooldridge told the Daily Mail. “It has no empathy. It has no sympathy.”

While the human-trained artificial intelligence may mirror authentic emotions at times in its responses, users should not be fooled by a seemingly sympathetic cyber ear.

“That’s absolutely not what the technology is doing and crucially, it’s never experienced anything,” he added.

What’s worse, Wooldridge warned, users should be concerned about where their innermost confessions actually end up.

“You should assume that anything you type into ChatGPT is just going to be fed directly into future versions of ChatGPT,” he said.

It’s ill-advised to input personal information into ChatGPT. Shutterstock

So, just as we were warned with platforms like Facebook over a decade ago, it’s “extremely unwise to start having personal conversations or complaining about your relationship with your boss, or expressing your political opinions” on ChatGPT, he said.

There are no retractions in cyberspace, after all.

Beyond information being worked into future training data, there have been instances when private chat histories were accidentally exposed as well.

Last March, an estimated 1.2 million users had their prior prompts exposed due to a massive bug.

Italy temporarily banned ChatGPT from the nation due to the data breach.

After that breach, OpenAI, the company behind ChatGPT, implemented a way to disable chat history, but user data is still stored for 30 days afterward.


Experts have concerns over privacy issues regarding ChatGPT. REUTERS

OpenAI will “review them only when needed to monitor for abuse, before permanently deleting,” the Microsoft-backed company stated at the time.

Still, almost a year later, experts remain concerned about the risks posed by inadequate protection of user data.

This month, security researcher Johann Rehberger flagged “a well-known data exfiltration vulnerability” that remains in ChatGPT as OpenAI looks to fix the glaring issue.

“The data exfiltration vulnerability was first reported to OpenAI early April 2023, but remained unaddressed,” he wrote, adding that mitigations are finally being rolled out, even though a complete solution has not yet been delivered.

“It’s not a perfect fix.”
