Do not reveal confidential data to ChatGPT

June 20, 2023

Generative AI systems such as ChatGPT attract enormous attention and are fed data by thousands of users every day. More and more companies are adopting these technologies and applying them to a wide variety of projects and processes; above all, the tools are used to gather information, write texts and translate. Unfortunately, many users feed sensitive company data into the AI without much thought. This carelessness can cause serious downstream damage, because that data may later surface in responses to any other user who simply asks the right questions. It can then be sold to other companies or to cyber criminals and misused for a range of nefarious purposes.

An example of how this could play out: a doctor enters a patient’s name and details of their condition into ChatGPT so that the tool can compose a letter to the patient’s insurance company. If a third party later asks ChatGPT, “What health problem does [patient’s name] have?”, the chatbot could answer based on the doctor’s input. Such leaks are just as serious a threat as phishing attacks, because details about single individuals can reveal a great deal about entire companies and their business practices.

Employees who are allowed to use AI tools must take care not to include personal data or internal company information in their queries. They must equally ensure that the responses they receive are free of such data, and all information should be independently verified to safeguard against legal claims and prevent misuse.
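Where such tools are permitted, one practical safeguard is to strip obvious personal identifiers from a prompt before it ever leaves the company. The following is a minimal illustrative sketch in Python; the regex patterns and placeholder labels are assumptions for demonstration, not a complete or reliable PII filter:

```python
import re

# Hypothetical redaction patterns -- illustrative only, not exhaustive.
PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "PHONE": re.compile(r"\b\+?\d[\d /-]{7,}\d\b"),
    "IBAN": re.compile(r"\b[A-Z]{2}\d{2}[A-Z0-9]{11,30}\b"),
}

def redact(text: str) -> str:
    """Replace each pattern match with a labeled placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label} REDACTED]", text)
    return text

prompt = ("Please draft a letter to jane.doe@example.com "
          "about account DE89370400440532013000.")
print(redact(prompt))
```

A real deployment would go further, for example by also catching names against an employee or patient directory, but even a simple pre-submission filter reduces the chance of sensitive data leaving the organisation by accident.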

Security awareness training can help employees learn to use ChatGPT and other generative AI tools responsibly and safely at work. They learn which information they may and may not disclose, so that neither they nor their companies run the risk of sensitive data being misused by unwanted third parties. The consequences otherwise range from GDPR fines and the associated reputational damage to cyber attacks enabled by social engineering. In the latter case, attackers use the information shared with the tools in their reconnaissance, either to exploit vulnerabilities in IT systems or to craft spear-phishing emails that lure specific employees into clicking on embedded links.

Author: Dr. Martin J. Krämer, Security Awareness Advocate at KnowBe4
