ChatGPT Used to Develop New Malicious Tools
Cyber-criminals have continued using OpenAI’s ChatGPT to develop new malicious tools, including infostealers, multi-layer encryption tools and dark web marketplace scripts.
The news comes from Check Point Research (CPR) experts, who published a new advisory about the findings last Friday.
“In underground hacking forums, threat actors are creating infostealers, encryption tools and facilitating fraud activity,” the company told Infosecurity via email.
In particular, CPR detailed three recent cases of ChatGPT being used for nefarious purposes.
The first case, spotted on a dark web forum on December 29, 2022, involved recreating malware strains and techniques described in research publications and write-ups about common malware.
“In actuality, whilst this individual could be a tech-oriented threat actor, these posts seemed to be demonstrating [to] less technically capable cyber-criminals how to utilize ChatGPT for malicious purposes, with real examples they can immediately use,” wrote CPR.
The second case, observed by the security researchers in December 2022, involved the creation of a multi-layer encryption tool written in Python.
“This could mean that potential cyber-criminals who have little to no development skills at all could leverage ChatGPT to develop malicious tools and become fully-fledged cyber-criminals with technical capabilities,” explained CPR.
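The actual tool CPR observed has not been published, but "multi-layer" encryption generally means wrapping data in several nested rounds of encryption with separate keys. A minimal, benign sketch of that idea in Python, assuming the widely used `cryptography` library's Fernet recipe and hypothetical helper names, might look like this:

```python
# Illustrative sketch only: nested ("multi-layer") symmetric encryption.
# This is an assumption about what such a tool's core routine could look
# like, not the code described in the CPR advisory.
from cryptography.fernet import Fernet


def encrypt_layers(data: bytes, layers: int = 3) -> tuple[bytes, list[bytes]]:
    """Encrypt `data` repeatedly, returning the final ciphertext and the
    keys in the order they were applied (innermost first)."""
    keys = []
    for _ in range(layers):
        key = Fernet.generate_key()
        data = Fernet(key).encrypt(data)
        keys.append(key)
    return data, keys


def decrypt_layers(token: bytes, keys: list[bytes]) -> bytes:
    """Peel the layers off in reverse order of encryption."""
    for key in reversed(keys):
        token = Fernet(key).decrypt(token)
    return token


if __name__ == "__main__":
    ciphertext, keys = encrypt_layers(b"example payload")
    assert decrypt_layers(ciphertext, keys) == b"example payload"
```

CPR's point is that routines like this, which once required some development skill, can now be generated on request by an AI chatbot.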
Finally, the team spotted a cyber-criminal writing a tutorial on how to create dark web marketplace scripts using ChatGPT.
“The marketplace’s main role in the underground illicit economy is to provide a platform for the automated trade of illegal or stolen goods like stolen accounts or payment cards, malware, or even drugs and ammunition, with all payments in cryptocurrencies,” reads the advisory.
According to Sergey Shykevich, threat intelligence group manager at CPR, ChatGPT can be used for good, helping developers write code, but it can also be turned to malicious purposes, as the cases above demonstrate.
“Although the tools that we analyze in this report are pretty basic, it’s only a matter of time until more sophisticated threat actors enhance the way they use AI-based tools,” Shykevich warned. “CPR will continue to investigate ChatGPT-related cybercrime in the weeks ahead.”
Additionally, Check Point data group manager Omer Dembinsky predicts AI tools like ChatGPT will continue to fuel cyber-attacks in 2023.
The advisory comes weeks after cybersecurity experts first warned that ChatGPT could democratize cybercrime.