Russian cybercriminals are trying to bypass restrictions on ChatGPT in order to use the capabilities of the advanced AI-based chatbot for their nefarious purposes.
Check Point Research, the research arm of Israeli IT security company Check Point Software Technologies, reported that its specialists have noticed numerous discussions on underground forums in which hackers share methods for registering on the ChatGPT service. These include using stolen payment cards to pay for upgraded OpenAI user accounts, bypassing geofencing restrictions, and using a “Russian semi-legal online SMS service”.
ChatGPT is a new artificial intelligence (AI) chatbot that has gained a lot of attention due to its versatility and ease of use. However, it is not available in all countries: users from Russia, Belarus, China, Iran, Venezuela, and several other countries cannot register for it.
Cybersecurity researchers have already seen hackers use the tool to create convincing phishing emails, as well as code for malicious macros in Office files. Recall the recent experiment conducted by Check Point Research specialists: in response to simple prompts, the chatbot generated code that could be used in malware.
However, abusing the tool is not so easy: OpenAI imposes a number of restrictions, and Russian hackers face additional obstacles because of Russia’s invasion of Ukraine. But according to experts from the threat intelligence group at Check Point Software Technologies, these obstacles are not enough.
“Bypassing OpenAI measures that restrict access to ChatGPT for certain countries is not that difficult. Right now, we see Russian hackers already discussing and testing how to bypass geofencing to use ChatGPT for their malicious purposes. We believe that these hackers are most likely trying to use ChatGPT in their day-to-day criminal operations. Cybercriminals are increasingly interested in ChatGPT because the AI technology behind it can make the hacker’s work more cost-effective,” the experts say.
But hackers don’t just want to use ChatGPT; they are also trying to capitalize on the tool’s growing popularity by spreading all sorts of malware and stealing money. For example, an application posing as the chatbot was spotted in the App Store, charging about $10 for a monthly subscription. Other apps, also found on Google Play, charged up to $15 per use.