Criminal intelligence: WormGPT – ChatGPT’s evil little brother

AI systems like ChatGPT can generate usable text, but filters and other security measures are designed to prevent their use for criminal purposes. WormGPT is different.

Artificial intelligence (AI) has learned a great deal in recent years. Until recently, AI-generated texts sounded wooden at best and like incomprehensible gibberish at worst, but models such as ChatGPT can now produce texts that are hardly distinguishable from those of a human author. This also attracts criminals, who want to use AI to improve their phishing emails, have it generate malware, or have it search website code for flaws that can be exploited in hacking attacks.

But there is a catch: an AI is only as good as the training material it is fed, and the developers of ChatGPT and similar tools naturally do not want their technology used for hacking attacks. Hacking-specific data is therefore avoided when training these models, and security measures and filters are put in place to keep cybercriminals from benefiting from the technology.

However, it was only a matter of time before resourceful entrepreneurs began offering their own AI for cybercriminals on the darknet – and that is exactly what happened with WormGPT. WormGPT is an AI model based on GPT-J, a large language model developed by EleutherAI in 2021. It offers unlimited character support, chat memory and code formatting features. Because it is deployed without appropriate security filters, the result is an AI service that poses a significant security risk.

For example, WormGPT can be used for so-called business email compromise (BEC) attacks. In this phishing variant, company employees are lured under various pretexts into transferring large sums of money to the criminals or disclosing confidential company data – and this is exactly what WormGPT’s developers trained it to do. The AI can also personalize the content of phishing emails for a specified target using data scraping and social engineering techniques. Data scraping means extracting information from websites or databases; for example, details can be collected from a company’s website or from social media profiles.

Hackers promote WormGPT on cybercriminal forums and use it to launch sophisticated, personalized phishing attacks. One piece of advice circulating there is to compose a phishing email in one’s native language, machine-translate it into the target language, and then let the AI polish it to make it more believable. In this way, convincing emails for phishing campaigns can be produced without speaking the target’s language at all.

Unlike legitimate AI systems such as ChatGPT, WormGPT is not available on the regular internet. Anyone who wants to use the hacker AI has to register on the operators’ dark web site and pay a fee in cryptocurrency to gain access. This should be avoided at all costs: almost everywhere in the world, using AI for criminal purposes is illegal, and anyone caught doing so can expect a fine or even imprisonment.


(c) it-daily