WormGPT: The dark side of AI

What is WormGPT?

As technology continues to evolve, there is growing concern about the potential for large language models (LLMs), like ChatGPT, to be used for criminal purposes. Researchers have warned that WormGPT is a tool which allows hackers to develop sophisticated attacks on a significantly larger scale.

The creators of WormGPT have marketed it as an alternative to OpenAI's popular AI chatbot, which generates human-like responses to questions. However, unlike ChatGPT, WormGPT lacks built-in safeguards to prevent misuse of the technology.

Cyber security company SlashNext and reformed hacker Daniel Kelley discovered the chatbot through advertisements on cyber crime forums. “This tool presents itself as a blackhat alternative to GPT models, designed specifically for malicious activities,” Kelley wrote on SlashNext's blog. “WormGPT was allegedly trained on a diverse array of data sources, particularly concentrating on malware-related data.”

While AI advancements have had significant positive impacts in fields like healthcare and science, the ability of large AI models to quickly process vast amounts of data can also help hackers develop more sophisticated attacks. Within two months of launching in November 2022, ChatGPT gained 100 million users, inspiring other major tech companies like Google and Meta to create their own large language models, such as Bard and LLaMA 2.

How does WormGPT work?

Hackers access WormGPT by subscribing to it through the dark web. Once they gain access to the webpage, they can enter prompts and receive human-like replies. The tool is primarily used for phishing emails and business email compromise (BEC) attacks. The latter involves hackers attempting to deceive employees into transferring money or divulging sensitive information.

Researchers conducted tests and found that the chatbot was capable of crafting convincing emails on behalf of a company's CEO, requesting an employee to pay a fraudulent invoice. Since WormGPT draws from a broad range of text generated by humans, the output it creates appears more credible and can be used to impersonate a trusted individual within a business email system.

Find out more

If criminals were to possess their own ChatGPT-like tool, the implications for cyber security, social engineering, and overall digital safety could be significant. For further guidance and support, speak to a member of our team today.
