New Cybersecurity Threat: 'Slopsquatting' by Generative Code AI

Cybersecurity experts have identified a worrying new threat to the software supply chain, dubbed "slopsquatting," stemming from the increasing use of artificial intelligence (AI) tools for code generation. The risk materializes when an AI "hallucinates," or invents, software components that don't actually exist.
The phenomenon occurs when a code-generating AI, queried by a developer, suggests software packages or libraries with plausible names that are not actually part of any legitimate repository.
Researchers in the United States found that nearly a fifth of the software packages recommended by certain AI tools were entirely fabricated.
This situation creates a window of opportunity for malicious actors. Attackers can spot these hallucinated package suggestions and register fake packages with those exact names, embedding malicious code in them and publishing them to software repositories.
An unsuspecting developer, relying on the AI's suggestion (especially if the name is similar to a legitimate one or appears repeatedly), could inadvertently download and integrate this malicious package into their own project, compromising the security of the final software and potentially exposing users to risks.
Although no active instances of these malicious packages have yet been found in repositories, the research underscores the critical importance of developers carefully verifying and scanning all AI-suggested software components before using them, rather than placing blind trust in automatically generated code. The risk adds to the broader debate about AI's role in diverse areas, from medical support tools to marketing and ad creation.
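By way of illustration only, a minimal Python sketch of that kind of check might query PyPI's public JSON API to confirm a suggested package actually exists before installing it; the package name and the cautionary heuristics in the comments are hypothetical examples, not details taken from the research described above.

```python
# Illustrative sketch (not from the article): before installing a package that an
# AI assistant suggested, confirm it actually exists on PyPI and inspect basic
# metadata. The package name used below is made up.
import json
import urllib.request
from urllib.error import HTTPError


def check_pypi_package(name: str) -> None:
    """Query PyPI's public JSON API and print basic facts about a package."""
    url = f"https://pypi.org/pypi/{name}/json"
    try:
        with urllib.request.urlopen(url) as resp:
            data = json.load(resp)
    except HTTPError as err:
        if err.code == 404:
            print(f"'{name}' does not exist on PyPI -- possibly an AI hallucination.")
            return
        raise
    info = data["info"]
    releases = data.get("releases", {})
    print(f"'{name}' exists: {len(releases)} release(s), "
          f"homepage: {info.get('home_page') or 'n/a'}")
    # A package that appeared only recently, with a single release and no project
    # page, deserves extra scrutiny before it goes into a build.


if __name__ == "__main__":
    check_pypi_package("some-ai-suggested-package")  # hypothetical name
```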