CyberSecurity news
@www.csoonline.com
//
A new cyber threat called "slopsquatting" is emerging that exploits AI-generated code and puts software supply chains at risk. Researchers have found that AI code generation tools, particularly large language models (LLMs), often "hallucinate" software packages or dependencies that do not exist. Attackers can capitalize on this by registering the hallucinated package names on public repositories such as PyPI or npm and uploading malicious code under them. When a developer accepts an AI assistant's suggestion to install one of these packages, the attacker's code is downloaded and executed instead, resulting in a supply chain compromise.
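One practical defense against this failure mode is to confirm that an AI-suggested dependency is actually registered before installing it. The following is a minimal sketch: the `package_exists` helper and the example package names are hypothetical, though the PyPI JSON endpoint (`https://pypi.org/pypi/<name>/json`) it queries is real.

```python
import urllib.error
import urllib.request

def package_exists(name: str, fetch=None) -> bool:
    """Return True if `name` resolves to a registered package.

    `fetch` takes a URL and returns an HTTP status code; it is
    injectable so the check can be stubbed offline or pointed at a
    private index mirror.
    """
    url = f"https://pypi.org/pypi/{name}/json"  # real PyPI JSON API endpoint
    if fetch is None:
        def fetch(u):
            try:
                with urllib.request.urlopen(u, timeout=10) as resp:
                    return resp.status
            except urllib.error.HTTPError as err:
                return err.code
    return fetch(url) == 200

# Offline demonstration with a stubbed index (drop `fetch` for a live check).
# "flask-jwt-helperz" is a hypothetical hallucinated name.
known = {"requests", "numpy"}
stub = lambda u: 200 if u.rsplit("/", 2)[-2] in known else 404
print(package_exists("requests", fetch=stub))           # True
print(package_exists("flask-jwt-helperz", fetch=stub))  # False
```

Because a squatter may have already registered the hallucinated name, a 200 response alone proves existence, not safety; it should be combined with the provenance checks discussed below.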
This vulnerability arises because popular programming languages rely heavily on centralized package repositories and open-source software, and the growing use of AI code-generation tools turns that reliance into a novel attack vector. A study analyzing 16 code-generation AI models found that nearly 20% of the recommended packages did not exist. When the same prompts were repeated, a significant share of the hallucinated names recurred, showing that the hallucinations are not merely random errors but a persistent phenomenon. That predictability is what makes the attack viable: a malicious actor can register a frequently hallucinated name and simply wait for developers to install it.
Security experts describe slopsquatting as a variant of typosquatting: rather than relying on users mistyping or misremembering a name, attackers register the plausible-sounding names that AI tools invent. To mitigate the threat, developers should treat AI-generated code with caution and verify the existence and integrity of every suggested package before installing it. Organizations should also implement security controls that detect and block the installation of unvetted packages from public repositories. As AI code generation tools become more prevalent, addressing this vulnerability is crucial to protecting the software supply chain.
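One simple organizational control of the kind described above is an allowlist gate: dependencies an AI assistant proposes are checked against a vetted list before they reach a requirements file. This is a minimal sketch; the function name, the allowlist contents, and the example package names are all hypothetical, and in practice the list would be maintained through a security review process.

```python
def audit_dependencies(requested, allowlist):
    """Partition requested package names into approved and flagged lists.

    Anything not on the vetted allowlist is flagged for manual review
    instead of being installed automatically.
    """
    approved = [p for p in requested if p in allowlist]
    flagged = [p for p in requested if p not in allowlist]
    return approved, flagged

vetted = {"requests", "numpy", "flask"}        # hypothetical vetted allowlist
ai_suggested = ["requests", "fastjson-utils"]  # second name is hypothetical
ok, suspicious = audit_dependencies(ai_suggested, vetted)
for pkg in suspicious:
    print(f"blocked: '{pkg}' is not on the approved list")
```

A gate like this can run in CI before dependency installation, so a hallucinated (or squatted) name never reaches a build machine without a human signing off first.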
References:
- The Register - Software: AI can't stop making up software dependencies and sabotaging everything
- CSO Online (www.csoonline.com): AI hallucinations lead to a new cyber threat: Slopsquatting
- Help Net Security (www.helpnetsecurity.com): Package hallucination: LLMs may deliver malicious code to careless devs
- TechRadar (www.techradar.com): "Slopsquatting" attacks are using AI-hallucinated names resembling popular libraries to spread malware