Large Language Models (LLMs) have a serious "package hallucination" problem that could lead to a wave of maliciously coded packages in the software supply chain, researchers have discovered. AI-generated code is rife with references to non-existent third-party libraries, creating a golden opportunity for supply-chain attacks that poison legitimate programs with malicious packages.

A new class of supply chain attack, dubbed "slopsquatting," has emerged from the growing use of generative AI coding tools and the models' tendency to hallucinate package names that were never published. The technique is a cousin of typosquatting: instead of exploiting a developer's typo of a real package name, an attacker registers a non-existent name that LLMs repeatedly recommend on a public registry such as PyPI or npm, then waits for developers to install the phantom dependency their model suggested. Cybersecurity researchers warn that because the same hallucinated names tend to recur across prompts and models, these fake dependencies are predictable enough to be weaponized at scale.
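The most basic defense is to verify that any model-suggested dependency actually exists on the registry before installing it. Below is a minimal sketch in Python that checks candidate names against PyPI's public JSON metadata endpoint (https://pypi.org/pypi/<name>/json, which returns HTTP 404 for unregistered projects). The script and the helper name `package_exists` are illustrative only, not part of any cited research or existing tool.

```python
import sys
import urllib.error
import urllib.request

# PyPI's public metadata endpoint: 200 if the project exists, 404 if not.
PYPI_JSON_URL = "https://pypi.org/pypi/{name}/json"


def package_exists(name: str) -> bool:
    """Return True if `name` is a published project on PyPI.

    A 404 means no such project is registered -- exactly the gap a
    slopsquatter would try to claim.
    """
    try:
        with urllib.request.urlopen(PYPI_JSON_URL.format(name=name), timeout=10):
            return True
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return False
        raise  # other HTTP errors (rate limits, outages) prove nothing either way


if __name__ == "__main__":
    # Vet each dependency name passed on the command line, e.g. names
    # copied out of an LLM-suggested `pip install` command.
    for name in sys.argv[1:]:
        status = "exists on PyPI" if package_exists(name) else "NOT FOUND -- do not install blindly"
        print(f"{name}: {status}")
```

Note that a successful lookup is necessary but not sufficient: a slopsquatter may already have claimed the hallucinated name, so an unfamiliar package that does exist should still be vetted (maintainer history, source repository, download record) before it lands in a requirements file.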