The 89% Problem: How LLMs Are Resurrecting the "Dormant Majority" of Open Source
Blog post from Snyk
AI coding assistants are reshaping the open source landscape by resurrecting abandoned packages, which introduces new security risks into the software supply chain. Traditionally, developers have relied on social signals such as package popularity and community support to judge trustworthiness, but generative AI tools select packages based on statistical patterns learned from vast datasets that include outdated and unmaintained projects. This shift exposes developers to risks such as unpatched vulnerabilities and hallucinations, where a model suggests packages that are non-existent or have been maliciously registered.

To address these challenges, tools like Snyk Advisor and the Snyk Security Database aim to give developers visibility into package health, offering insights into security, maintenance, and community engagement so they can make informed decisions when selecting dependencies. The approach emphasizes shifting focus from mere popularity to provenance, and integrating real-time package health checks into development workflows so that untrusted packages never make it into a project.
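The kind of health check described above can be sketched as a simple CI-style gate. This is a minimal illustration, not Snyk's actual tooling: the `PackageHealth` record, the two-year staleness threshold, and the package names are all assumptions for the example, and in practice the metadata would come from a registry or a service like the Snyk Security Database.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical record of package health metadata; in a real workflow this
# would be populated from a registry API or a security database.
@dataclass
class PackageHealth:
    name: str
    last_release: date          # date of the most recent published version
    known_vulnerabilities: int  # count of unpatched known vulnerabilities

# Illustrative threshold: treat two years without a release as a sign of abandonment.
STALENESS_THRESHOLD = timedelta(days=365 * 2)

def is_trustworthy(pkg: PackageHealth, today: date) -> bool:
    """Reject packages with known vulnerabilities or no recent releases."""
    if pkg.known_vulnerabilities > 0:
        return False
    return (today - pkg.last_release) <= STALENESS_THRESHOLD

# Example dependency list with made-up metadata.
deps = [
    PackageHealth("left-pad-ng", date(2019, 3, 1), 0),  # stale: no release in years
    PackageHealth("requests", date(2024, 5, 20), 0),    # actively maintained
]
today = date(2025, 1, 1)
flagged = [p.name for p in deps if not is_trustworthy(p, today)]
print(flagged)  # packages a CI gate would block before they enter the project
```

Running a check like this in CI before installation is one way to move the trust decision from popularity signals to verifiable package health.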