Researchers have uncovered a new supply chain attack vector named 'Rules File Backdoor' that enables hackers to compromise AI-generated code by injecting hidden malicious instructions.
The instructions are injected into rule files used by AI coding assistants like Cursor and GitHub Copilot, allowing the malicious code to silently propagate through projects.
The attack is unnoticeable to users and can affect millions of end users through compromised code, enabling hackers to override security controls and generate vulnerable code.
To stay safe from these attacks, researchers recommend auditing existing rules, implementing validation processes, and deploying detection tools for AI-generated code review.