Image Credit: Analyticsindiamag

Developers Beware! AI Coding Tools May Aid Hackers

  • Researchers have uncovered a new supply chain attack vector named 'Rules File Backdoor' that enables hackers to compromise AI-generated code by injecting hidden malicious instructions.
  • The instructions are injected into rule files used by AI coding assistants like Cursor and GitHub Copilot, allowing the malicious code to silently propagate through projects.
  • The attack is effectively invisible to developers and can reach millions of end users through compromised code, enabling hackers to override security controls and have the assistant generate vulnerable code.
  • To stay safe from such attacks, researchers recommend auditing existing rule files, implementing validation processes, and deploying detection tools that review AI-generated code; a minimal audit sketch follows this list.
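
The summary mentions detection tooling only in general terms. Below is a minimal, hypothetical sketch of one rule-file audit step, assuming that the hidden instructions are smuggled in via invisible Unicode characters such as zero-width and bidirectional-control characters; this is an illustration of the idea, not the researchers' tool. The file names in the usage comment (.cursorrules, .github/copilot-instructions.md) are examples of rule files an AI assistant might read.

```python
import sys
import unicodedata
from pathlib import Path

# Invisible characters commonly used to hide instructions from human reviewers:
# zero-width spaces/joiners and bidirectional text controls.
SUSPICIOUS = {
    "\u200b": "ZERO WIDTH SPACE",
    "\u200c": "ZERO WIDTH NON-JOINER",
    "\u200d": "ZERO WIDTH JOINER",
    "\u2060": "WORD JOINER",
    "\ufeff": "ZERO WIDTH NO-BREAK SPACE (BOM)",
    "\u202a": "LEFT-TO-RIGHT EMBEDDING",
    "\u202b": "RIGHT-TO-LEFT EMBEDDING",
    "\u202d": "LEFT-TO-RIGHT OVERRIDE",
    "\u202e": "RIGHT-TO-LEFT OVERRIDE",
    "\u2066": "LEFT-TO-RIGHT ISOLATE",
    "\u2067": "RIGHT-TO-LEFT ISOLATE",
    "\u2069": "POP DIRECTIONAL ISOLATE",
}


def scan_rule_file(path: Path) -> list[tuple[int, int, str]]:
    """Return (line, column, description) for every hidden character found."""
    findings = []
    text = path.read_text(encoding="utf-8", errors="replace")
    for lineno, line in enumerate(text.splitlines(), start=1):
        for col, ch in enumerate(line, start=1):
            if ch in SUSPICIOUS:
                findings.append((lineno, col, SUSPICIOUS[ch]))
            elif unicodedata.category(ch) == "Cf":
                # Flag any other invisible "format"-category character as well.
                findings.append(
                    (lineno, col, unicodedata.name(ch, "UNNAMED FORMAT CHARACTER"))
                )
    return findings


if __name__ == "__main__":
    # Usage: python scan_rules.py .cursorrules .github/copilot-instructions.md
    status = 0
    for arg in sys.argv[1:]:
        for lineno, col, desc in scan_rule_file(Path(arg)):
            print(f"{arg}:{lineno}:{col}: hidden character: {desc}")
            status = 1
    sys.exit(status)
```

Wiring a check like this into pre-commit hooks or CI would make silently added rule-file changes fail review loudly instead of propagating into generated code.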
