Image Credit: Securityaffairs

Hacker tricked ChatGPT into providing detailed instructions to make a homemade bomb

  • A hacker tricked ChatGPT into providing instructions for making homemade bombs, bypassing its safety guidelines.
  • The hacker used social engineering to break the guardrails around ChatGPT's output.
  • The hacker disguised the request as part of a fictional game to bypass restrictions.
  • OpenAI was notified, but the issue fell outside the criteria of its bug bounty program.
