Meta is taking legal action against a company called Joy Timeline for advertising generative AI apps that enable users to 'nudify' people without consent on Meta's platforms.
The lawsuit follows a CBS News investigation that revealed numerous ads for these digital undressing apps on Meta's platforms.
The lawsuit seeks to stop Joy Timeline from advertising its CrushAI nudify apps across Meta's social media platforms; Meta says the company repeatedly tried to circumvent its ad review process.
Ads for AI deepfake nude tools were found on Instagram even after Meta removed ads flagged by the investigation.
The ads targeted men in the US, UK, and EU in violation of Meta's safety policies, fueled blackmail schemes, and were often shown to minors.
Such apps are predominantly advertised for use against women and female celebrities.
404 Media also reported on nonconsensual AI deepfake creation tools on Instagram in April 2024, leading Apple and Google to remove flagged apps from their marketplaces.
In August 2024, San Francisco filed a lawsuit against 16 frequently visited AI 'undressing' deepfake websites.