Google removed more than 100 YouTube videos that promoted AI deepfake porn, including apps used for non-consensual image alteration. A Forbes investigation uncovered videos advertising AI apps that remove clothing from images, with victims of the resulting imagery facing harassment and bullying.
The platform also hosted ads for "deep nude" services. Google removed 27 ads, and YouTube took down 11 channels and more than 120 videos. The National Center on Sexual Exploitation (NCOSE) criticized Google for profiting from the cycle of promotion around nudify apps.
AI-generated deepfake porn, including imagery of children, is a rising concern. Apple removed similar apps promptly, in contrast with Google's slower response. Victims of AI-driven exploitation described ongoing trauma, and the misuse of AI remains a pressing issue for online safety.