Searching Google for “undress apps,” “best deepfake nudes,” and similar terms turns up “promoted” results for apps that use AI to produce nonconsensual nude images, despite a recent update to Google’s ad policies banning exactly this type of content and a recent effort to derank such apps in search results.

The promoted results, discovered by Alexios Mantzarlis of the Faked Up newsletter, are yet another example of how the biggest internet platforms are struggling to contain the flood of AI-powered apps that create nonconsensual sexual images, mostly of young women and celebrities. In this case, Google Search wasn’t just leading users to these harmful apps; it was also profiting from them, since the apps pay to place links against specific search terms.

“We reviewed the ad in question and permanently suspended the advertiser for violating our policies. Services that offer to create synthetic sexual or nude content are prohibited from advertising through any of our platforms or generating revenue through Google Ads,” a Google spokesperson told me in an email.

      • @beebarfbadger
        11 months ago

        That is not true at all.

        Bad actors do everything in their power to get richer at the cost of everybody they can rip off, legally or otherwise.