The police investigation remains open. The photo of one of the minors included a fly: that is the logo of Clothoff, the application presumably being used to create the images, which promotes its services with the slogan: “Undress anybody with our free service!”

  • @them
    46 · 1 year ago

Yes, let’s name the tool in the article so everybody can participate in the abuse

      • DarkThoughts
        10 · 1 year ago

Considering that AI services typically cost money, especially those advertising adult themes, naming it kinda does support the hosters of such services.

        • RaivoKulli
          11 · 1 year ago

          Then again, naming and shaming puts pressure on them too. But in the end I doubt it matters. Those who want to use them will find them.

          • DarkThoughts
            4 · 1 year ago

Of course, and that by itself isn’t even the problem; the problem is people using the edited pictures for things like blackmail or whatever. From a technical standpoint it isn’t too dissimilar to the old photoshopping. Face swapping can probably even produce much higher quality results, especially if you have a lot of source material to pull from (you want matching angles for an accurate-looking result). Those AI-drawn bodies often have severe anatomical issues that make them very obvious and look VERY different from the advertisement materials.

    • @[email protected]
      6 · 1 year ago

You can literally Google ‘AI nude generation tool’ and get multiple results already. And I do sort of agree with you, as I’m not sure naming this specific tool was necessary or beneficial here. But I don’t think withholding the name is going to prevent anyone interested in such a tool from finding one. The software/tool itself is (currently) not illegal.