A sex offender convicted of making more than 1,000 indecent images of children has been banned from using any “AI creating tools” for the next five years in the first known case of its kind.

Anthony Dover, 48, was ordered by a UK court “not to use, visit or access” artificial intelligence generation tools without the prior permission of police as a condition of a sexual harm prevention order imposed in February.

The ban prohibits him from using tools such as text-to-image generators, which can make lifelike pictures based on a written command, and “nudifying” websites used to make explicit “deepfakes”.

Dover, who was given a community order and £200 fine, has also been explicitly ordered not to use Stable Diffusion software, which has reportedly been exploited by paedophiles to create hyper-realistic child sexual abuse material, according to records from a sentencing hearing at Poole magistrates court.

  • @shneancy
    -12 points, 2 months ago

    I don’t believe those two are comparable.

    Weed and meth are rather different in how they affect people.

    AI images are often used as a way to imitate reality.

    • @yokonzo
      6 points, 2 months ago

      It doesn’t matter if you believe it; for those who lived through D.A.R.E. and the war on drugs, that argument was common and on plenty of people’s lips. It’s a stupid argument, but I think that’s the point OP is trying to make.

      • @shneancy
        -5 points, 2 months ago

        Then why is that person repeating a stupid argument at me? Those aren’t comparable at all.

        A better comparison would be, idk, CBD weed with no THC being legal and that being the “gateway” to normal weed. Or buying a knock-off product and wanting to try the original. Or looking at AI-generated photos of people eating spaghetti and wanting to see what it actually looks like.

        • NιƙƙιDιɱҽʂ
          3 points, 2 months ago

          It’s a stupid argument being juxtaposed with your argument…you’re so close, you got this.