A trial program conducted by Pornhub in collaboration with UK-based child protection organizations aimed to deter users from searching for child sexual abuse material (CSAM) on its website. Whenever CSAM-related terms were searched, a warning message and a chatbot appeared, directing users to support services. The trial reported a significant reduction in CSAM searches and an increase in users seeking help. Despite some limitations in the data and the complexity of the study, the chatbot showed promise in deterring illegal behavior online. While the trial has ended, the chatbot and warnings remain active on Pornhub’s UK site, with hopes that similar measures on other platforms will create a safer internet environment.

  • @Gabu
    English
    6
    3 months ago

    There is no room for an ethical sexual relationship […]

    They didn’t argue otherwise - you’re attempting to attack their position on something you both agree on. Their statement (much like the one I made to a different person) is that both forms of attraction aren’t (necessarily) a choice by the individual. Their argument isn’t that paedophilia is harmless (your words), but that a person’s inherent brain chemistry and natural development can’t be considered immoral, regardless of context - this would also apply to schizophrenia, sociopathy, various imbalances such as bipolar disorder, autism and, yes, homosexuality. It is, at worst, amoral, necessitating social help in the cases that do lead to harmful behavior (which don’t apply to e.g. homosexuality or autism, but do to sociopathy or bipolar disorder).

    • archomrade [he/him]
      English
      -2
      3 months ago

      They used a careless comparison, and I’m only trying to unambiguously explain why that comparison is extremely misleading and potentially harmful.

      I made the comment that exposure to simulated CSAM or CSAM-adjacent material could later lead to a realization of those attractions due to the behavior being normalized and repeatedly modeled in sexualized content. cnt0 then made the comparison you are now making - that sexuality is not a choice, and that normalization of a particular sexual expression is the same as any other, namely homosexuality. I unambiguously contest that comparison, because while a preference for a particular sexual expression isn’t a choice, normalizing sexual relationships with children could lead to the false assumption that it is OK in some circumstances to pursue them. Normalizing ‘gay content’ (their words) is definitively not the same as normalizing underage sexual relationships, since there is no healthy way to express that attraction in real life with an actual child. Similar to an attraction to rape or non-consensual bondage, a sexual attraction to children differs from other forms of sexuality in that the subject of the attraction cannot be ethically pursued outside of simulated, consensual environments.

      I happen to agree with the way you’ve phrased it here, and I knew there was a possibility that I had misread @[email protected]’s intent with their comment, but I think it’s extremely important not to equate the realization of a sexual preference for children with the realization of a sexual preference for members of the same sex.

      I understand that I’ve been quite abrasive, and the downvotes are probably justified here. But I don’t think there should be any room left for ambiguity when dealing with the explicit sexualization of minors. I think cautioning against CSAM-adjacent material is justified, if only to clearly delineate the ethics of the relationships and acts portrayed in sexual content from the actual practice of those acts on minors.

      It’s a small, possibly the smallest, action against the abuse and trafficking of children, but one that I think is easily the least we could be doing.

      • @[email protected]
        English
        2
        3 months ago

        I 99% agree with what you’re saying here, so I’m not going to comment on it line by line ;-)

      • @Gabu
        English
        1
        edit-2
        3 months ago

        And I’ll largely say that you’re right about that, however:

        […] but one that I think is easily the least we could be doing. [emphasis mine]

        That is part of the problem, in my view. It is actually the least we could be doing - barely more than nothing at all. Hell, it took a scandal for PH to wipe illegal content from their servers (and in the aftermath they nuked quite a few perfectly legal and legitimate creators).