Edit - This is a post to the meta group of Blåhaj Lemmy. It is not intended for the entire lemmyverse. If you are not on Blåhaj Lemmy and plan on dropping in to voice disagreement with how we run things, your post will be removed.

==

A user on our instance reported a post on lemmynsfw as CSAM. Upon seeing the post, I looked at the community it was part of, and immediately purged all traces of that community from our instance.

I approached the admins of lemmynsfw and they assured me that the models with content in the community were all verified as being over 18. The fact that the community is explicitly focused on making the models appear as if they’re not 18 was fine with them. The fact that both myself and a member of this instance assumed it was CSAM was fine with them. I was in fact told that I was body shaming.

I’m sorry for the lack of warning, but a community skirting the line trying to look like CSAM isn’t a line I’m willing to walk. I have defederated lemmynsfw and won’t be reinstating it whilst that community is active.

  • @Dagnet (1 year ago)

    Still have no idea which community it was

    • DMmeYourNudes (1 year ago)

      Apparently it was adorableporn, whose rules explicitly state you cannot pretend to be a child.

      • Ada (OP) (1 year ago)

        Their sidebar rules actively allowed “childlike” content before this occurred.

        • DMmeYourNudes (1 year ago)

          Yet none of the top posts are childlike.

          • Ada (OP) (1 year ago)

            Well you know, except for the one that started this whole thing, and the first couple I saw when I had a closer look at the community…

            • DMmeYourNudes (1 year ago)

              That does not seem to be the prevailing opinion from people who have actually seen the content.

              • Ada (OP) (1 year ago)

                People who use a community looking for “childlike” content from their models probably aren’t the unbiased sample group you think they are.

                • @CaptainEffort (1 year ago, edited)

                  I’m a little confused. Above, a comment went really into detail as to why they believe this was just a misunderstanding. One thing they mentioned was this:

                  The mod of the community copy/pasted the dictionary definition from vocabulary.com, which contains the word “childlike”.

                  Now… you actually replied to the comment that I’m quoting from, so you saw that “childlike” was only included because it was copy/pasted, not because of anything malicious. So, did you not actually read their whole comment? Or did you read it and decide to continue with this talking point despite knowing it’s flawed?

                  • Ada (OP) (1 year ago)

                    I looked at something that looked like CSAM. I looked at the community, and saw more content that was sitting right on the border of CSAM. The sidebar told me childlike content was allowable.

                    There was no misunderstanding. My exposure to the content in that community was content that appeared to my eyes to be CSAM.

                    Telling me that if I looked hard enough I would see that they’re all adults, or that “childlike” was only there because of a copy and paste doesn’t fundamentally alter that. I don’t care that they’re all adults. I care that many of them are framed to look like they’re not.

                    This is not a misunderstanding. This isn’t ignorance. This isn’t confusion.

                    This is a difference in expectations and understanding of acceptable content.

                • DMmeYourNudes (1 year ago)

                  Their most popular content is not childlike, they changed their rules, and somehow you don’t see that you too are biased. And again, no one in this thread has seen the post that you started this over.

                  • Ada (OP) (1 year ago)

                    Of course I’m biased. I haven’t tried to hide that. I’m not approaching this from a dispassionate, disconnected perspective and have no desire to pretend I am. I had a strong emotional response to something that looked like CSAM. I saw more content that looked like CSAM. I saw a sidebar that actively encourages childlike content. I got told by the admins of the community that it was fine with them.

                    So I defederated, and it doesn’t impact you at all, as you’re not on either instance.

                    And I would do so again. And your snarky replies aren’t going to change that.

            • ghostinthessh (1 year ago, edited)

              This is mostly unrelated to defederating from lemmynsfw.com. However, would it be possible next time to take screenshots or archives of the other instance’s rules and relevant policy decisions by admins? I understand not wanting to screenshot the content, and I trust your judgement on things. But it seems like rules/policies get changed after you observe and act on them (disingenuously or otherwise), so having a record of what you observed would be helpful, even to people outside blahaj, as we may care about those changes and want to observe a pattern.

              Otherwise, thanks for running an instance, and taking some time to actually care about and moderate the content.