Hello everyone,

We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting the CSAM (Child Sexual Abuse Material) posts all day, but there is nothing we can do: since we changed our registration policy, they just post from another instance.

We keep working on a solution; we have a few things in the works, but they won’t help us right now.

Thank you for your understanding and apologies to our users, moderators and admins of other instances who had to deal with this.

Edit: @[email protected], the moderator of the affected community, made a post apologizing for what happened. But this could not have been stopped even with 10 moderators, and if it wasn’t his community it would have been another one. It is clear this could happen on any instance.

But we will not give up. We are lucky to have a very dedicated team and we can hopefully make an announcement about what’s next very soon.

Edit 2: removed that bit about the moderator tools. That came out a bit harsher than we meant it. It’s been a long day, and having to deal with this kind of stuff got some of us a bit salty, to say the least. Remember, we also had to deal with people posting scat not too long ago, so this isn’t the first time we’ve felt helpless. Anyway, I hope we can announce something more positive soon.

    • @DarthBueller · 9 points · 1 year ago

      People are downvoting you because you’re acting like a dick.

    • Cosmic Cleric · 2 points · 1 year ago

      If you’re a pedophile and disagree with me - instead of downvoting, why not explain yourself?

      People have been, but you’re not truly listening, Internet Warrior.

        • Cosmic Cleric · 1 point · 1 year ago

          Whatever you say, kid.

          I’m over 50, but you keep doing you, Internet Warrior, as it just proves my point.

            • Cosmic Cleric · 1 point · 1 year ago

              Your age does not come across in your writing.

              Well, let’s see …

              People have been, but you’re not truly listening

              That sounds to you like a sentence a young person would say, punctuation and all?

              • 𝕯𝖎𝖕𝖘𝖍𝖎𝖙 · 0 points · 1 year ago

                Yes.

                It doesn’t to you? How young do you think these people are? If you’re 50, there are 49 different ages which are younger than you. Not all of them know how to write like that, but I would say at least the ones with a high school education do.

                • Cosmic Cleric · 1 point · 1 year ago

                  Yes. It doesn’t to you?

                  These days especially, I see very few young people who bother with things like commas, multiple paragraphs, or using words like “truly”.

                  • 𝕯𝖎𝖕𝖘𝖍𝖎𝖙 · 1 point · 1 year ago

                    You may want to keep your day job, and not move into the online-age-detecting sector, truly.

                    Sorry for mis-generationing you.

    • @[email protected] · 2 points · 1 year ago

      I agree with a lot of what you said and upvoted you, but you really need to just stop calling people pedos for disagreeing with you.

      I’m a victim of CSAM myself, and you can take a look through my comment history where I talked about it in more depth. I hate pedos just as much as you do, but going around calling people pedos isn’t going to do anything but upset people.

      • 𝕯𝖎𝖕𝖘𝖍𝖎𝖙 · 0 points · 1 year ago

        I’m taking the radical stance that CSAM isn’t a good thing, that it should be reported to law enforcement, and that shutting down the site with CSAM is a viable option for handling it.

        I’m getting downvotes from people who disagree with me on this “radical” stance, people who disagree that CSAM is a problem, that CSAM is a concern. I don’t have a lot of sympathy for people who promote CSAM, like the people who downvoted my posts. I don’t care about the loss of internet points; I care that these worthless shits are still on Lemmy, so yes, I call them what they are.

        • @[email protected] · 2 points · 1 year ago

          I mean, I think people are downvoting you for other reasons.

          Obviously I agree with you that CSAM is bad. It happened to me and ruined my fucking life for like all of my teen years and then most of my early 20s.

          But calling people names is pointless. Especially when it comes off like a baseless accusation.

          • 𝕯𝖎𝖕𝖘𝖍𝖎𝖙 · 2 points · 1 year ago

            Noted. You’ll have to excuse the fact that I don’t really care about calling people names on the internet if the content of their message promotes abuse.

            • @[email protected] · 3 points · 1 year ago

              Yeah I get that for sure. I mean, if I knew someone was some kind of MAP idiot who was trying to fight for the rights of pedos, I’d call them names too. Idiot seems fitting for that lol

        • newIdentity · 1 point · 1 year ago

          You’re completely misinterpreting everything we said. If we shut down every site with CSAM, the internet wouldn’t exist. We don’t disagree that CSAM is a problem. We disagree with your solution.

          • 𝕯𝖎𝖕𝖘𝖍𝖎𝖙 · 1 point · 1 year ago

            You’re completely misinterpreting everything we said.

            Not at all. I am completely understanding you.

            If we shut down every site with CSAM, the internet wouldn’t exist.

            You are wrong. My site doesn’t have CSAM. Lots of other sites don’t have CSAM. The internet isn’t just for CSAM. You must be smarter than this.

            We don’t disagree that CSAM is a problem. We disagree with your solution.

            My solution, which is to remove CSAM? My solution to turn off communities while the CSAM issue is cleaned up? What is it about those solutions that you disagree with?

            Another question for you: if your house is flooding due to a burst pipe, what do you do first:

            a) get all the water out of the house, or
            b) turn off the water coming into the house?

            My solution would be to do step B followed by step A. Your solution appears to be to just do step A, which means you’ll constantly be flooded and never have enough manpower to dry your house.

            I’d bet money that the following will happen:

            1. community gets turned off
            2. csam gets deleted, posters are identified, information turned over to law enforcement
            3. community gets turned back on.

            In the meantime, folks missing the community are free to go elsewhere on the internet. Why? Because CSAM is a crime which depicts sexual assault, and the evidence is posted online. It’s not just a matter of deleting content; it’s also a matter of turning the people posting that content over to the police so they can be held accountable for their crimes.

            • newIdentity · 1 point · 1 year ago

              You are wrong. My site doesn’t have CSAM. Lots of other sites don’t have CSAM. The internet isn’t just for CSAM. You must be smarter than this.

              Sorry let me word this correctly: social media wouldn’t exist.

              My solution to turn off communities while the CSAM issue is cleaned up? What is it about those solutions that you disagree with?

              No, your solution is to permanently shut down Lemmy since there is the possibility of CSAM being on one instance. The community it’s posted in doesn’t matter. They can just keep spamming CSAM, and the mods can’t do anything about it except shutting down the instance/community, unless there are better tools to moderate. That’s basically what everyone wants: better tools and more automation so the job gets easier. It’s better to have a picture that was wrongly flagged as CSAM removed than to leave up one that actually is CSAM.

              The problem is that it won’t stop and that it will happen again.

              I’d bet money that the following will happen:

              1. community gets turned off
              2. csam gets deleted, posters are identified, information turned over to law enforcement
              3. community gets turned back on.

              You’re wrong at step 2. The posters might’ve used Tor, which basically makes it impossible to identify them. Also, in most cases LE doesn’t do shit, so the spamming won’t stop (unless someone other than LE does something about it). We can’t only rely on LE to do their job. We need better moderation tools.

              Also even if the community is turned back on, what’s stopping someone from doing it again? This time maybe a whole instance?

              It’s simply too easy to spam child porn everywhere. One instance of CP is much easier to moderate than thousands.

              • 𝕯𝖎𝖕𝖘𝖍𝖎𝖙 · 1 point · 1 year ago

                Sorry let me word this correctly: social media wouldn’t exist.

                And this is hardly the argument you think it is. Again, it’s not true of all social media sites, but let’s steelman your argument for a moment and say that you are referring only to the major social media sites.

                Well then, we have a problem, don’t we? What’s something the major social media sites have that Lemmy doesn’t? Ad revenue, to the tune of millions of dollars. What do they do with that revenue? Well, some of it goes to pay real humans whose entire job is simply seeking out and destroying CSAM content on the site.

                So then how does Lemmy, with only enough money to pay hosting costs, if that… deal with CSAM when a user wants to create a botnet that posts CSAM to Lemmy instances all day? My answer is: the admins do whatever they think is necessary, including turning off the community for a bit. They have my full support in this.

                No, your solution is to permanently shut down Lemmy since there is the possibility of CSAM being on one instance. The community it’s posted in doesn’t matter. They can just keep spamming CSAM, and the mods can’t do anything about it except shutting down the instance/community, unless there are better tools to moderate. That’s basically what everyone wants: better tools and more automation so the job gets easier. It’s better to have a picture that was wrongly flagged as CSAM removed than to leave up one that actually is CSAM.

                You’re strawmanning my argument. I’ve never said forever. I’ve said while the community gets cleaned up. I’ve even described a timeline below.

                The better tools you want for moderation are your own eyeballs. I’ve said this before, but there have been many attempts at making automated CSAM detection tools, and they just don’t work as well as needed, requiring humans to intervene. These humans are paid by major social media networks, but not by volunteer networks.

                The problem is that it won’t stop and will happen again.

                Yes, this is the internet! No one has a solution to stop CSAM from happening. We aren’t discussing that. We are discussing how to handle it WHEN it happens.

                You’re wrong at step 2. The posters might’ve used Tor, which basically makes it impossible to identify them. Also, in most cases LE doesn’t do shit, so the spamming won’t stop (unless someone other than LE does something about it). We can’t only rely on LE to do their job. We need better moderation tools.

                No, I’m correct about step 2, which I described as: “csam gets deleted, posters are identified, information turned over to law enforcement”

                I’ll break it down further:

                1. CSAM gets deleted from the instance. Admins and mods can do this, and they do this already.
                2. posters are identified. Admins and mods can do this, and might do this already. TO BE CLEAR, they can identify the users by IP address and user agent; that’s about it. The rest of it… is…
                3. “information turned over to law enforcement” … left up to law enforcement. “Hello police, I’m the owner of xyz.com and today a user at 23.43.23.22 posted CSAM on my site at this time. The user has been banned and we have given you all the information we have on this.” The cops can get a warrant for the ISP and go from there.
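
                Just to make those three steps concrete, here’s a rough sketch of what that looks like on the instance side (every function and field name here is invented for illustration, this is not Lemmy’s actual code):

                ```python
                # Hypothetical sketch only: remove_post(), build_le_report() and the report
                # fields are made up; a real instance has its own admin tooling.
                import json
                from datetime import datetime, timezone

                def remove_post(post_id: int) -> None:
                    """Stand-in for step 1: delete the content and ban the poster."""
                    print(f"post {post_id} removed and poster banned")

                def build_le_report(post_id: int, poster_ip: str, user_agent: str) -> str:
                    # Step 2: the only identifying data an instance really has is IP + user agent.
                    # Step 3: package it so it can be handed over to law enforcement.
                    return json.dumps({
                        "post_id": post_id,
                        "poster_ip": poster_ip,
                        "user_agent": user_agent,
                        "observed_at": datetime.now(timezone.utc).isoformat(),
                    }, indent=2)

                def handle_csam_post(post_id: int, poster_ip: str, user_agent: str) -> str:
                    remove_post(post_id)                                    # step 1
                    return build_le_report(post_id, poster_ip, user_agent)  # steps 2 and 3

                print(handle_csam_post(1234, "23.43.23.22", "Mozilla/5.0"))
                ```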

                Oh yeah, Tor. Well, we’re getting deep off topic here, but go on YouTube and watch some DEF CON talks about how Tor users are identified. You may think you’re slick going on Tor, but then you open up Facebook or check your Gmail and it’s all over.

                Either way, I’m not speaking to the success of catching CSAM posters; I’m only speaking to what the admins are probably already doing.

                Also even if the community is turned back on, what’s stopping someone from doing it again? This time maybe a whole instance?

                Nothing, which is why social media sites dedicate teams of mods to handle this exact thing. It’s a cat and mouse game. But not playing the game and not trying to remove this content means the admins face legal trouble.

                It’s simply too easy to spam child porn everywhere. One instance of CP is much easier to moderate than thousands.

                This makes no sense to me. What was your point? Yes, one image is easier to delete than thousands of images. I don’t see how that plays into any of what we have been discussing though.

                • newIdentity · 1 point · 1 year ago (edited)

                  I don’t want to write a long text, so here is the short version: these automated tools are not perfect, but they don’t have to be. They just have to be good enough to block most of it. The rest can be done through manual labor, which people have also done voluntarily on Reddit. Reporting needs to get easier, and you can slow spammers down by rate-limiting them.
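
                  To show what I mean by “good enough”, here’s a toy sketch of that first automated pass plus a manual-review queue (plain SHA-256 as a stand-in for a real perceptual hash like PhotoDNA, and the blocklist is made up; real hash lists come from organizations like NCMEC):

                  ```python
                  # Toy sketch: automated hash-matching first, humans handle what gets reported.
                  import hashlib

                  KNOWN_BAD_HASHES = {"0" * 64}      # placeholder; not a real hash list
                  manual_review_queue: list[str] = []

                  def check_upload(image_bytes: bytes) -> str:
                      """Automated first pass: block anything matching a known-bad hash."""
                      digest = hashlib.sha256(image_bytes).hexdigest()
                      return "blocked" if digest in KNOWN_BAD_HASHES else "accepted"

                  def report_upload(filename: str) -> None:
                      """User reports land in a queue that human mods work through."""
                      manual_review_queue.append(filename)

                  print(check_upload(b"example image bytes"))  # unknown image passes the filter
                  report_upload("photo.jpg")                   # the rest is manual labor
                  print(manual_review_queue)
                  ```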

                  To be clear, I don’t have anything against temporarily shutting down a community filled with CP until everything is cleared up. But we need better solutions to make it easier in the future, so it doesn’t need to go this far and stays more manageable.

                  I’m sorry for the grammatical mistakes. I’m really tired right now and should probably go to bed.

                  • 𝕯𝖎𝖕𝖘𝖍𝖎𝖙 · 2 points · 1 year ago

                    I agree with most of what you’ve written, just one small issue:

                    The rest can be done through manual labor, which people have also done voluntarily on Reddit.

                    You’re probably right that some volunteers handle this content on Reddit. By this I mean: mods are volunteers, and sometimes mods handle this content.

                    My point, however, has been that big social media sites can’t rely on volunteers to handle this content. Reddit, along with Facebook and other major sites (but not Twitter, as Elon just removed this team), has a team of people who pick up the slack where the automated tools leave off. These people are paid, and usually not well, but enough so that it’s their job to remove this content (as opposed to it being a volunteer gig they do on the side). I’ll say that again: these people are paid to look at photographs of CSAM and other psychologically damaging content all day, usually for pennies.

                    But we need better solutions to make it easier in the future, so it doesn’t need to go this far and stays more manageable.

                    I fully agree with you. It’s just that, as a dev who has toyed around with AI and has been working on code for decades now, I don’t see a clear path forward. I am also not an expert in these tools, so I can’t speak specifically to how well they work. I can only say that they don’t work so well that humans are not required. Ideally, we want tools that work so well humans won’t be required (as it’s a psychologically damaging job), but at the same time, we don’t want legit users to be misflagged either.

                    The other day there was a link posted to hackerne.ws by a YouTube creator who keeps needing to re-enable comments on her shorts. The YouTube algorithm keeps disabling comments on her shorts because it thinks there’s a child in the video. It’s only ever been her in the videos, and while she is petite in stature, she’s also 30 years old. She’s been reaching out to YouTube for 3-4 years now and they still haven’t fixed the issue. For each video she uploads, she needs to turn on comments manually, which affects her engagement. While nowhere near comparable to the sin of CSAM, it’s also not right for a legit user to be penalized just because of the way she looks, because the algorithm cannot properly determine her age.

                    YouTube is a good example of how difficult it is to moderate something like this. A while ago, YouTube revealed that “a year’s worth of content is uploaded every minute” (or maybe it was every hour? still)… Consider how many people would be required to watch every minute of uploaded video, multiplied by each minute in their day. YouTube requires automated tools, and community reporting, and likely also has a team of mods. And it’s still imperfect.
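
                    Quick back-of-the-envelope, taking that “year’s worth every minute” figure at face value:

                    ```python
                    # A year is about 525,600 minutes, so each real-time minute would bring in
                    # that many minutes of video: roughly half a million people would need to
                    # watch nonstop just to screen everything once.
                    minutes_per_year = 365 * 24 * 60
                    print(minutes_per_year)  # 525600
                    ```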

                    So to be clear, you’re not wrong; it’s just a very difficult problem to solve.