• @petunia · 45 points · 7 months ago

    Speaking from experience, they could fix their spam and abuse woes very easily by just closing new signups or restricting them in some way. The simplest would be invite-only (a built-in feature of Mastodon), or restricting the signup page based on an IP range whitelist/blacklist.

    EDIT: Their domain has been reinstated, and they disabled open signups. New registrations now require moderator approval https://pawoo.net/@pawoo_support/111249170584706318

    :pawoo: Announcement! Thank you for always using Pawoo. Due to server congestion, new registrations will now require approval by a moderator. Thank you very much for your cooperation.

    • ShittyKopper [they/them] · 29 points · 7 months ago

      you mean they should turn off the user-count-go-uppinator? but how would the fediverse grow like that?? /s

      (sorry i just have opinions on large instances)

      • @[email protected] · 20 points · 7 months ago

        The nature of a site like reddit (and lemmy and all the other alternatives) means a large userbase is necessary. Not at the cost of tolerating CSAM, but everyone trying to gatekeep lemmy doesn’t realise that we’ll die a slow death unless the fediverse grows by at least 10-20x.

        • @themusicman · 4 points · 7 months ago

          Doesn’t have to be all on one instance though. Instances shouldn’t grow beyond what they can admin and moderate.

      • @[email protected] · 1 point · 7 months ago

        I swear, Lemmy users act like they’re completely in favor of an instance with 40 users and a submission every other week, seeing scale as inherently bad.

        Lemmy will fucking roll over and die within the year.

      • @petunia · 9 points · 7 months ago

        Absolutely brain-dead speculation based on literally nothing. Complicit in what??? The current owner is a very public figure, so they gain nothing and have everything to lose. It’s just pure incompetence and mismanagement.

  • @soren446 · 30 points · 3 months ago

    deleted by creator

  • Dame · 12 points · 7 months ago

    Why are there people downvoting people commenting about not wanting CSAM?

  • @Darkhoof · 3 points · 7 months ago

    What does the acronym CSAM mean? I’m not a native English speaker.

    • Clay_pidgin · 14 points · 7 months ago

      “Child Sexual Abuse Material”. It’s an awkward acronym that’s mostly overtaken “Child Pornography”.

    • @endhits · 14 points · 7 months ago

      CSAM is Child Sexual Abuse Material.

      People prefer this term over CP because the word “porn” is considered too soft. Porn is generally a consensual adult medium, made by adults for adults. CP is not that; it’s first and foremost harm to a child.

      • @[email protected] · 13 points · 7 months ago

        Another reason is that “CP” got jokingly coopted by abusers in the form of various dogwhistles (e.g.: “cheese pizza”). It made more sense to adopt a new acronym rather than try to uphold any sense of decorum while sharing ownership of the term w/ edgelords & predators.

        • Amju Wolf · 7 points · 7 months ago

          I mean, it’s more accurate and unambiguous. Definitely preferable overall, IMO.

      • @[email protected] · 6 points · 7 months ago

        That’s strange to me, because Child Porn sounds revolting to me but CSAM sounds like something I can treat with aspirin.

        • @endhits · 1 point · 7 months ago

          CSAM is a pretty sterile term so I can see where that feeling comes from.

    • @[email protected] · 15 points · 7 months ago

      Nah. They’re lax about loli. Which, as distasteful as it is, does not involve any harm to actual children. They do go after actual CSAM.

      • Dame · 2 points · 7 months ago

        Not really, they’re lax about CSAM and didn’t even have laws against it until 2016-2017. Even then, the laws are lax.

      • @nandeEbisu · 0 points · 7 months ago

        They go after it mainly to appease external forces, like other countries objecting to it, but people who are convicted often get very light sentences.