• @[email protected]
    86 days ago

    Well, actually, when I signed up for my Lemmy account(s) I mostly had to give a short self-description and wait 1-2 days for approval, which gave me the feeling that admins/mods actually go through applications by hand. Now, if people tried to spam the network with bots, they would have to create a lot of accounts (probably from a few IP addresses). Mods would then see that all these new accounts come from the same few IP addresses, which might make them easier to recognize as bots.
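    The IP-clustering heuristic described above could be sketched roughly like this (a minimal illustration only; the `flag_suspicious_ips` helper, the `applications` structure, and the threshold are all hypothetical, not part of Lemmy's actual moderation tooling):

```python
from collections import Counter

def flag_suspicious_ips(applications, threshold=3):
    """Group pending signup applications by source IP and flag any IP
    that submitted an unusually high number of them (hypothetical helper)."""
    counts = Counter(app["ip"] for app in applications)
    return {ip for ip, n in counts.items() if n >= threshold}

# Example applications (fabricated data for illustration)
apps = [
    {"user": "alice", "ip": "203.0.113.5"},
    {"user": "bot1",  "ip": "198.51.100.7"},
    {"user": "bot2",  "ip": "198.51.100.7"},
    {"user": "bot3",  "ip": "198.51.100.7"},
]

print(flag_suspicious_ips(apps))  # {'198.51.100.7'}
```

    Of course, as the reply below this comment points out, a determined spammer can spread signups across many IPs, so this only catches the laziest bot farms.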

    • FaceDeer
      126 days ago

      Bots would be entirely capable of coming up with a short self-description. Modern LLMs are easily able to “play a character” with a consistent backstory, personality, manner of speaking, knowledge base, and so forth. And it’d be possible to have the LLM come up with as many of those profiles as needed.

      Basically, the Turing Test has been “solved” at this point, at least as far as online personas go. These comments I’m writing to you right now could be bot-generated. I could literally be a bot. There’s no way to tell.

      And in any event, not all Fediverse instances are as picky. Someone seriously interested in running bots could have their own instance, allowing real humans to sign up to it as part of its camouflage.

        • FaceDeer
          56 days ago

          If they’re the one running the instance it won’t matter.

            • FaceDeer
              35 days ago

              Yes. As I explained above, it would be trivial for such an instance to masquerade as a normal one: allow real people to join, and for all outward purposes it would be a normal instance. How would you detect it as being otherwise?