• Dave. · 6 points · 1 month ago

    I’m guessing something like:

    Robots.txt: Do not index this particular area.

    Main page: invisible link to particular area at top of page, with alt text of “don’t follow this, it’s just a bot trap” for screen readers and such.

    Result: any access to said particular area equals insta-ban for that IP. Maybe just for 24 hours so nosy humans can get back to enjoying your site.
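
    A minimal sketch of that setup (the /trap/ path and the exact markup are just example assumptions):

    ```
    # robots.txt: well-behaved crawlers are told to stay out of the trap area
    User-agent: *
    Disallow: /trap/
    ```

    ```html
    <!-- main page: a link sighted visitors never see, but screen readers can still announce -->
    <a href="/trap/" style="position: absolute; left: -9999px;">
      don’t follow this, it’s just a bot trap
    </a>
    ```

    Anything that then actually requests /trap/ goes straight onto the ban list.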

    • @doodledup · 2 points · 1 month ago

      Problem is that you’re also blocking search engines from indexing your site, no?

      • ɐɥO · 8 points · 1 month ago

        Nope. Search engines should follow robots.txt.

        • @doodledup · -2 points · 1 month ago

          You misunderstand. Sometimes you want your public website to be indexed by search engines but not scraped to train the next LLM. If you disallow scraping altogether, then you won’t be indexed at all. That can be a problem.
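
          If the goal is exactly that split, robots.txt can already express it. A sketch (GPTBot, CCBot and Google-Extended are commonly documented AI-crawler tokens, but the exact list here is only an example):

          ```
          # robots.txt: keep search indexing, opt out of AI-training crawlers
          User-agent: GPTBot
          Disallow: /

          User-agent: CCBot
          Disallow: /

          User-agent: Google-Extended
          Disallow: /

          # everyone else (normal search engine bots included) may crawl
          User-agent: *
          Disallow:
          ```

          That only helps against crawlers that actually honour robots.txt, which is exactly the gap the trap is meant to cover.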

          • ɐɥO · 7 points · 1 month ago

            I know that. That’s why I don’t ban everyone, only those who don’t follow the rules in my robots.txt. All “sane” search engine crawlers should follow those, so it’s no problem.

      • mox · 5 points · 1 month ago (edited)

        > Robots.txt: Do not index this particular area.

        > Problem is that you’re also blocking search engines from indexing your site, no?

        No. That’s why they wrote “this particular area”.

        The point is to have an area of the site that serves no purpose other than to catch bots that ignore the rules in robots.txt. Legit search engine indexers will respect directives in robots.txt to avoid that area; they will still index everything else. Bad bots will ignore the directives, index the forbidden area anyway, and by doing so, reveal themselves in the server logs.

        That’s the trap, aka honeypot.
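
        A rough sketch of the log-watching side (the log path, log format, trap path, and the ipset-based 24-hour ban are all assumptions about the server setup):

```python
#!/usr/bin/env python3
"""Ban IPs that requested the trap area disallowed in robots.txt.

Everything here (log path, log format, trap path, ban mechanism) is an
assumed example setup, not a drop-in tool.
"""
import re
import subprocess
import sys

TRAP_PATH = "/trap/"                       # the area disallowed in robots.txt
ACCESS_LOG = "/var/log/nginx/access.log"   # assumes a combined-format log
BAN_SECONDS = 24 * 60 * 60                 # lift the ban after 24 hours

# In the combined log format the first field is the client IP and the
# request line looks like "GET /path HTTP/1.1".
LINE_RE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]*\] "(?:GET|POST|HEAD) (\S+)')


def offending_ips(log_path):
    """Return the set of client IPs that requested the trap path."""
    ips = set()
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = LINE_RE.match(line)
            if match and match.group(2).startswith(TRAP_PATH):
                ips.add(match.group(1))
    return ips


def ban(ip):
    """Add the IP to an ipset named 'bot-trap' (assumed to exist with timeout support)."""
    subprocess.run(
        ["ipset", "-exist", "add", "bot-trap", ip, "timeout", str(BAN_SECONDS)],
        check=False,
    )


if __name__ == "__main__":
    for ip in sorted(offending_ips(ACCESS_LOG)):
        print(f"banning {ip} for 24h", file=sys.stderr)
        ban(ip)
```

        Run that periodically (cron or a systemd timer) and you get roughly the “insta-ban for 24 hours” behaviour described above.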