• @Asudox
    5 • 30 days ago

    Not sure that is effective at all. Why would a crawler check robots.txt if it’s programmed to ignore it anyway?

    • ɐɥO
      16 • 30 days ago

      Because many crawlers seem to explicitly crawl “forbidden” sites.

    • @Crashumbc
      3 • 29 days ago

      Google and script kiddies copying code…
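For context on the point being debated: checking robots.txt is entirely voluntary. A minimal sketch using Python’s standard `urllib.robotparser` (the hypothetical paths and domain are illustrative) shows what a polite crawler does before fetching a URL; a hostile crawler simply skips this step, or reads the `Disallow` lines as a list of interesting targets.

```python
from urllib.robotparser import RobotFileParser

# A sample robots.txt, as a site might serve it.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A well-behaved crawler asks before fetching; nothing enforces the answer.
print(parser.can_fetch("*", "https://example.com/private/page"))  # False
print(parser.can_fetch("*", "https://example.com/public/page"))   # True
```

Note that the parser only reports what the site *asks for*; honoring the result is left entirely to the crawler’s code, which is why scrapers that ignore it face no technical obstacle.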