• asudox
    5 • 4 months ago

    Not sure if that is effective at all. Why would a crawler check the robots.txt if it’s programmed to ignore it anyways?

    • ɐɥO
      16 • 4 months ago

      Because many crawlers seem to explicitly crawl “forbidden” sites
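      The trap being described can be sketched in a few lines: publish a robots.txt that disallows a path no legitimate link points to, then flag any client that requests it anyway. This is a minimal, hypothetical illustration — the `/secret-trap/` path and the `handle_request` helper are invented names, not any particular server's API.

```python
# Hypothetical honeypot sketch: robots.txt advertises a path that no
# polite crawler should ever visit; any client that requests it is
# assumed to be ignoring robots.txt and gets banned.
ROBOTS_TXT = """User-agent: *
Disallow: /secret-trap/
"""

banned_ips: set[str] = set()

def handle_request(ip: str, path: str) -> str:
    """Return a response label; ban clients that enter the trap path."""
    if ip in banned_ips:
        return "403 Forbidden"
    if path.startswith("/secret-trap/"):
        banned_ips.add(ip)  # client ignored the Disallow rule: ban it
        return "403 Forbidden"
    if path == "/robots.txt":
        return ROBOTS_TXT
    return "200 OK"
```

      A crawler that obeys robots.txt only ever fetches `/robots.txt` and allowed pages; one that deliberately walks the “forbidden” list trips the trap on its first visit and is blocked from then on.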

    • @Crashumbc
      3 • 4 months ago

      Google and script kiddies copying code…