We have paused all crawling as of Feb 6th, 2025 until we implement robots.txt support. Stats will not update during this period.
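Honoring robots.txt is straightforward with Python's standard library; a minimal sketch (the `StatsBot` user-agent name and the sample rules are illustrative, not this site's actual crawler):

```python
# Minimal sketch: check robots.txt rules before fetching a URL.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# In a real crawler you would set_url() and read() the site's robots.txt;
# here we parse sample rules directly for illustration.
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

print(rp.can_fetch("StatsBot", "https://example.com/private/page"))  # False
print(rp.can_fetch("StatsBot", "https://example.com/public"))        # True
```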

  • @[email protected]
    link
    fedilink
    English
    7
    edit-2
    3 days ago

    It is not possible to reliably detect bots. Attempting to do so will invariably lead to false positives, denying access to your content for what are usually the most at-risk and marginalized folks.

    Just implement a cache and forget about it. If read-only content is causing you too much load, you’re doing something terribly wrong.
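The caching advice above can be sketched as a simple read-through cache with a time-to-live; `TTLCache` and `render_page` are hypothetical names, not any real framework's API:

```python
# Minimal sketch: a read-through TTL cache for rendered read-only pages.
import time

class TTLCache:
    def __init__(self, ttl_seconds=60):
        self.ttl = ttl_seconds
        self.store = {}  # key -> (expires_at, value)

    def get_or_render(self, key, render):
        now = time.monotonic()
        hit = self.store.get(key)
        if hit and hit[0] > now:
            return hit[1]          # fresh cached copy: skip the backend
        value = render()           # only cold or expired keys hit the backend
        self.store[key] = (now + self.ttl, value)
        return value

calls = 0
def render_page():
    global calls
    calls += 1
    return "<html>page body</html>"

cache = TTLCache(ttl_seconds=60)
cache.get_or_render("/post/1", render_page)
cache.get_or_render("/post/1", render_page)
print(calls)  # 1: the second request was served from cache
```

With a cache like this in front of read-only pages, repeated hits (human or bot) cost one render per TTL window rather than one per request.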

    • Billegh
      2 days ago

      While I agree with you, the quantity of robots has greatly increased of late. While still not as numerous as human users, they hit every link and wreck your caches by not focusing on hotspots the way humans do.

        • Billegh
          2 days ago

          Sure thing! Help me pay for it?

    • @[email protected]
      link
      fedilink
      English
      0
      edit-2
      2 days ago

      False positives? Meh, who cares … that’s what appeals are for. Real people should realize it before too long.

      • @[email protected]
        link
        fedilink
        English
        22 days ago

        Every time I tried to appeal, I either got no response or met someone who had no idea what I was talking about.

        Occasionally a bank fixes the problem for a week. Then it’s back again.