• @[email protected]
      link
      fedilink
      English
      24 months ago

      The type of request is not relevant. It’s the cost of the request that’s the issue. We long ago stopped serving static HTML documents that can be cached; tons of requests now trigger complex searches or computations that are expensive server-side. This kind of behavior basically ruins the internet and pushes everything into walled gardens and behind logins.
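
      To make that contrast concrete, here’s a rough sketch in stdlib-only Python. The /page and /search routes and the in-memory corpus are made up for illustration, not anyone’s actual setup.

      ```python
      # Sketch only: hypothetical /page and /search routes, stdlib-only Python.
      # A static page is one memory read and can carry Cache-Control headers;
      # a search endpoint burns CPU proportional to the corpus on every hit.
      from http.server import BaseHTTPRequestHandler, HTTPServer
      from urllib.parse import urlparse, parse_qs

      STATIC_PAGE = b"<html><body>Pre-rendered, cacheable content</body></html>"
      CORPUS = [f"document number {i}" for i in range(100_000)]  # stand-in dataset

      class Handler(BaseHTTPRequestHandler):
          def do_GET(self):
              url = urlparse(self.path)
              if url.path == "/page":
                  # Near-free: the bytes are already in memory, and downstream
                  # caches may serve them without touching this process at all.
                  self.send_response(200)
                  self.send_header("Cache-Control", "public, max-age=3600")
                  self.end_headers()
                  self.wfile.write(STATIC_PAGE)
              elif url.path == "/search":
                  # Expensive: every request scans the corpus (or hits a database),
                  # so the cost scales with request volume and can't be cached away
                  # unless the exact same query repeats.
                  query = parse_qs(url.query).get("q", [""])[0]
                  hits = [doc for doc in CORPUS if query and query in doc]
                  self.send_response(200)
                  self.end_headers()
                  self.wfile.write(f"{len(hits)} hits".encode())
              else:
                  self.send_error(404)

      if __name__ == "__main__":
          HTTPServer(("127.0.0.1", 8080), Handler).serve_forever()
      ```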

      • @Olgratin_Magmatoe · 4 months ago

        It has nothing to do with a sysadmin. It’s impossible for a given request to require zero processing power, so there will always be an upper limit to how many GET requests can be handled, even if each request only needs a small amount of processing power.

        For a business it’s probably not a big deal, but for a self-hosted site it can quickly become a problem.
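
        As a rough back-of-envelope (the numbers here are assumptions, not measurements):

        ```python
        # Back-of-envelope only: every number below is an assumption for illustration.
        cores = 4                          # small self-hosted box
        cpu_seconds_per_request = 0.0005   # 0.5 ms of CPU even for a "cheap" GET

        max_requests_per_second = cores / cpu_seconds_per_request
        print(f"Ceiling: ~{max_requests_per_second:,.0f} requests/second")

        # A scraper fleet sending more than that saturates the machine,
        # no matter how cheap each individual request is.
        scraper_rps = 20_000
        print("Overloaded!" if scraper_rps > max_requests_per_second else "Fine (for now)")
        ```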

        • @[email protected]
          link
          fedilink
          English
          04 months ago

          Caches can be configured locally to use near-zero processing power, or moved to the last mile so they use zero processing power on your hardware.
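
          Roughly what I mean by the local half of that (expensive_render is a hypothetical stand-in for whatever costly work a GET would normally trigger; the last-mile half is just a Cache-Control header that lets a CDN or the browser answer repeat requests without touching your hardware):

          ```python
          # Sketch only: expensive_render() stands in for a template render,
          # database aggregation, or other costly work behind a GET.
          from functools import lru_cache

          @lru_cache(maxsize=1024)
          def expensive_render(path: str) -> str:
              # Imagine the expensive computation happening here.
              return f"<html><body>rendered {path}</body></html>"

          # The first request for a path pays the full cost; repeats are a
          # dictionary lookup, which is the "near-zero processing power" case.
          expensive_render("/blog/post-1")      # slow path
          expensive_render("/blog/post-1")      # served from the local cache
          print(expensive_render.cache_info())  # hits=1, misses=1
          ```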

          • @Olgratin_Magmatoe · 4 months ago (edited)

            Near-zero isn’t zero, though. And not everyone is using caching.

            • @[email protected]
              link
              fedilink
              English
              04 months ago

              Right, that’s why I said you should fire your sysadmin if they aren’t caching, or can’t manage to get the cache down to zero load for static content served in response to simple GET requests.

              • @Olgratin_Magmatoe · 4 months ago

                Not every GET request is simple enough to cache, and not everyone is running something big enough to need a sysadmin.