• Primarily0617
    70
    11 months ago

    you have to limit it somewhere or you’re opening yourself up for a DoS attack

    password hashing algorithms are literally designed to be resource intensive

    • @confusedbytheBasics
      5
      11 months ago

      Edited to remove untrue information. Thanks for the corrections everyone.

      • Primarily0617
        13
        11 months ago

        Incorrect.

        They’re designed to be resource-intensive to calculate, which makes them harder to brute force, and to be impossible to reverse.

        Some literally have a parameter which acts as a sliding scale for how difficult they are to calculate, so that you can increase security as hardware power advances.
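A minimal sketch of that sliding-scale idea, using Python's standard-library PBKDF2 as a stand-in for bcrypt's cost factor or Argon2's time_cost (the iteration counts below are illustrative, not a recommendation):

```python
import hashlib
import os

# Illustrative work factors: raising the iteration count scales up the
# work per password guess, so operators can increase it over time as
# hardware gets faster.
password = b"correct horse battery staple"
salt = os.urandom(16)

weak = hashlib.pbkdf2_hmac("sha256", password, salt, 10_000)
strong = hashlib.pbkdf2_hmac("sha256", password, salt, 600_000)

assert weak != strong                  # different work factors, different digests
assert len(weak) == len(strong) == 32  # output size is fixed either way
```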

        • @confusedbytheBasics
          1
          11 months ago

          I was incorrect, but I still disagree with you. The hashing function is not designed to be resource intensive but to have a controlled cost. Key stretching by adding rounds repeats that controlled cost to make computing the final hash more expensive, but the length of the message passed to the function isn’t really an issue: after the first round, it doesn’t matter whether the message was 10, 128, or 1024 bytes, because each subsequent round only receives exactly the number of bytes the one-way hash outputs.
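That round structure can be sketched with a toy key-stretching loop (illustration only, not a real KDF):

```python
import hashlib

def stretched(message: bytes, rounds: int) -> bytes:
    # Round 1 consumes the full message; every later round consumes
    # only the fixed 32-byte digest of the previous round.
    digest = hashlib.sha256(message).digest()
    for _ in range(rounds - 1):
        digest = hashlib.sha256(digest).digest()
    return digest

# Input length only affects the first round; the per-round work
# afterwards is identical for a 10-byte and a 1024-byte password.
short = stretched(b"x" * 10, rounds=1_000)
long_ = stretched(b"x" * 1024, rounds=1_000)
assert len(short) == len(long_) == 32
```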

          • Primarily0617
            2
            11 months ago

            It depends on the hash. E.g., OWASP only recommends 2 iterations of Argon2id as a minimum.

            Yes, a hashing function is designed to be resource intensive, since that’s what makes it hard to brute force. No, a hashing function isn’t designed to be infinitely expensive, because that would be insane. And yes, it’s still a bad idea to hand somebody a force multiplier like that if they want to run a denial-of-service attack.
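The usual mitigation for that force multiplier is a server-side length cap before the expensive hash ever runs. A sketch, with PBKDF2 standing in for whatever slow hash the server actually uses and a hypothetical cap value:

```python
import hashlib
import os

MAX_PASSWORD_BYTES = 1024  # illustrative cap, not a standard value

def hash_password(password: bytes, salt: bytes) -> bytes:
    # Reject absurdly long inputs up front so a client can't use the
    # deliberately expensive KDF as a DoS force multiplier.
    if len(password) > MAX_PASSWORD_BYTES:
        raise ValueError("password too long")
    return hashlib.pbkdf2_hmac("sha256", password, salt, 100_000)

salt = os.urandom(16)
digest = hash_password(b"hunter2", salt)
assert len(digest) == 32

try:
    hash_password(b"A" * 1_000_000, salt)  # never reaches the KDF
    rejected = False
except ValueError:
    rejected = True
assert rejected
```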

            • @confusedbytheBasics
              1
              11 months ago

              I’m a bit behind on password specific hashing techniques. Thanks for the education.

              My background is more in general-purpose one-way hashing functions, where we want to calculate hashes quickly, without collisions, and using a consistent amount of resources.

              If the goal is to be resource intensive, why aren’t modern hashing functions designed to use even more resources? What’s the technical problem keeping Argon2 from being designed to eat even more cycles?

              • Primarily0617
                1
                11 months ago

                Argon2 has parameters that allow you to specify the execution time, the memory required, and the degree of parallelism.

                But at a certain point you get diminishing returns and you’re just wasting resources. It seems like a similar question to why not just use massive encryption keys.

      • @adambowles
        7
        11 months ago

        Hashes are one-way functions. You can’t get from the hash back to the input.

        • @taipan_snake
          5
          11 months ago

          Only if the hash function is designed well

        • @confusedbytheBasics
          2
          11 months ago

          True. I was all kinds of incorrect in my hasty typing. I’ll update it to be less wrong.

      • andrew
        3
        11 months ago

        Not true. Password hashing algorithms should be resource intensive enough to prevent brute-force calculation from being a viable route. This is why bcrypt stores a salt, a hash, and the current number of rounds. The number of rounds should increase as CPUs get faster, so that older hashes, which newer CPUs can break more effectively, don’t linger in the wild.
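To make the "stores a salt, a hash, and the number of rounds" point concrete, here is how a bcrypt string in modular crypt format breaks apart (the string below is a widely circulated documentation example, not a real credential):

```python
# Illustrative bcrypt output in modular crypt format:
# $2b$<cost>$<22-char salt><31-char hash>
example = "$2b$12$R9h/cIPz0gi.URNNX3kh2OPST9/PgBkqquzi.Ss7KIUgO2t0jWMUW"

_, version, cost, payload = example.split("$")
salt_part, hash_part = payload[:22], payload[22:]

assert version == "2b"
assert int(cost) == 12       # work factor: 2**12 key-setup rounds
assert len(salt_part) == 22  # 128-bit salt in bcrypt's base64 alphabet
assert len(hash_part) == 31  # 184-bit hash
```

Because the cost lives inside the stored string, a server can verify old hashes at their original cost while issuing new hashes at a higher one.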

        • @confusedbytheBasics
          2
          11 months ago

          I was incorrect about the goal being minimal resources. I should have written that the goal was to have controlled resource usage. The salt does not increase the expense of the hash function. Key stretching techniques like adding rounds increase the expense of reaching the final hash output, but they do not increase the expense of the hash function itself. Allowing password lengths of several thousand characters should not lead to a denial-of-service attack, but past a certain length it doesn’t materially increase security either.
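The "length stops mattering" claim can be checked directly: HMAC, the primitive inside PBKDF2, pre-hashes any key longer than its block size, so an over-long password collapses to one digest before the expensive iteration even starts. A small demonstration, assuming HMAC-SHA-256's 64-byte block size:

```python
import hashlib
import hmac

# Per RFC 2104, HMAC replaces a key longer than the block size with its
# hash, so a 4 KiB "password" costs one extra hash up front, not 4 KiB
# of extra work on every iteration.
long_password = b"p" * 4096
message = b"fixed input"

direct = hmac.new(long_password, message, hashlib.sha256).digest()
prehashed = hmac.new(hashlib.sha256(long_password).digest(),
                     message, hashlib.sha256).digest()

assert direct == prehashed  # the over-long key was reduced to one digest
```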

          • andrew
            2
            11 months ago

            I’m arguing semantics here but bcrypt is the hashing function. Per the Wikipedia article on bcrypt:

            bcrypt is a password-hashing function designed by Niels Provos and David Mazières, based on the Blowfish cipher and presented at USENIX in 1999.

            Blowfish being a symmetric encryption cipher, not a hashing function.

            Agreed on the rest, though. The hashing cost of a long password would not lead to DoS any more than the bandwidth cost of accepting that password, etc. It’s not the bottleneck. But there’s also no extra security beyond a point, so you might as well not bother with passwords that are too long.

            • @confusedbytheBasics
              2
              11 months ago

              Semantics aside it sounds like we are in agreement. Have another upvote. :)

              Why does upvoting feel better without a karma system? shrug