I often find myself explaining the same things in real life and online, so I recently started writing technical blog posts.

This one is about why it was a mistake to call 1024 bytes a kilobyte. It’s about a 20-minute read, so thank you very much in advance if you find the time to read it.

Feedback is very much welcome. Thank you.

      • @smokin_shinobi · 6 points · 10 months ago

        This is such a strange post and comment section to me. Computers work because of binary.

        • @[email protected] · 1 point · 10 months ago

          Which nobody uses in the industry, because we all know that storage uses base-2 prefixes.

        • @[email protected] · 4 points · 10 months ago (edited)

          It’s actually a decimal vs. binary thing.

          1000 and 1024 take the same number of bytes to store, so 1024 makes more sense to a computer.

          Nothing to do with metric, as computers don’t use that. Also not really to do with units.
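
          A minimal Python sketch of the point being made here (the illustration is mine, not the commenter’s): both values need two bytes of storage, but 1024 is a single set bit in binary, which is why power-of-two sizes fall out of binary addressing so naturally.

          ```python
          # Compare the binary forms of 1000 and 1024. Both fit in
          # two bytes, but only 1024 is a clean power of two: a
          # single 1 followed by zeros.
          for n in (1000, 1024):
              bits = n.bit_length()
              print(f"{n:>4} = 0b{n:b} ({bits} bits, {(bits + 7) // 8} bytes minimum)")
          ```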

    • Hyperreality · 5 points · 10 months ago (edited)

      It wasn’t/isn’t. It’s nothing to do with Americans. It was (and often still is) because of binary, as the article mentions.

      2 4 8 16 32 64 128 256 512 1024.

      So no, kilo is not always a thousand when dealing with computers.
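
      To make the ambiguity concrete, here is a short Python sketch (my own illustration, not from the article): the same byte count reads differently depending on whether you divide by powers of 1000 (SI kilo/mega/giga) or powers of 1024 (IEC kibi/mebi/gibi), which is why a drive sold as “1 TB” shows up as roughly 931 GiB.

      ```python
      # One terabyte as a drive maker counts it: decimal prefixes.
      size_bytes = 1 * 1000**4

      # The same byte count under each convention.
      print(size_bytes / 1000**3, "GB")   # 1000.0 GB  (SI: giga = 1000**3)
      print(size_bytes / 1024**3, "GiB")  # ~931.3 GiB (IEC: gibi = 1024**3)
      ```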

          • @[email protected] · 2 points · 10 months ago

            I’ve honestly just come to the conclusion that being an asshole about the fact that other countries exist is just the continental pastime of Europe.

            Like, Americans get the most of it, but they’re like this toward people from other European countries too.