I often find myself explaining the same things in real life and online, so I recently started writing technical blog posts.

This one is about why it was a mistake to call 1024 bytes a kilobyte. It’s about a 20-minute read, so thank you very much in advance if you find the time to read it.

Feedback is very much welcome. Thank you.

  • @[email protected]
    link
    fedilink
    English
    4
    edit-2
    11 months ago

    It’s actually a decimal vs. binary thing.

    1000 and 1024 take the same number of bytes to store, but 1024 is a power of two (2^10), a round number in binary, so it makes more sense to a computer.

    Nothing to do with metric, since computers don’t use it. It’s also not really about units.
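
    To make the practical difference concrete, here is a minimal Python sketch (my own example, not from the post or the comment) comparing the same drive capacity under decimal (SI) and binary (IEC) prefixes:

    ```python
    # Decimal (SI) vs. binary (IEC) prefixes for the same number of bytes.
    DECIMAL_GB = 10**9   # 1 GB  = 1,000,000,000 bytes (SI "giga")
    BINARY_GIB = 2**30   # 1 GiB = 1,073,741,824 bytes (IEC "gibi")

    drive_bytes = 500 * DECIMAL_GB  # a drive marketed as "500 GB"

    print(f"{drive_bytes / DECIMAL_GB:.2f} GB")   # 500.00 GB
    print(f"{drive_bytes / BINARY_GIB:.2f} GiB")  # 465.66 GiB, roughly what an
                                                  # OS using binary prefixes shows
    ```

    The gap between 500.00 and 465.66 is the 1000-vs-1024 discrepancy compounded three times, once per prefix level.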