• tal
    1 year ago

    When talking about computers, a kilobyte was always 1024 bytes.

    The problem is that each time you go up another unit, the binary and decimal units diverge further.

    It rarely mattered much when you were talking about the difference between kibibytes and kilobytes. In the 1980s, with the sizes of memory and storage available, the difference was minor, so the decimal unit was a pretty good approximation for most things. But as we deal with larger amounts of data, the error becomes more significant.

    Decimal unit    Binary unit      Divergence
    kilobyte (kB)   kibibyte (KiB)    2.4%
    megabyte (MB)   mebibyte (MiB)    4.9%
    gigabyte (GB)   gibibyte (GiB)    7.4%
    terabyte (TB)   tebibyte (TiB)   10.0%
    petabyte (PB)   pebibyte (PiB)   12.6%
    exabyte (EB)    exbibyte (EiB)   15.3%
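    The divergence at each prefix level is just the ratio of 1024^n to 1000^n. A quick sketch (prefix names are for labeling only):

    ```python
    # Divergence between decimal (base-1000) and binary (base-1024)
    # units grows with each prefix level n.
    prefixes = ["kilo/kibi", "mega/mebi", "giga/gibi",
                "tera/tebi", "peta/pebi", "exa/exbi"]
    for n, name in enumerate(prefixes, start=1):
        divergence = (1024**n / 1000**n - 1) * 100
        print(f"{name}: {divergence:.1f}%")
    ```

    Because each step multiplies the ratio by another factor of 1.024, the error compounds rather than staying constant.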
    • Alien Surfer
      1 year ago

      This is exactly right. Divergence was small when sizes were small. Good point.