• @[email protected]

    You’re misunderstanding the terminology used, then.

    In information theory, “bit” doesn’t mean “bitrate” like you’d see in networks, but something closer to “compressed bitrate.”

    For example, let’s say I build a computer that only computes small sums, where the input is two integers from 0 to 127. However, this computer only understands spoken French, and it will ignore anything that isn’t a French number in that range. Information theory would say this machine receives 14 bits of information (two 7-bit numbers) and returns 8 bits (the sum can be anywhere from 0 to 254). The extra processing needed to understand French is overhead and is ignored for the purposes of calculating entropy.
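    A minimal sketch of that bit accounting in Python (assuming every input pair is equally likely, so the information content is just log2 of the number of possibilities; the variable names are mine):

    ```python
    import math

    # Input: two independent integers, each 0-127, so 128 * 128 equally
    # likely input pairs. A uniform choice among N options carries
    # log2(N) bits of information.
    input_bits = math.log2(128 * 128)        # 14.0 -> two 7-bit numbers

    # Output: the sum ranges from 0 to 254, i.e. 255 possible values,
    # which takes ceil(log2(255)) = 8 bits to represent.
    output_bits = math.ceil(math.log2(255))  # 8

    print(f"in: {input_bits:.0f} bits, out: {output_bits} bits")
    ```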

    The article also mentions that our brains take in billions of bits of sensory data, but that’s ignored in the calculation because we only care about the thought process (the useful computation), not all of the overhead of the system.

    • @Buffalox

      I think I was pretty clear about the understanding or comprehension part, which is not merely input/output.