• @nelly_man

    Bit in this context refers to the [Shannon](https://en.wikipedia.org/wiki/Shannon_(unit)) from information theory. 1 bit of information (that is, 1 shannon) is the amount of information you receive from observing an event with a 50% chance of occurring. 10 bits would be equivalent to the amount of information learned from observing an event with about a 0.1% chance of occurring (1 in 1,024). So 10 bits in this context is actually not that small of a number.
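
    Concretely, the self-information of an event with probability p is -log2(p) shannons. A quick sketch in Python, just illustrating that formula (not anything from the paper):

    ```python
    from math import log2

    def self_information_bits(p: float) -> float:
        """Self-information (in bits, i.e. shannons) of an event with probability p."""
        return -log2(p)

    print(self_information_bits(0.5))       # 1.0  -> a fair coin flip carries 1 bit
    print(self_information_bits(1 / 1024))  # 10.0 -> a ~0.1% event carries 10 bits
    ```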

    • @GamingChairModel

      The paper gives specific numbers for specific contexts, too. It’s a helpful illustration of these concepts:

      A 3x3 Rubik’s cube has about 2^65 possible permutations (roughly 4.3 × 10^19), so the configuration of a Rubik’s cube is about 65 bits of information. The world record for blind solving, where the solver examines the cube, then puts on a blindfold and solves it without looking, had the solver examining the cube for 5.5 seconds, so those 65 bits were acquired at a rate of 11.8 bits/s.

      Another memory contest has people memorizing strings of binary digits for 5 minutes and then recalling them. The world record is 1467 digits, which is exactly 1467 bits; dividing by the 5 minutes (300 seconds) of memorization gives a rate of 4.9 bits/s.
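
      Just to sanity-check the arithmetic behind both figures (back-of-the-envelope Python, using only the numbers quoted above):

      ```python
      from math import log2

      # 3x3 Rubik's cube: number of reachable configurations
      cube_states = 43_252_003_274_489_856_000
      cube_bits = log2(cube_states)        # ~65.2 bits
      print(cube_bits / 5.5)               # ~11.9 bits/s (11.8 if you round to 65 bits)

      # Binary-digit memorization: 1467 bits in 5 minutes
      print(1467 / 300)                    # ~4.9 bits/s
      ```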

      The paper doesn’t talk about how the human brain is more optimized for some tasks than others, and I definitely believe that the brain’s capacity for visual processing, probably assisted by the preprocessing that happens subconsciously, makes direct perception of visual information much more efficient and capable than plain memorization. So I’m still skeptical of the blanket 10 bits/s rate for all types of thinking, but I can see how they got the number.

    • @[email protected]
      link
      fedilink
      English
      28 hours ago

      Their model seems to be heavily focused on visual observation and conscious problem solving, which ignores all the other things the brain is doing at the same time: keeping the body alive, processing emotions, maintaining homeostasis for several systems, etc.

      These all require interpreting and sending information from/to other organs, and most of it is subconscious.

      • @piecat

        It’s a fair metric IMO.

        We typically judge supercomputers in FLOPS: floating-point operations per second.

        We don’t take into account any of the compute power required to keep it powered, keep it cool, operate peripherals, etc., even if that is happening in the background. Heck, FLOPS doesn’t even really measure memory, storage, power, number of cores, clock speed, architecture, or any other useful attribute of a computer.
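
        For what it’s worth, the theoretical peak FLOPS figure is usually just cores × clock rate × FLOPs per cycle, which says nothing about memory, storage, or power. A toy sketch with entirely made-up numbers:

        ```python
        # Rough theoretical peak for a hypothetical machine (all numbers made up)
        nodes = 1_000            # compute nodes
        cores_per_node = 64      # CPU cores per node
        clock_hz = 2.5e9         # 2.5 GHz
        flops_per_cycle = 16     # e.g. wide SIMD FMA units

        peak = nodes * cores_per_node * clock_hz * flops_per_cycle
        print(f"{peak:.2e} FLOPS")   # ~2.56e+15, i.e. ~2.6 petaFLOPS
        ```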

        This is just one metric.