Thanks to @[email protected] for the links!
Here’s a link to Caltech’s press release: https://www.caltech.edu/about/news/thinking-slowly-the-paradoxical-slowness-of-human-behavior
Here’s a link to the actual paper (paywall): https://www.cell.com/neuron/abstract/S0896-6273(24)00808-0
Here’s a link to a preprint: https://arxiv.org/abs/2408.10234
Bit in this context refers to the [Shannon](https://en.wikipedia.org/wiki/Shannon_(unit)) from information theory. 1 bit of information (that is, 1 shannon) is the amount of information you receive from observing an event with a 50% chance of occurring. 10 bits would be the amount of information learned from observing an event with about a 0.1% chance (1 in 1024) of occurring. So 10 bits in this context is actually not that small a number.
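To make the scale concrete, here's a quick Python sketch of the self-information formula, -log2(p), which is where those numbers come from:

```python
import math

def shannons(p: float) -> float:
    """Self-information, in shannons (bits), of observing an event with probability p."""
    return -math.log2(p)

print(shannons(0.5))       # a fair coin flip: 1 bit
print(shannons(1 / 1024))  # a 1-in-1024 (~0.1%) event: 10 bits
```

Note how the bits grow only logarithmically: every extra bit halves the probability of the event, so 10 bits already corresponds to a quite rare observation.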
The paper gives specific numbers for specific contexts, too. It’s a helpful illustration for these concepts:
A 3x3 Rubik’s cube has about 2^65 possible permutations, so specifying the configuration of a Rubik’s cube takes about 65 bits of information. In the world record for blind solving, where the solver examines the cube, puts on a blindfold, and then solves it blindfolded, the solver inspected the cube for just 5.5 seconds, so those 65 bits were acquired at a rate of about 11.8 bits/s.
Another memory contest has people memorize strings of binary digits for 5 minutes and then recall them. The world record is 1467 digits, which is exactly 1467 bits; dividing by 5 minutes (300 seconds) gives a rate of about 4.9 bits/s.
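The arithmetic behind both rates is easy to check. A quick Python sketch (the permutation count is the standard figure for a 3x3 cube, roughly 4.3 x 10^19):

```python
import math

# The cube's state space: 43,252,003,274,489,856,000 legal permutations,
# which is roughly 2^65, i.e. about 65 bits to pin down one configuration.
cube_states = 43_252_003_274_489_856_000
cube_bits = math.log2(cube_states)

# World-record blindfold inspection time, as quoted above: 5.5 seconds.
print(f"cube: {cube_bits:.1f} bits / 5.5 s = {cube_bits / 5.5:.1f} bits/s")

# Binary-digit memorization: 1467 digits in 5 minutes, 1 bit per digit.
print(f"binary: 1467 bits / 300 s = {1467 / 300:.2f} bits/s")
```

Both come out within an order of magnitude of each other, which is the paper's point: wildly different tasks converge on single-digit-to-low-double-digit bits per second.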
The paper doesn’t talk about how the human brain is more optimized for some tasks than others. I definitely believe that the brain’s capacity for visual processing, likely aided by the preprocessing that happens subconsciously (the direct perception of visual information), is much more efficient and capable than plain memorization. So I’m still skeptical of a blanket 10 bits/s rate for all types of thinking, but I can see how they got the number.
Their model seems to be heavily focused on visual observation and conscious problem solving, which ignores all the other things the brain is doing at the same time: keeping the body alive, processing emotions, maintaining homeostasis for several systems, etc.
These all require interpreting and sending information from/to other organs, and most of it is subconscious.
It’s a fair metric IMO.
We typically judge supercomputers in FLOPS (floating-point operations per second).
We don’t take into account any of the compute power required to keep the machine powered, keep it cool, operate peripherals, etc., even though all of that is happening in the background. Heck, FLOPS doesn’t even really measure memory, storage, power, number of cores, clock speed, architecture, or any other useful attribute of a computer.
This is just one metric.