Thanks to @[email protected] for the links!
Here’s a link to Caltech’s press release: https://www.caltech.edu/about/news/thinking-slowly-the-paradoxical-slowness-of-human-behavior
Here’s a link to the actual paper (paywall): https://www.cell.com/neuron/abstract/S0896-6273(24)00808-0
Here’s a link to a preprint: https://arxiv.org/abs/2408.10234
It’s also been pointed out that they’re using ‘bit’ in a different sense than many people here assume: https://lemmy.world/comment/14152865
Oh boy.
Which is exactly what bit means.
Which is not bits, but the equivalent of one digit in base 10.
This just shows the normal interpretation of bits.
If it’s used as a unit of information, you need to specify it as bits of information. Which is NOT A FREAKING QUANTIZED unit!
And it just shows the complete uselessness of this piece of crap paper.
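To illustrate the point about ‘bits of information’ not coming in whole numbers, here’s a minimal sketch of Shannon entropy; the coin probabilities below are just made-up example values:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit per flip...
print(shannon_entropy([0.5, 0.5]))   # 1.0
# ...but a 90/10 biased coin carries only about 0.47 bits per flip,
# so information-theoretic bits are not restricted to whole numbers.
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```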
I’m interested in what you mean. Could you ELI5 why bits of information can’t be used here?
I suppose it can, but just calling it bits is extremely misleading. It’s like saying something takes 10 seconds, but only if you are traveling at 90% of the speed of light.
It’s such extremely poor terminology, and maybe the article is at fault and not the study, but it is presented in a way that is moronic.
Using this thermodynamics definition is not generally relevant to how thought processes work.
And using a word to mean something different from its usual meaning BEFORE pointing that out is very poor terminology.
And in this case made them look like idiots.
It’s really too bad, because if they had simply stated that we can only handle about 10 concepts per second, that would have been an entirely different matter, and one I actually agree is probably right. But that’s not bad IMO; it’s actually quite impressive! The exact opposite of what the headline suggests.
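For what it’s worth, converting between ‘concepts per second’ and ‘bits per second’ depends on how many alternatives each concept is chosen from. A minimal sketch with made-up numbers, assuming each item is an equally likely, independent choice (which real behaviour isn’t):

```python
import math

def bits_per_second(items_per_second, vocabulary_size):
    """Information rate if each item is an equally likely, independent
    choice from a fixed set of alternatives (a simplifying assumption)."""
    return items_per_second * math.log2(vocabulary_size)

# Hypothetical numbers, only to show how the units relate:
print(bits_per_second(10, 2))      # 10 binary choices/s         -> 10 bits/s
print(bits_per_second(10, 1000))   # 10 picks/s from 1000 options -> ~100 bits/s
```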
I get your argument now. Do note that this entropy is about information theory and not thermodynamics, so I concur that the Techspot article is at fault here.
Thanks. ;)
https://en.wikipedia.org/wiki/Information_theory
Meaning it’s based on thermodynamics.
And incidentally I disagree with both. Information theory assumes the universe is a closed system, which is a requirement for thermodynamics to work, which AFAIK is not a proven fact about the universe, and unlikely IMO.
The 2nd law of thermodynamics (entropy) is not a law but a statistical likelihood; the early universe does not comply with it, and the existence of life is also a contradiction of it.
I have no idea why these ideas are so popular outside their scope.
Information theory is an accepted field. The entropy in information theory is analogous to and named after entropy in thermodynamics, but it’s not actually just thermodynamics; it’s its own field of study. I know this because of all the debate around that correcthorsebatterystaple xkcd.
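For anyone curious how close the analogy actually is: the two formulas have the same shape and differ only by Boltzmann’s constant and a change of logarithm base, which is why the quantities are analogous but not the same thing. A quick sketch (the probability distribution is arbitrary):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

probs = [0.5, 0.25, 0.25]

# Shannon entropy, in bits (information theory):
H = -sum(p * math.log2(p) for p in probs)         # 1.5 bits
# Gibbs entropy, in joules per kelvin (statistical mechanics):
S = -K_B * sum(p * math.log(p) for p in probs)    # ~1.44e-23 J/K

# Same functional form: S equals k_B * ln(2) * H.
print(S, K_B * math.log(2) * H)
```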
I’m not sure if you are making a joke, or also making a point. But boy that XKCD is spot on. 😋 👍
I think thermodynamics works within its field, but it’s so widely abused outside that field that I’ve become sick of hearing about it from people who just parrot it.
I have not seen anything useful come out of information theory, mostly just nonsense about information not being able to get lost in black holes, and exaggerated interpretations of entropy.
So my interest in information theory is near zero, because I discarded it as rubbish decades ago.
For one, password security theory that actually works (instead of just “use a special character”) is based on information theory and its concept of entropy.
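As a concrete example of that, here’s a rough sketch of the entropy calculation behind the xkcd passphrase advice; the 2048-word list and four-word passphrase are roughly the parameters the comic itself uses:

```python
import math

# Entropy (in bits) of a passphrase made of words picked uniformly at random
# from a wordlist: each word contributes log2(wordlist_size) bits.
def passphrase_entropy_bits(wordlist_size, num_words):
    return num_words * math.log2(wordlist_size)

# A 2048-word list and four random words give about 44 bits:
print(passphrase_entropy_bits(2048, 4))   # 44.0

# The comic's point is that the usual "word + l33t substitutions + digit"
# pattern is nowhere near uniformly random, so its entropy is much lower
# (the comic estimates roughly 28 bits) despite the special characters.
```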