Scientists Calculate the Speed of Thought

  • Thread starter jedishrfu
  • #1
https://www.technologynetworks.com/...-quantified-the-speed-of-human-thought-394395

Caltech researchers have quantified the speed of human thought: a rate of 10 bits per second. However, our bodies' sensory systems gather data about our environments at a rate of a billion bits per second, which is 100 million times faster than our thought processes. This new study raises major new avenues of exploration for neuroscientists, in particular: Why can we only think one thing at a time while our sensory systems process thousands of inputs at once?

The research was conducted in the laboratory of Markus Meister (PhD '87), the Anne P. and Benjamin F. Biaggini Professor of Biological Sciences, and it was led by graduate student Jieyu Zheng. A paper describing the study appears in the journal Neuron on December 17.

A bit is a basic unit of information in computing. A typical Wi-Fi connection, for example, can process 50 million bits per second. In the new study, Zheng applied techniques from the field of information theory to a vast amount of scientific literature on human behaviors such as reading and writing, playing video games, and solving Rubik's Cubes, to calculate that humans think at a speed of 10 bits per second.

So there you have it! We humans think at 10 bits per second, slower than a slow-moving locomotive, slower than a 1960s Model 30 teletype, and slower than molasses. It's roughly the time it takes to print one letter of your name.

A curious question: Is there a correlation between higher IQ and faster thinking? If so, how much?
 
  • Skeptical
  • Like
Likes Barbarian, dwarde and PeroK
  • #2
jedishrfu said:
We humans think at 10 bits per second
I read the quote but not the article (yet). Can you give the cliff notes on how they got to 10b/s? That seems way low to me...
 
  • Like
Likes Filip Larsen and phinds
  • #3
  • Like
Likes jedishrfu
  • #4
berkeman said:
I read the quote but not the article (yet). Can you give the cliff notes on how they got to 10b/s? That seems way low to me...
I would think that even when we're asleep, our brains process at a faster rate than that. Maybe their test subjects were comatose.
 
  • #5
They must be using an unusual definition of a bit.
 
  • Like
Likes Filip Larsen
  • #6
This sounds a lot like "how fast can a human type out random letters read from a screen".

Or "how fast can a human translate human speech to another language if we reduce the raw bit rate based on a good compression algorithm".

I can type at about 10 bits per second. But that is with English text. So only about 1 bit per second after compression.
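
Here's a rough way to see what compression does to English (a sketch with Python's zlib; a general-purpose compressor won't reach the ~1 bit per character that better statistical models manage, and a short sample overstates the figure, but it shows the direction):

```python
import zlib

# Sketch: estimate how many bits per character survive after a
# general-purpose compressor removes the redundancy of English text.
# zlib typically lands around 2-4 bits/char on ordinary prose; better
# statistical models approach Shannon's estimate of ~1 bit/char.
# (This sample is short, so the estimate comes out on the high side.)
sample = (
    "This sounds a lot like how fast can a human type out random "
    "letters read from a screen, or how fast can a human translate "
    "speech to another language if we reduce the raw bit rate based "
    "on a good compression algorithm. Typing English text is far "
    "more predictable than typing random characters, which is why "
    "compression shrinks it so much."
).encode("ascii")

raw_bits = len(sample) * 8
compressed_bits = len(zlib.compress(sample, 9)) * 8
bits_per_char = compressed_bits / len(sample)

print(f"raw: {raw_bits} bits, compressed: {compressed_bits} bits")
print(f"~{bits_per_char:.1f} bits per character after compression")
```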
 
  • #7
jedishrfu said:
So there you have it! We humans think at 10 bits per second, slower than a slow-moving locomotive, slower than a 1960s Model 30 teletype, and slower than molasses. It's likely about the same as it takes to print one letter of your name.
Faster than me! :biggrin:
Love and Peace
Marcos
 
  • Like
Likes jedishrfu
  • #8
That bit rate seems ridiculously low. For example, as a competent amateur pianist and accompanist I can read, understand and play music at many notes per second in both hands, including shaping the music and making adjustments in response to other players in the ensemble. Are they saying that doesn't involve thinking?
 
  • Like
Likes BillTre, berkeman and PeroK
  • #9
We can absorb millions of bits of information through our senses but can only process them at a very slow rate.
 
  • Skeptical
Likes PeroK
  • #10
That must be a very weird definition of "thought". Activities like tennis, dancing or playing a musical instrument surely require very much higher bit rates, and involve making complex decisions very rapidly.

For a keyboard musical instrument, one can even get a good idea of the number of bits necessary to represent the performance by generating a MIDI file; it's something like 10 bytes per note (including start and stop timing information, pitch and velocity). It could admittedly be compressed somewhat, and when I'm sight-reading I can't handle as much detail as in stuff I've mostly memorised, but the music currently open on my piano (Mendelssohn cello sonata 2, first movement) frequently has 42 notes per second (as 6 chords of 7 notes) or 24 per second as sequences of 12 in each hand. There is some "pattern recognition" going on as many sequences of notes follow familiar patterns, but there are so many possible patterns that they would still need many bits to represent them.
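
As a sanity check on those numbers (using the 10 bytes per note and 42 notes per second from above; everything else is just arithmetic):

```python
# Sanity check on the piano figures above (from the post, not measured):
# ~10 bytes per note in a MIDI-style encoding, 42 notes per second in
# the densest passages.
bytes_per_note = 10
notes_per_second = 42

raw_rate = bytes_per_note * 8 * notes_per_second
print(f"raw rate: {raw_rate} bits/s")                        # 3360 bits/s

# Even if familiar patterns let you compress this by a factor of 10,
# you are still well above 10 bits/s.
print(f"after 10x compression: {raw_rate // 10} bits/s")     # 336 bits/s
```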

I suspect that whatever experiments were conducted actually measured something more equivalent in computer terms to latency. When we focus on a stream of information, there is a delay in processing it but the processing rate is very high. However, when the information is not provided until after the previous processing is complete, the processing time for each piece of information is extended by that delay, so the apparent rate is very much lower.
 
  • Like
Likes PeroK
  • #11
jedishrfu said:
We can absorb millions of bits of information through our senses but can only process them at a very slow rate.
This sounds like nonsense to me. As mentioned above, it would make reading music and playing the piano practically impossible. How could you do this at 10 bits per second? I guess you could only read a binary number at 10 bits per second, but with 26 letters, 26 capitals, numbers, punctuation, it would take you one second to read a single character. It makes no sense. It just sounds like a BS number to me.
 
  • Like
Likes BillTre
  • #12
PeroK said:
This sounds like nonsense to me. As mentioned above, it would make reading music and playing the piano practically impossible. How could you do this at 10 bits per second? I guess you could only read a binary number at 10 bits per second, but with 26 letters, 26 capitals, numbers, punctuation, it would take you one second to read a single character. It makes no sense. It just sounds like a BS number to me.

They give several examples in the paper. One is for an advanced typist producing 120 words per minute. They calculate that to be 10 bits/second after accounting for the redundancy in language. Other examples result in slightly higher numbers but not by much. They're calculating the information that's on the page. If you did the same with a piano, what would the bit rate be?
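
Spelled out, roughly (the 5 characters per word is the usual typing convention, and the 1 bit per character is the post-redundancy figure the paper uses, as I read it):

```python
# The typist example from the paper, as plain arithmetic.
words_per_minute = 120
chars_per_word = 5       # usual typing convention
bits_per_char = 1.0      # post-redundancy estimate the paper uses for English

chars_per_second = words_per_minute * chars_per_word / 60    # 10 chars/s
print(f"~{chars_per_second * bits_per_char:.0f} bits/s")     # ~10 bits/s
```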

I think this is an oversimplification of what is going on in the human mind when typing or playing the piano. I can't speak for anyone else but when I'm typing (like right now) I'm also thinking of other things in a less focused way. I'm feeling things, both physically and emotionally. I'm ill equipped to calculate the bit rates of those mental experiences. Maybe these guys are too.

Or maybe they are right in a way. I can certainly believe that conscious thought is a fairly limited, focused thing. Still, there's a lot going on under the hood.
 
  • Like
Likes BillTre
  • #13
JT Smith said:
They give several examples in the paper. One is for an advanced typist producing 120 words per minute. They calculate that to be 10 bits/second after accounting for the redundancy in language. Other examples result in slightly higher numbers but not by much. They're calculating the information that's on the page. If you did the same with a piano, what would the bit rate be?
Perhaps the output is 10 bits per second, but that can't be the processing speed. It can't be possible to answer general knowledge questions without a sophisticated search and retrieval system. Let alone do complex non-repetitive tasks. The speed of input/output may be 10 bits per second, but it seems obvious that the total processing speed must be much higher. Theirs seems to be a calculation of useful information being produced. But, the total amount of thinking must include all the processing, trial and error, search, retrieval, comparison etc. needed to produce that information.

For example, I was given some trivia questions recently. The first question was "what is the only station on the London underground with a 'z' in its name". I answered "Belsize Park" almost before the question was complete. Okay, "Belsize Park" is perhaps only 10 bits or so. But, how on Earth did I get that answer at 10 bits per second? The authors seem to imagine I just read it off an answer card in front of me.

The paper is nonsense, IMHO.
 
  • Like
Likes BillTre and berkeman
  • #14
PeroK said:
Perhaps the output is 10 bits per second, but that can't be the processing speed...

...The paper is nonsense, IMHO.

I don't disagree. But I think there is a point to be made about a certain type of thought. We are pretty limited in some ways.
 
  • #15
From what I have seen and read, we don't process information at a fixed speed.

I remember reading about one experiment where subjects wore a headset and then were dropped in a bungy jump. While they were falling they were able to mentally process more images shown on the headset than when not falling. The stress was said to allow them to process information more quickly. I couldn't find that particular example but did find some seemingly related information.

When we’re put under pressure, our brains can suddenly process information much faster – but only in certain situations, says neuroscientist Tali Sharot.
Some of the most important decisions you will make in your lifetime will occur while you feel stressed and anxious.

https://www.bbc.com/future/article/20180613-why-stressed-minds-are-better-at-processing-things


Abstract

The present study tested the intrinsic ERP features of the effects of acute psychological stress on speed perception. A mental arithmetic task was used to induce acute psychological stress, and the light spot task was used to evaluate speed perception. Compared with judgments in the constant speed and uniform acceleration motion, judgments in the uniform deceleration motion were made more quickly and with higher accuracy; attention control was higher and peaked later; and there was longer N2 peak latency, larger N2 peak amplitude, and lower mean amplitude of the late negative slow wave (SW). Under stress, the reaction time was significantly shorter. The N2 peak amplitude and SW mean amplitude were significantly higher, attention control was higher and appeared earlier, and there was a greater investment of cognitive resources. The type of movement and evoked stress also interacted to predict behavioral and ERP measures. Under acute stress, judgments made in the uniform deceleration motion condition elicited lower N2 peak latency, higher attention control, and later peak attention. The results suggest that judgments of the speed of decelerating motion require a lower investment of cognitive resources than judgments of other kinds of motion, especially under acute stress. These findings are best interpreted in terms of the interaction of arousal and attention.
https://pmc.ncbi.nlm.nih.gov/articles/PMC10046828/
 
  • #16
I think it’s BS. We can look at a multi-megapixel image and recognize multiple elements in it within milliseconds. If a pixel is encoded with just 16 bits, we’re talking about 10^8 bps at least.
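
Back-of-the-envelope, with purely illustrative inputs (a 2-megapixel image, 16 bits per pixel, a 100 ms glance):

```python
# Illustrative numbers only: a modest "multi-megapixel" image at
# 16 bits per pixel, recognized in roughly a tenth of a second.
pixels = 2_000_000
bits_per_pixel = 16
glance_seconds = 0.1

raw_input_rate = pixels * bits_per_pixel / glance_seconds
print(f"raw sensory input: {raw_input_rate:.1e} bits/s")   # ~3.2e+08 bits/s
```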

Another example: a driver entering an intersection sees a car run a red light and begin to cross in front of her. She analyzes the situation (car! Illegal!), figures out the consequences (danger!), decides on a course of action (brake hard and swerve to the right, not left, because of oncoming traffic), and initiates action (lifting her foot from the accelerator to the brake pedal and turning the steering wheel). An alert driver does this in 1/3 to 1/2 second. How many megabits per second do you think that is? Or is it Tb/s?
 
Last edited:
  • Like
Likes berkeman, PeroK and BillTre
  • #17
marcusl said:
We can look at a multi-megapixel image and recognize multiple elements in it within milliseconds. If a pixel is encoded with just 16 bits, we’re talking about 10^8 bps at least.

Do you really think you are capable of perceiving 16 bits per pixel? Even for just one pixel that's hard to believe never mind thousands of them. A jpeg reduces the number of bits via compression without my noticing much difference, if any.

Here's what the authors of the paper had to say about visual processing:

Many of us feel that the visual scene we experience, even from a glance, contains vivid details everywhere. The image feels sharp and full of color and fine contrast. If all these details enter the brain, then the acquisition rate must be much higher than 10 bits/s.

However, this is an illusion, called "subjective inflation" in the technical jargon. People feel that the visual scene is sharp and colorful even far in the periphery because in normal life we can just point our eyes there and see vivid structure. In reality, a few degrees away from the center of gaze our resolution for spatial and color detail drops off drastically, owing in large part to neural circuits of the retina. You can confirm this while reading this paper: fix your eye on one letter and ask how many letters on each side you can still recognize. Another popular test is to have the guests at a dinner party close their eyes and then ask them to recount the scene they just experienced. These tests indicate that beyond our focused attention, our capacity to perceive and retain visual information is severely limited, to the extent of "inattentional blindness."

I had an experience the other day that falls into this category. I encountered two people and immediately recognized one but not the other, who was wearing an N95 mask. Thirty seconds later I looked at her again and of course I knew who she was! The mask didn't hide her identity from me at all. I was sure I had looked at her initially, but apparently my attention was not focused enough beyond simply seeing a masked person. Maybe this is a bit like the experiment where a gorilla walks through the scene and is unnoticed by a large percentage of viewers.
 
  • Like
  • Skeptical
Likes BillTre and jbriggs444
  • #18
JT Smith said:
Do you really think you are capable of perceiving 16 bits per pixel? Even for just one pixel that's hard to believe never mind thousands of them. A jpeg reduces the number of bits via compression without my noticing much difference, if any.

Here's what the authors of the paper had to say about visual processing:



I had an experience the other day that falls into this category. I encountered two people and immediately recognized one but not the other, who was wearing an N95 mask. Thirty seconds later I looked at her again and of course I knew who she was! The mask didn't hide her identity from me at all. I was sure I had looked at her initially, but apparently my attention was not focused enough beyond simply seeing a masked person. Maybe this is a bit like the experiment where a gorilla walks through the scene and is unnoticed by a large percentage of viewers.
There are 26 letters in the alphabet, 52 if you count capitals. Plus 10 numeric digits. That's about 64 possible characters. That is a minimum of six bits.

If I show you a single character on a piece of paper, you must take at least 0.6 seconds simply to read the character? That's what you are saying? Processing the input must be separate from the input itself. As must preparing the output (the sound that that character represents).

At 10 bits per second, it would take you at least 1.8 seconds to read a single character and speak it.
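
In code, for the sake of argument (assuming roughly 64 possible symbols and the paper's 10 bits per second):

```python
import math

# The arithmetic above: ~64 possible symbols, read/processed/spoken
# at the paper's 10 bits per second.
n_symbols = 64
bits_per_char = math.log2(n_symbols)   # 6 bits
rate = 10                              # bits/s

print(f"just to read one character: {bits_per_char / rate:.1f} s")            # 0.6 s
print(f"read + process + prepare output: {3 * bits_per_char / rate:.1f} s")   # 1.8 s
```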

My guess is you could do that in 0.1-0.2 seconds. Although, obviously, the physical movement of muscles and rendering of a sound by your lips and tongue is additional to any "thought". I would say there is also non-thinking reaction time.

The point about language being condensed and not needing as many characters as are written doesn't explain our ability to process single characters in less than a second.

The whole thing is nonsensical.
 
  • Like
Likes BillTre
  • #19
PS we can process characters in many fonts and sizes and colours. It can't possibly be six bits per character. If it were, how could we identify the colour, size, etc.? If you showed someone a huge capital, green A on a piece of paper, they would almost instantly recognise it and be able to say "large, green, upper-case A" in very little time.

If we could only think at 10 bits per second, that would take at least 5 seconds, not counting the time it takes to physically make the sounds.

The whole thing is totally absurd.
 
  • Like
Likes BillTre
  • #20
PPS this analysis hasn't even taken into account how difficult it is to recognise non-binary input. That is a highly non-trivial task. We are not just processing a neat, binary or hexadecimal input like a computer. We are processing fuzzy data, with all the processing power that requires.
 
  • #21
PeroK said:
The whole thing is nonsensical.
I do not agree that the whole thing is nonsensical.

One thing that is nonsensical is reducing the processing power of the brain to a single number and then using that number as if it correctly quantifies every aspect of that processing.

This is similar in concept to benchmarking a computer. More than just a single number is relevant. Memory size, memory latency, video, cache size, CPU type. Different applications will have different performance figures.
 
  • #22
JT Smith said:
Maybe this is a bit like the experiment where a gorilla walks through the scene and is unnoticed by a large percentage of viewers.
I like this observation. I have long felt that much of the human ability to get through everyday life is based on the ability to ignore huge swaths of input data while recognizing a little bit of relevant input.

Like the male ability to ignore dirt or the sound of the wife asking for trash to be taken out.
 
Last edited:
  • #23
I found this explanation of the number

"To understand the concept of information, it is essential to differentiate it from that of data. Let's take an example. I have a friend who has just given birth. I send her a text message to ask her the sex of the newborn. In my vision of things, there is an equal chance that it will be a boy or a girl. Her response will therefore send me exactly one Shannon."

That assumes that I can only process "boy" or "girl" as an answer. Like a computer input screen, that rejects anything that isn't M/F for a given field. I may be expecting "boy" or "girl", but I could cope with any answer. For example, if they answered "7 lbs 3 ounces", I see immediately that they have given me the weight of the baby. How is that 1 bit of information?
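
For what it's worth, the standard Shannon calculation only gives 1 bit if the answer space really is two equally likely outcomes. A sketch (the 1-in-1000 prior for the unexpected reply is purely an illustrative assumption):

```python
import math

def surprisal_bits(p: float) -> float:
    """Shannon information content, in bits, of an outcome with probability p."""
    return -math.log2(p)

# Two equally likely answers ("boy" / "girl") carry exactly 1 bit each.
print(surprisal_bits(0.5))        # 1.0

# But if the answer space is open and a reply like "7 lbs 3 ounces"
# had, say, a 1-in-1000 prior probability, it carries ~10 bits.
print(surprisal_bits(1 / 1000))   # ~9.97
```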

And that's the whole argument undermined. A computer system generally requires input in a pre-defined format, where there is no flexibility. If humans processed information like that, communication would be impossible. We'd have to agree interface specifications in advance of having a simple conversation.

I used to use this very idea to explain why interoperability of computer systems was so laborious and time-consuming. I used to point out that humans can exchange information with an almost unlimited flexibility of format, whereas a computer interface is hard-wired. And that's where our information processing power goes: on the flexibility and capability of managing fuzziness.

And fuzziness is information, IMHO. The lack of fixed syntax and format must increase the information content. A hand-written word is not a simple set of well-defined characters, but contains far more information - which we can and do process. That it may not be useful information is another matter altogether.
 
Last edited:
  • Like
Likes BillTre
  • #24
... and that's why thought is not 10 bits per second. Because, thought includes processing the fuzzy information as well as the core information.
 
  • Like
Likes BillTre
  • #25
PeroK said:
There are 26 letters in the alphabet, 52 if you count capitals. Plus 10 numeric digits. That's about 64 possible characters. That is a minimum of six bits.

If I show you a single character on a piece of paper, you must take at least 0.6 seconds simply to read the character? That's what you are saying? Processing the input must be separate from the input itself. As must preparing the output (the sound that that character represents).

I'm not saying that. I'm trying not to simply dismiss out of hand the conclusions of this paper without having actually read it.

Counting bits is tricky. When it comes to letters we usually encounter them in context. The authors of the paper calculate that in the context of reading English each character is only 1 bit. If instead of a grammatical string of words you have an unbroken stream of random characters the rate at which one can absorb that is greatly reduced.
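
A rough way to see the gap between letters with and without context (a sketch; the short sample text and the 64-symbol figure are just for illustration):

```python
import math
from collections import Counter

# Letter-frequency entropy of English: with no context, each letter
# carries roughly 4 bits (and a uniformly random character from a
# ~64-symbol set carries 6); Shannon's classic estimate with full
# context is about 1 bit per character, the figure the paper uses.
text = (
    "counting bits is tricky because we usually meet letters in context "
    "the redundancy of english is what lets a reader absorb grammatical "
    "text far faster than an unbroken stream of random characters"
)
letters = [c for c in text if c.isalpha()]
counts = Counter(letters)
total = len(letters)

entropy = -sum((n / total) * math.log2(n / total) for n in counts.values())
print(f"letter-frequency entropy of this sample: ~{entropy:.1f} bits/letter")
print(f"uniform over 64 symbols: {math.log2(64):.0f} bits/symbol")
```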

Still, there's a lot going on that this analysis glosses over. Maybe I just don't like being told I'm slow.
 
  • #26
JT Smith said:
Counting bits is tricky. When it comes to letters we usually encounter them in context. The authors of the paper calculate that in the context of reading English each character is only 1 bit.
I'm not sure about the relevance of that to the "speed of thought". An analogy would be if you ask a chess player (or a chess engine) to decide between two candidate moves. The player or engine takes a minute, say, to decide on Ne5 over Ng5. That's a binary question with a binary answer. The conclusion would be that a chess engine operates at 1 bit per minute? That makes no sense. Thought is about processing data to produce information.

Say you are presented with a page of text, about 500 words, which you read, analyse and form opinions about in two minutes. What does it matter that hypothetically that could be coded into 500 bits (roughly 60 bytes)? That's irrelevant. When we read a page of text, we may form many thoughts about it: its spelling and grammar, the style, whether we agree with it; we recognise quotations, references or allusions, etc. That's thinking. If we are only transcribing random characters, then we are essentially no longer thinking; we are trying to imitate a facsimile machine. And perhaps my speed as a facsimile machine is 10 bits per second. But that's not "thinking"!
JT Smith said:
If instead of a grammatical string of words you have an unbroken stream of random characters the rate at which one can absorb that is greatly reduced.
I agree. But there are over a hundred characters that I can recognise instantly. The problem with a string of random characters, I suggest, is that we are trying a pattern-recognition algorithm that doesn't work. We waste a lot of information processing time looking for what's not there. If, instead, each character were written separately so that we are not fooled into instinctively using a wasteful algorithm, then I could read random characters far faster than one per second. Have you ever taken an eye test? That's exactly what you do. The proof of this is also in reading telephone numbers and credit card numbers. The latter are separated into groups of four, I suggest, for precisely this reason: to avoid the visual confusion of too many numbers being written together.

Here's a quotation from one of the authors:

"In particular, our peripheral nervous system is capable of absorbing information from the environment at much higher rates, on the order of gigabits/s," the team writes. "This defines a paradox: The vast gulf between the tiny information throughput of human behavior, and the huge information inputs on which the behavior is based. This enormous ratio – about 100,000,000 – remains largely unexplained."

That shows that they are comparing two extremely different things. We can argue about the 10 bits per second, but what is it here that is actually puzzling or paradoxical? If we go back to the chess engine, is it puzzling that it assesses a billion possible combinations of moves just to produce one move? Everyone recognises that assessing the billion positions is what constitutes the engine's ability. With humans, it's the information processing that is thought, not the simple output in terms of a "yes" or "no".
 
Last edited:
  • Like
Likes BillTre
  • #27
The title of the article itself is very non-sciency: The unbearable slowness of being: Why do we live at 10 bits/s?
I can't take it seriously :(
 
  • Like
Likes BillTre, mcastillo356 and PeroK
  • #28
Hi, PF, I don't know if it suits the thread, but in my language (Basque), to express that someone is intelligent we use the words "bizkor, azkar...", which mean fast, quick... But in my personal opinion the speed of thought goes from almost zero to nobody knows. Could this article refer to some statistical average? Or might it refer to some... I don't have a clue. The only thing I agree with is the concept of a bit. If I want to talk about input or output of information, no matter the context, bit is the keyword.
Greetings
Marcos, Speedy Gonzales for friends:oldlaugh:
 
  • Like
Likes PeroK
