Range and resolution of synaptic connections

In summary, researchers are still trying to determine the Shannon information limit for the human brain.
  • #1
schip666!
I've been around and around on the net, and had two neuro-physiologists repeatedly dodge the question, so I open it to the wider population...

When one speaks of learning, one uses the phrase "adjusting synaptic weights", meaning (I presume) changing the strength of a connection between two neurons. However, I have not been able to find a reasonable (or simply stated) estimate of how much these "weights" can be changed. I usually get sidetracked into discussions of whether it's timing or strength that is being changed, or how insects differ from humans, or some other seemingly more interesting topic, and never get to the range and resolution.

I wanted to know this in order to make a _very_ rough estimate of the Shannon Information content of the human brain for comparison -- if there could be such -- to some little robot cars that I'm building.

So does anyone know offhand how many bits of information are encoded in these putative "synaptic weights"? Or, alternatively, can you explain to a (somewhat) sophisticated layman why that is a stupid question to ask?
 
  • #2
It's not a stupid question at all. I haven't been able to find a definitive answer either. Below are two papers: one reports some measured synaptic weights, which gives you an idea of their range; the other describes an artificial neural network. Of the artificial neural networks I have seen, some use binary synaptic weights, which are the easiest to design but probably not very representative of biological networks. Others use some number of bits, implemented either in a digital circuit or with an analog-to-digital converter. I have seen 8 bits used for this but can't find that paper; the one below uses 4 bits. If you're just trying to make a rough estimate, 8 bits (256 levels) is probably a good figure.
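To make the "bits per weight" idea concrete, here is a minimal Python sketch of what quantizing a continuous synaptic weight to a fixed number of bits looks like. The 0-to-1 weight range and the 8-bit depth are illustrative assumptions on my part, not measured biological values.

[CODE]
import math

def quantize_weight(w, w_min=0.0, w_max=1.0, bits=8):
    """Map a continuous synaptic weight onto one of 2**bits discrete levels.

    The range [w_min, w_max] and the bit depth are illustrative assumptions;
    the usable range and number of distinguishable levels of a real synapse
    are exactly what this thread is asking about.
    """
    levels = 2 ** bits                         # 8 bits -> 256 levels
    w = min(max(w, w_min), w_max)              # clamp to the assumed dynamic range
    step = (w_max - w_min) / (levels - 1)      # size of one distinguishable step
    index = round((w - w_min) / step)          # nearest discrete level
    return w_min + index * step

# If all 256 levels were equally likely, one such weight would carry:
bits_per_synapse = math.log2(256)              # = 8.0 bits
print(quantize_weight(0.437), bits_per_synapse)
[/CODE]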
 

Attachments

  • Syn_Weights.pdf (585 KB)
  • ANN.pdf (113 KB)
  • #3
Oh dangit, now I have to perform work. Thanks for the papers. If I can understand that first one and follow some of the refs I might get a clue. Strange that this isn't on the tip of every researcher's tongue, no? 8 bits is such a convenient value that I may just use it with attribution to you...

I also had some difficulty getting a count of inputs and outputs. I finally came up with 150-175M inputs -- counting all the rods and cones; significantly less (1-25M) when just counting the eyes as units -- and about 800 muscles for output.
 
  • #4
Definitely not a stupid question, just a really tough one. I'd be interested in seeing your results. The reality is that a synapse functions based on the concentration of the relevant neurotransmitter, which is neither exact nor binary. You'll be working with approximations, but only because approximations are the best anyone can work with.

The kicker is that this route of inquiry may not yield the result you're looking for: the Shannon information limit for a given brain (human or otherwise). You can't simply look at the strength or number of connections in a given slice (an MRI slice or a real one) and say, "Aha, this represents a terabyte of capacity!" That is still an unknown, and therefore not a stupid thing to ask about at all.
 
  • #5
The Shannon information limit of the human brain is a direct function of the number of possible "brain states". How do you plan to evaluate the number of possible brain states (and the probability of each state assuming you can even define the states)? The processing capacity of the brain is thought to be around [tex]10^{16}[/tex] Hertz. How would you use this information even if you could quantify this "information limit"? The effective processing speed seems to be limited by certain structural features of brain architecture.

http://www.psy.vanderbilt.edu/faculty/marois/Publications/Marois_Ivanoff-2005.pdf
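To spell out the counting argument behind that first statement (purely as an illustration, under the strong assumptions that the brain states are equiprobable and fully described by synaptic weights): if the brain can occupy [tex]\Omega[/tex] distinguishable, equally likely states, the Shannon information is [tex]H = \log_2 \Omega[/tex] bits. If, further, a state were specified by [tex]N[/tex] synapses, each settable to [tex]2^{b}[/tex] levels, then

[tex]\Omega = \left(2^{b}\right)^{N} \quad\Rightarrow\quad H = \log_2 \Omega = N\,b \ \text{bits},[/tex]

so the whole estimate collapses to "synapse count times bits per synapse", which is why the range-and-resolution question in post #1 matters. Correlations and unequal state probabilities would only lower this figure, so it is an upper bound.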
 
  • #6
schip666! said:
and about 800 muscles for output.

Your output count is hugely low. Muscles don't fire as a unit -- motor units (a motor neuron and the fibers it drives) are recruited individually. This is how you control how strongly you pull on something.
 
  • #7
SW VandeCarr said:
The Shannon information limit of the human brain is a direct function of the number of possible "brain states". How do you plan to evaluate the number of possible brain states (and the probability of each state assuming you can even define the states)? The processing capacity of the brain is thought to be around [tex]10^{16}[/tex] Hertz. How would you use this information even if you could quantify this "information limit"? The effective processing speed seems to be limited by certain structural features of brain architecture.

http://www.psy.vanderbilt.edu/faculty/marois/Publications/Marois_Ivanoff-2005.pdf

...And all of this is subject to change if proponents of the human brain as a quantum computer find evidence to support that claim.
 
  • #8
I think the main point of my little exercise here is that the neuro folks don't seem to think the same way as the computer folks. As a software geek (and machinist) I think in terms of speeds and feeds: how much input, output, and processing in between. Neuroscientists don't seem to go about it that way, perhaps because we don't know enough yet. Information capacity would seem to be a useful benchmark, presuming it can be calculated with any accuracy.

SW VandeCarr said:
The Shannon information limit of the human brain is a direct function of the number of possible "brain states". How do you plan to evaluate the number of possible brain states (and the probability of each state assuming you can even define the states)? The processing capacity of the brain is thought to be around [tex]10^{16}[/tex] Hertz. How would you use this information even if you could quantify this "information limit"? The effective processing speed seems to be limited by certain structural features of brain architecture.

Not sure what Hertz has to do with processing capacity, but I came up with 5.6 petabits of state and 35 petaflops (which is, checking back, about 10^16) of processing.
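For reference, here is one back-of-envelope sketch (in Python) that lands on numbers of that order. The neuron count, synapses per neuron, bits per weight, and event rate are round figures I am assuming for illustration -- they are not necessarily the inputs behind the 5.6 petabit and 35 petaflop values above.

[CODE]
# Back-of-envelope "capacity" estimate; every input here is an assumption.
NEURONS             = 1e11   # ~10^11 neurons, a commonly quoted round figure
SYNAPSES_PER_NEURON = 7e3    # ~7000 synapses per neuron (assumed)
BITS_PER_WEIGHT     = 8      # 8-bit weights, per the discussion in post #2
EVENTS_PER_SECOND   = 50     # assumed average update/firing events per synapse

synapses   = NEURONS * SYNAPSES_PER_NEURON       # ~7e14 synapses
state_bits = synapses * BITS_PER_WEIGHT          # ~5.6e15 bits  (~5.6 petabits)
ops_per_s  = synapses * EVENTS_PER_SECOND        # ~3.5e16 ops/s (~35 peta-ops)

print(f"state: {state_bits:.1e} bits, throughput: {ops_per_s:.1e} ops/s")
[/CODE]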

This is what one would call a _way_rough_ estimate, perhaps within a few orders of magnitude of so-called reality... A lot more detail needs to go into the picture before a real range could be provided. One consideration is that there is a lot of redundancy in mammal brains -- one researcher suggested that insects would be a better comparison point as they have very little redundancy. This is interesting because the papers I've seen so far are still stabbing in the dark on processing estimates because they are stuck on size and scaling comparisons -- that could just be the papers I've found so far though.

Shannon information _is_, in essence, the (log of the) number of states a system can inhabit. It's an upper bound that says nothing about complexity or "meaning". That's a question for the test...

Your output count is hugely low. Muscles don't fire as a unit -- motor units (a motor neuron and the fibers it drives) are recruited individually. This is how you control how strongly you pull on something.
I agree. But, just as with the original question, I couldn't find _any_ data for the number of motor neurons themselves. Someone suggested using muscle count, but are muscles the only outputs? It depends on how you count muscles, I guess... I've also got counts for macaques and other higher mammals that are 50% below the 800 mark.

...And all of this is subject to change if proponents of the human brain as a quantum computer find evidence to support that claim.
I'm not so sure that quantum weirdness would change the picture, just the mechanism. But I have it on the unattributable authority of a Nobel physicist that Penrose and the quantum-microtubule folks are "cranks". This doesn't mean that QM neural effects are impossible, however... I just want to keep it simple for my simple brain.
 
  • #9
schip666! said:
I think the main point of my little exercise here is that the neuro folks don't seem to think the same way as the computer folks.

Not sure what Hertz has to do with processing capacity, but I came up with 5.6 petabits of state and 35 petaflops (which is, checking back, about 10^16) of processing.

I'm not so sure that quantum weirdness would change the picture, just the mechanism. But I have it on the unattributable authority of a Nobel physicist that Penrose and the quantum-microtubule folks are "cranks". This doesn't mean that QM neural effects are impossible, however... I just want to keep it simple for my simple brain.

I understand your point, and I'm not advocating the notion of quantum microtubules being the seat of consciousness, but as we find quantum behavior in biological processes such as photosynthesis, we have to consider where this kind of thing ends, or whether it does at all. It's not really a shortcoming of neurobiology that you've asked a question that can only be guesstimated; it's just the state of the science, like any other. I don't know that your output is meaningful, but who knows? As I said, it's just a mystery right now that is beyond probing with current imaging tools and other means of examination.
 

Frequently asked questions

1. What is the difference between range and resolution of synaptic connections?

In the context of this thread, the range of a synaptic connection is how far its strength (its "weight") can be varied, from the weakest to the strongest connection it can form, while the resolution is the smallest change in that strength that makes a functional difference. In other words, range is the span of possible weights and resolution is how many distinguishable levels fit inside that span.

2. How do the range and resolution of synaptic connections affect neuronal communication?

Both shape how neurons communicate and how much information a connection can carry: a wide range lets a single synapse exert anything from a negligible to a dominant influence on the downstream neuron, while a fine resolution allows small, graded adjustments during learning. Together they bound how many distinguishable states a network of such connections can occupy.

3. What factors determine the range and resolution of synaptic connections?

The strength of a synapse, and how finely it can be tuned, depend on factors such as how much neurotransmitter is released per impulse, how many receptors sit on the postsynaptic membrane and how sensitive they are, and how reliably transmitter is released at all. Because these quantities are noisy and analog, the effective resolution is set less by any hard quantization than by how many levels can be reliably distinguished above that noise.

4. Can the range and resolution of synaptic connections change?

Yes, the range and resolution of synaptic connections can change over time. This can occur through processes such as synaptic plasticity, where the strength and effectiveness of a connection can be altered based on the frequency and timing of signals. Additionally, the growth and development of new synapses can also change the range and resolution of synaptic connections.
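As a toy illustration of how plasticity operates within a finite range and resolution (the Hebbian-style rule, the 0-to-1 weight range, the learning rate, and the 8-bit quantization below are all simplifying assumptions, not a model of a real synapse):

[CODE]
def hebbian_update(w, pre, post, lr=0.01, w_min=0.0, w_max=1.0, bits=8):
    """Toy plasticity rule: strengthen the weight when pre- and post-synaptic
    activity coincide, then clamp it to its range and snap it to one of
    2**bits levels.  All parameters here are illustrative assumptions."""
    w = w + lr * pre * post                    # Hebbian term: co-activity strengthens the synapse
    w = min(max(w, w_min), w_max)              # plasticity cannot leave the weight's range
    levels = 2 ** bits - 1
    step = (w_max - w_min) / levels
    return w_min + round((w - w_min) / step) * step   # finite resolution

w = 0.5
for _ in range(20):                            # repeated co-activation
    w = hebbian_update(w, pre=1.0, post=1.0)
print(w)                                       # the weight climbs toward w_max in discrete steps
[/CODE]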

5. How do advances in technology impact our understanding of range and resolution of synaptic connections?

Advances in technology, such as electron microscopy and optogenetics, have allowed scientists to better visualize and manipulate synaptic connections, leading to a deeper understanding of their range and resolution. These technologies have also allowed for the discovery of new types of synaptic connections and have opened up avenues for further research in this field.
