Range and resolution of synaptic connections

  • Context: Medical
  • Thread starter: schip666!
  • Tags: Range, Resolution

Discussion Overview

The discussion revolves around the concept of synaptic weights in the context of learning and information encoding in the human brain. Participants explore the range and resolution of changes in synaptic weights, the implications for estimating the Shannon Information content of the brain, and comparisons with artificial neural networks. The scope includes theoretical considerations, exploratory reasoning, and technical challenges related to quantifying brain states and processing capacity.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant questions the extent to which synaptic weights can be adjusted and seeks estimates for their range.
  • Another participant provides references to papers that measure synaptic weights and suggests that 8 bits may be a reasonable estimate for encoding information in synaptic weights.
  • Concerns are raised about the complexity of synapse function, which is influenced by neurotransmitter concentration rather than being strictly binary.
  • Participants discuss the Shannon information limit of the brain, noting that it is related to the number of possible brain states and the challenges in defining and quantifying these states.
  • One participant mentions the processing capacity of the brain, estimated at around 10^16 Hz, and questions how this relates to information limits.
  • There is a discussion about the redundancy in mammalian brains compared to insects, suggesting that insects may provide a clearer comparison for processing estimates.
  • Some participants express uncertainty about the relevance of quantum mechanics to brain function, with differing opinions on the implications of quantum theories for understanding neural processes.
  • One participant reflects on the differences in thinking between neuroscientists and computer scientists regarding information capacity and processing metrics.

Areas of Agreement / Disagreement

Participants generally agree that the question of synaptic weights and information encoding is complex and not straightforward. There are multiple competing views regarding the quantification of brain states, the relevance of processing speed, and the implications of redundancy in brain architecture. The discussion remains unresolved with respect to definitive answers or consensus on these topics.

Contextual Notes

Limitations include the difficulty in accurately defining and quantifying brain states, the dependence on various assumptions regarding synaptic function, and the unresolved nature of how to relate processing capacity to information limits. The discussion also highlights the challenges in comparing biological and artificial systems.

schip666!
I've been around and around on the net, and had two neuro-physiologists repeatedly dodge the question, so I open it to the wider population...

When one speaks of learning, one uses the phrase "adjusting synaptic weights", meaning (I presume) changing the strength of a connection between two neurons. However, I have not been able to find a reasonable (or simply stated) estimate of how much these "weights" can be changed. I usually get sidetracked into discussions of whether it's timing or strength that is being changed, or how insects are different from humans, or some other seemingly more interesting topic, and never get to the range and resolution.

I wanted to know this in order to make a _very_ rough estimate of the Shannon Information content of the human brain for comparison -- if there could be such -- to some little robot cars that I'm building.

So does anyone know offhand how many bits of information are encoded in these putative "synaptic weights"? Or, alternately, can you explain to a (somewhat) sophisticated layman why that is a stupid question to ask?
 
It's not a stupid question at all. I haven't been able to find a definitive answer either. Below are two papers. One shows some measured synaptic weights, which gives you an idea of their range. The other describes an artificial neural network. Among the artificial neural networks I have seen, some use binary synaptic weights, which are easiest to design but probably not very representative of biological networks. Others use some number of bits in either a digital circuit or an analog-to-digital converter. I have seen 8 bits used for this, but can't find that paper. The one below uses 4 bits. If you're just trying to make a rough estimate, 8 bits (256 levels) is probably a reasonable figure.
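To make the arithmetic concrete, here is a minimal back-of-envelope sketch (an editorial addition, not from the thread) of what a fixed per-synapse bit budget implies for total storage. The synapse count is an assumed order-of-magnitude figure, not a measurement:

```python
# Back-of-envelope sketch: total storage implied by treating every synapse
# as an independent B-bit weight. Both constants below are assumptions.

SYNAPSE_COUNT = 1e14   # commonly quoted order-of-magnitude for a human brain
BITS_PER_WEIGHT = 8    # the 8-bit (256-level) guess discussed above

total_bits = SYNAPSE_COUNT * BITS_PER_WEIGHT
print(f"~{total_bits:.1e} bits (~{total_bits / 8e15:.2f} petabytes)")
```

With these assumptions the answer is ~8e14 bits (about 0.1 petabyte); the point is only that the estimate scales linearly in both constants, so the uncertainty in the synapse count dominates.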
 

Attachments

Oh dangit, now I have to perform work. Thanks for the papers. If I can understand that first one and follow some of the refs I might get a clue. Strange that this isn't on the tip of every researcher's tongue, no? 8 bits is such a convenient value that I may just use it with attribution to you...

I also had some difficulty getting a count of inputs and outputs, finally came up with 150-175M inputs -- counting all the rods and cones, significantly less (1-25M) when just counting the eyes as units -- and about 800 muscles for output.
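For scale, here is an editorial sketch of the raw I/O bandwidth these channel counts would imply. Every rate and bit depth below is an assumption layered on the counts above:

```python
# Editor's illustrative sketch: raw I/O bandwidth implied by the channel
# counts above. Every rate and bit depth here is an assumption, not data.

INPUT_CHANNELS = 160e6    # midpoint of the 150-175M rod/cone count above
OUTPUT_CHANNELS = 800     # the muscle count used as a stand-in for outputs
SAMPLE_RATE_HZ = 100      # assumed effective update rate per channel
BITS_PER_SAMPLE = 8       # reusing the thread's 8-bit resolution guess

input_bps = INPUT_CHANNELS * SAMPLE_RATE_HZ * BITS_PER_SAMPLE    # ~1.3e11
output_bps = OUTPUT_CHANNELS * SAMPLE_RATE_HZ * BITS_PER_SAMPLE  # ~6.4e5

print(f"input ~{input_bps:.1e} bit/s, output ~{output_bps:.1e} bit/s")
```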
 
Definitely not a stupid question, just a really tough one. I'd be interested in seeing your results. The reality is that a synapse functions based on the concentration of the relevant neurotransmitter, which is neither exact nor binary. You'll be working with approximations, but only because approximations are the best anyone can do here.

The kicker is, this route of inquiry may not yield the result you're looking for: the Shannon Information limit for a given brain (human or otherwise). You can't simply look at the strength or number of connections in a given slice (MRI slice or real) and say, "aha, this represents a terabyte capacity!" That is still an unknown, and therefore not stupid to ask at all.
 
The Shannon information limit of the human brain is a direct function of the number of possible "brain states". How do you plan to evaluate the number of possible brain states (and the probability of each state, assuming you can even define the states)? The processing capacity of the brain is thought to be around 10^16 Hz. How would you use this information even if you could quantify this "information limit"? The effective processing speed seems to be limited by certain structural features of brain architecture.

http://www.psy.vanderbilt.edu/faculty/marois/Publications/Marois_Ivanoff-2005.pdf
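For reference, the textbook relation behind this "number of possible brain states" framing (an editorial addition, not part of the original post): information content is the probability-weighted log of the state count, and it is maximized when all states are equally likely.

```latex
% Shannon entropy of a system with N possible states, where state i
% occurs with probability p_i. The upper bound log2(N) is attained only
% when all states are equiprobable -- hence the question above about the
% probability of each state, not just their raw number.
H \;=\; -\sum_{i=1}^{N} p_i \log_2 p_i \;\le\; \log_2 N \quad\text{[bits]}
```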
 
schip666! said:
and about 800 muscles for output.

Your output count is hugely low. Muscles don't fire as a unit - each muscle fiber can be fired individually. This is how you control how strongly you pull on something.
 
SW VandeCarr said:
The Shannon information limit of the human brain is a direct function of the number of possible "brain states". How do you plan to evaluate the number of possible brain states (and the probability of each state, assuming you can even define the states)? The processing capacity of the brain is thought to be around 10^16 Hz. How would you use this information even if you could quantify this "information limit"? The effective processing speed seems to be limited by certain structural features of brain architecture.

http://www.psy.vanderbilt.edu/faculty/marois/Publications/Marois_Ivanoff-2005.pdf

...And all of this is subject to change if proponents of the human brain as a quantum computer find evidence to support that claim.
 
I think the main point of my little exercise here is that the neuro folks don't seem to think the same way as the computer folks. As a software geek (and machinist) I think in terms of speeds-and-feeds: how much input, how much output, and how much processing in between. Neuroscientists don't seem to go about it that way, perhaps because we don't know enough yet. Information capacity would seem to be a useful benchmark, presuming it can be calculated with any accuracy.

SW VandeCarr said:
The processing capacity of the brain is thought to be around 10^16 Hz. How would you use this information even if you could quantify this "information limit"?

Not sure what Hertz has to do with processing capacity, but I came up with 5.6 petabits of state and 35 petaflops (which is, checking back, ~10^16) of processing (see the reconstruction sketch after this post). Showing my work at: [URL]

This is what one would call a _way_rough_ estimate, perhaps within a few orders of magnitude of so-called reality... A lot more detail needs to go into the picture before a real range could be provided. One consideration is that there is a lot of redundancy in mammalian brains -- one researcher suggested that insects would be a better comparison point, as they have very little redundancy. This is interesting because the papers I've seen so far are still stabbing in the dark on processing estimates, stuck as they are on size and scaling comparisons -- though that could just be the papers I've found so far.

Shannon Information _is_ (the logarithm of) the number of states a system can inhabit. It's an upper bound that says nothing about complexity and "meaning". That's a question for the test...

Your output count is hugely low. Muscles don't fire as a unit - each muscle fiber can be fired individually. This is how you control how strongly you pull on something.
I agree. But, just as with the original question, I couldn't find _any_ data for the number of motor-neurons themselves. Someone suggested using muscle count, but are muscles the only outputs? Depends on how you count muscles I guess... I've also got counts for macaques and other higher mammals that are 50% below the 800 mark.

...And all of this is subject to change if proponents of the human brain as a quantum computer find evidence to support that claim.
I'm not so sure that quantum weirdness would change the picture, just the mechanism. But I have it on the un-attributable authority of a Nobel physicist that Penrose and the quantum tubule folks are "cranks". This doesn't mean that QM neural effects are not possible however... I just want to keep it simple for my simple brain.
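Below is a purely hypothetical reconstruction (an editorial sketch, not schip666!'s actual worksheet) showing one parameter set under which figures of this magnitude fall out. Every constant here is an assumption:

```python
# Hypothetical parameter set (editor's assumptions) that reproduces the
# 5.6-petabit / 35-petaflop magnitudes quoted above.

NEURONS = 8.75e10           # ~10^11 neurons, a commonly quoted rough figure
SYNAPSES_PER_NEURON = 8000  # assumed average fan-in per neuron
BITS_PER_SYNAPSE = 8        # the 8-bit resolution guess from earlier
UPDATE_RATE_HZ = 50         # assumed average operations per synapse per second

synapses = NEURONS * SYNAPSES_PER_NEURON      # 7.0e14 synapses
state_bits = synapses * BITS_PER_SYNAPSE      # 5.6e15 bits = 5.6 petabits
ops_per_second = synapses * UPDATE_RATE_HZ    # 3.5e16 ~ 35 peta-ops/s

print(f"{state_bits / 1e15:.1f} petabits, "
      f"{ops_per_second / 1e15:.0f} peta-ops/s")
```

The arithmetic illustrates the point made above: the state estimate and the processing estimate differ only by which multiplier you apply to the same highly uncertain synapse count.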
 
schip666! said:
I'm not so sure that quantum weirdness would change the picture, just the mechanism. But I have it on the un-attributable authority of a Nobel physicist that Penrose and the quantum tubule folks are "cranks". This doesn't mean that QM neural effects are not possible however... I just want to keep it simple for my simple brain.

I understand your point, and I'm not advocating the notion of quantum microtubules being the seat of consciousness, but as we find quantum behavior in such biological processes as photosynthesis, we have to consider where this kind of thing ends, or if it does at all. It's not really a shortcoming of neurobiology that you've asked a question that can only be guesstimated; it's just the state of the science, like any other. I don't know that your output is meaningful, but who knows? As I said, it's just a mystery right now, beyond probing with current imaging tools and other means of examination.
 
