What are the scientific tests for measuring brain memory capacity?

In summary, phyzguy argues that if memories are stored as synaptic weights, a rough estimate of the brain's storage capacity can be made from the number of synapses. Others counter that, because many other processes are going on in the brain, such a figure only tells us how much memory it would take to record all the synapses, not the brain's actual memory capacity.
  • #1
MathematicalPhysicist
Gold Member
What scientific tests can be done to measure my (or anyone else's) memory capacity?
And how do you measure the brain's memory capacity? What units do you use? Is it like in a PC, with RAM or SDRAM or whatnot?
 
  • #2
MathematicalPhysicist said:
What scientific tests can be done to measure my (or anyone else's) memory capacity?
And how do you measure the brain's memory capacity? What units do you use? Is it like in a PC, with RAM or SDRAM or whatnot?

I'm afraid that we do not have a comprehensive view of how the brain stores memories. Commonly we talk about working memory (http://en.wikipedia.org/wiki/Working_memory), but without an understanding of the physical mechanism we can't make a good comparison to our technological forms of memory.
 
  • #3
While we don't know for sure, it seems likely that memories in the brain are stored in the changing synaptic weights of the synapses which connect the neurons together. If we accept this, we can make a rough estimate of the brain's memory capacity. There appear to be ~10^11 neurons in the brain, and each neuron has ~10^4 synaptic connections. Let's assume that each synaptic weight is an analog variable with at most 10 bits of precision (1024 discrete levels). Then the total memory capacity would be 10^16 bits, which is about 1.25 petabytes (roughly 1000 terabytes). This is probably correct to within 1-2 orders of magnitude.
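For concreteness, here is that arithmetic spelled out as a small Python sketch. The neuron count, synapses per neuron, and 10-bit precision are the assumptions stated above, not measured values, so the output is only an order-of-magnitude figure.

Code:
# Back-of-envelope estimate of brain storage capacity, assuming (as above)
# that memory lives entirely in the synaptic weights. All three inputs are
# rough assumptions, not measurements.
NEURONS = 1e11                # ~10^11 neurons in the human brain
SYNAPSES_PER_NEURON = 1e4     # ~10^4 synapses per neuron
BITS_PER_SYNAPSE = 10         # assume each weight resolves ~1024 levels

total_bits = NEURONS * SYNAPSES_PER_NEURON * BITS_PER_SYNAPSE
total_terabytes = total_bits / 8 / 1e12

print(f"total synapses : {NEURONS * SYNAPSES_PER_NEURON:.0e}")
print(f"capacity       : {total_bits:.0e} bits = {total_terabytes:.0f} TB")
# -> 1e+16 bits = 1250 TB, i.e. on the order of a petabyte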
 
  • #4
phyzguy said:
While we don't know for sure, it seems likely that memories in the brain are stored in the changing synaptic weights of the synapses which connect the neurons together. If we accept this, we can make a rough estimate of the brain's memory capacity. There appear to be ~10^11 neurons in the brain, and each neuron has ~10^4 synaptic connections. Let's assume that each synaptic weight is an analog variable with at most 10 bits of precision (1024 discrete levels). Then the total memory capacity would be 10^16 bits, which is about 1.25 petabytes (roughly 1000 terabytes). This is probably correct to within 1-2 orders of magnitude.

Even if we were to assume this was true (and it is an assumption to say that synapses represent a set number of bits and that they alone form memories), it does not show the memory capacity of the brain; it shows how much memory it takes to record all synapses.

To use an example: we still have no idea how vision is converted into memory. It is entirely too simplistic, and based on no good or relevant data, to suggest things like "the brain has X memory, the eye produces X bits per second, therefore the brain can hold Y seconds of vision", as some people I have met have suggested. There is no good way to approach this question at the moment. Synapses are not computer components, and to even begin to answer it we are going to need a far better understanding of memory consolidation (http://en.wikipedia.org/wiki/Memory_consolidation), among other things.
 
  • #5
Just because we don't know how the memories are processed or encoded doesn't preclude us from estimating how much storage capacity is available. I can be totally ignorant of the algorithms my computer uses to store information on its hard drive, but no matter what algorithms it uses, it only has a certain amount of storage capacity available. As I said, if memories are stored as synaptic weights (by no means certain, but a reasonable assumption), I maintain my estimate is a reasonable one.
 
  • #6
phyzguy said:
if memories are stored as synaptic weights (by no means certain, but a reasonable assumption), I maintain my estimate is a reasonable one.

As I said, this is too simplistic an interpretation. There are many other processes occurring in the brain that affect neural networks, and many structures other than synapses are involved. This is why we cannot properly tackle this problem: we don't know what it is about the brain that creates consciousness and enables memory.

Potentially you could be entirely correct but I am unprepared to start working out or accepting numbers on the basis of premises that have not met their burden of proof.
 
  • #7
phyzguy said:
Just because we don't know how the memories are processed or encoded doesn't preclude us from estimating how much storage capacity is available. I can be totally ignorant of the algorithms my computer uses to store information on its hard drive, but no matter what algorithms it uses, it only has a certain amount of storage capacity available. As I said, if memories are stored as synaptic weights (by no means certain, but a reasonable assumption), I maintain my estimate is a reasonable one.
Phyzguy, we don't allow personal opinions as scientific answers. Since you claim your personal theory is an acceptable answer, you must post the scientific research, published in an accepted peer-reviewed journal, that shows your information has been accepted by the scientific community.
 
  • #8
ryan_m_b said:
it does not show the memory capacity of the brain; it shows how much memory it takes to record all synapses.

I agree with that. It raises the interesting question of whether anyone has been able to create a mathematical model of a richly connected neural network in which the weight at each synapse corresponds to a memory (a sketch of one such model is given after this post). To answer this, one would have to distinguish between a "memory" and a "property".

For example, in a computer memory chip, it would require many bits of information to describe how it was organized as a physical object if you started at the level of detail that gave the approximate positions of its atoms. But that sort of detail is what I think of as a "property" of the chip. Once we understand how the chip implements memory (from the perspective of what the computer user calls memory), we can focus on a different level of detail to estimate it.

In animals, there is the problem of distinguishing between memory and adaptive behavior. For example, in the Pavlovian type of conditioning animals "learn" certain behavior through experiences, but this doesn't mean they have much memory of those experiences. Does a mechanic who "remembers" how to take apart an engine do this only by recalling a collection of facts, or does he do it mainly by exercising certain patterns of behavior? As another example, which fact has more information: your maternal grandmother's middle name or the fact that matrix multiplication is associative?
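As an aside on the question above about a network model in which the weight at each synapse corresponds to a memory: the classic example is the Hopfield network (Hopfield, 1982), where binary patterns are written into a symmetric weight matrix by a Hebbian rule and recalled by letting the network settle. The sketch below is a minimal illustration only; the pattern count, network size, and noise level are arbitrary choices, not claims about the brain.

Code:
import numpy as np

# Minimal Hopfield-style associative memory: the stored "memories" live
# entirely in the symmetric weight matrix W, built by a Hebbian
# outer-product rule. Sizes and patterns are arbitrary, for illustration.
rng = np.random.default_rng(0)
N = 100                                       # number of model "neurons"
patterns = rng.choice([-1, 1], size=(3, N))   # three stored +/-1 patterns

W = np.zeros((N, N))
for p in patterns:
    W += np.outer(p, p)                       # Hebbian: co-active units strengthen
np.fill_diagonal(W, 0)                        # no self-connections
W /= N

def recall(cue, steps=10):
    """Let the network settle from a noisy cue toward a stored pattern."""
    s = cue.copy()
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1                         # break ties arbitrarily
    return s

# Flip 10% of the first stored pattern and check whether it is recovered.
cue = patterns[0].copy()
flipped = rng.choice(N, size=10, replace=False)
cue[flipped] *= -1
recovered = recall(cue)
print("bits matching stored memory:", int(np.sum(recovered == patterns[0])), "of", N)

Whether anything like this weight-based storage is what real brains actually do is, of course, exactly what the rest of this thread is arguing about.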
 
  • #9
Evo said:
Phyzguy, we don't allow personal opinions as scientific answers.

We don't?
I do.
 
  • #10
ryan_m_b said:
To use an example: we still have no idea how vision is converted into memory. It is entirely too simplistic, and based on no good or relevant data, to suggest things like "the brain has X memory, the eye produces X bits per second, therefore the brain can hold Y seconds of vision", as some people I have met have suggested.
Perhaps it only takes a few tens of bits to record a memory: an ordering of concepts that are stored in a special long-term memory.

Like a puzzle, it stores which bricks you need, but not the shape and size of each brick, and the blanks get filled in by an approximation, akin to how your vision sometimes tricks you because the brain is filling in the gaps.
 
  • #11
Hells said:
Perhaps it only takes a few tens of bits to record a memory: an ordering of concepts that are stored in a special long-term memory.

Like a puzzle, it stores which bricks you need, but not the shape and size of each brick, and the blanks get filled in by an approximation, akin to how your vision sometimes tricks you because the brain is filling in the gaps.

Perhaps, though I think a few tens of bits is too low. It could also be that, because our brain remembers mnemonically, new memories don't take up much space since they are linked in with old ones.

However, thanks to the limitations in our knowledge described above, it's impossible to have a meaningful conversation addressing the OP's question. Anything else is just speculation.
 
  • #12
Stephen Tashi said:
In animals, there is the problem of distinguishing between memory and adaptive behavior. For example, in the Pavlovian type of conditioning animals "learn" certain behavior through experiences, but this doesn't mean they have much memory of those experiences. Does a mechanic who "remembers" how to take apart an engine do this only by recalling a collection of facts, or does he do it mainly by exercising certain patterns of behavior?

Well, the mechanic does rely on information stored in his brain, albeit of a different kind. What you call 'patterns of behaviour' is also known as muscle memory.

Please clarify if that is not what you meant.
 
  • #13
Evo said:
Phyzguy, we don't allow personal opinions as scientific answers. Since you claim your personal theory is an acceptable answer, you must post the scientific research, published in an accepted peer-reviewed journal, that shows your information has been accepted by the scientific community.

I understand and agree with the rules. This is not my personal opinion; it is a widely held view in neural research, dating from the pioneering work of Hebb in 1949. Here are two peer-reviewed articles, but a search will find many others.

(1) Lisman, John, Proc. Natl. Acad. Sci. USA, Vol. 86, pp. 9574-9578, December 1989 (Neurobiology; Department of Biology, Brandeis University, Waltham, MA 02254).

Here's a quote from the first paragraph,

"In many types of neural network models, learning occurs by
the bidirectional modification of synaptic weights according
to simple activity-dependent rules that resemble the Hebb
and anti-Hebb rules described below. Such networks can
store multiple memories (1), develop selectivity for input
patterns (2), find optimum solutions (3), and organize topo-
graphic maps (4). These theoretical results suggest that
understanding how synaptic weights are stored and modified
is important for understanding brain function in general, and
learning and memory in particular."

(2) Brunel, Nicolas, et al., Neuron, Vol. 43, pp. 745-757, September 2, 2004 (Neurophysique et Physiologie, Universite Rene Descartes, 45 rue des Saints Peres, 75270 Paris, France).


Here's a quote from the first paragraph,

"Distributed changes of synaptic efficacy are thought to
underlie much of the learning and memory occurring
in the brain (Hebb, 1949). The distribution of synaptic
weights should thus be related to what has been learned
and the manner of its learning. However, few studies
have attempted to exploit this link, despite measure-
ments of synaptic weight distributions becoming avail-
able for an increasing number of connections."
 
  • #14
phyzguy said:
I understand and agree with the rules. This is not my personal opinion; it is a widely held view in neural research, dating from the pioneering work of Hebb in 1949. Here are two peer-reviewed articles, but a search will find many others.

(1) Lisman, John, Proc. Natl. Acad. Sci. USA, Vol. 86, pp. 9574-9578, December 1989 (Neurobiology; Department of Biology, Brandeis University, Waltham, MA 02254).

Here's a quote from the first paragraph,

"In many types of neural network models, learning occurs by
the bidirectional modification of synaptic weights according
to simple activity-dependent rules that resemble the Hebb
and anti-Hebb rules described below. Such networks can
store multiple memories (1), develop selectivity for input
patterns (2), find optimum solutions (3), and organize topo-
graphic maps (4). These theoretical results suggest that
understanding how synaptic weights are stored and modified
is important for understanding brain function in general, and
learning and memory in particular."

(2) Brunel, Nicolas, et al., Neuron, Vol. 43, pp. 745-757, September 2, 2004 (Neurophysique et Physiologie, Universite Rene Descartes, 45 rue des Saints Peres, 75270 Paris, France).


Here's a quote from the first paragraph,

"Distributed changes of synaptic efficacy are thought to
underlie much of the learning and memory occurring
in the brain (Hebb, 1949). The distribution of synaptic
weights should thus be related to what has been learned
and the manner of its learning. However, few studies
have attempted to exploit this link, despite measure-
ments of synaptic weight distributions becoming avail-
able for an increasing number of connections."
This does not support you.
 
  • #15
Evo said:
This does not support you.

Can you kindly explain why not? My original statement was that, if memory is stored as synaptic weights, then a calculation of how much memory can be stored in the brain is possible, and I presented one such calculation. I presented two papers that clearly state that synaptic weights are thought to relate to memory storage in the brain. Quoting again: 'Distributed changes of synaptic efficacy are thought to underlie much of the learning and memory occurring in the brain (Hebb, 1949).' What is it you are objecting to? Is it the calculation of total synaptic weights? Or is it the thesis that synaptic weights are thought to be the location of memory storage?
 
  • #16
phyzguy said:
Can you kindly explain why not? My original statement was that, if memory is stored as synaptic weights, then a calculation of how much memory can be stored in the brain is possible, and I presented one such calculation. I presented two papers that clearly state that synaptic weights are thought to relate to memory storage in the brain. Quoting again: 'Distributed changes of synaptic efficacy are thought to underlie much of the learning and memory occurring in the brain (Hebb, 1949).' What is it you are objecting to? Is it the calculation of total synaptic weights? Or is it the thesis that synaptic weights are thought to be the location of memory storage?
Point out where they specifically showed that memory capacity in the brain can be calculated, as you claimed. I don't see it.
 
  • #17
"synaptic weights" are tip of the ice berg. Firstly, they're not fixed quantities and they don't have fixed states (ltp and ltd are examples of this).

There are several modulatory aspects of neurons at the molecular level that cause slightly different protein conformations, which produce slightly different micro-level results that can have significant macro-level consequences (this is largely the focus of pharmaceuticals nowadays: target-specific allosteric modulation).

There are also diffusive information flows, generally laterally across networks that act as analog filters, canceling asynchronous signals across a sensory organism.

Eventually, you have to discuss Markov states when considering receptor kinetics.

All of this seems to strongly refute a digital view.
 
  • #18
I admit that my calculation of potential memory capacity was speculative, although I think it was well founded. Since it appears I have run afoul of the rules, I withdraw my estimate. Apologies if I muddied the water.
 

What is brain memory capacity?

Brain memory capacity refers to the amount of information that can be stored and retrieved by the brain. It is the total number of memories that can be formed and retained by an individual.

Is there a limit to brain memory capacity?

There is no definitive answer to this question. Some studies suggest that the human brain can store up to 2.5 petabytes of information, while others argue that the capacity is virtually unlimited. It is also important to note that the ability to retrieve memories plays a significant role in determining the brain's memory capacity.

How does the brain store and retrieve memories?

The storage and retrieval of memories involve complex processes in the brain, including the formation of new neural connections and strengthening of existing ones. This process, known as neuroplasticity, allows the brain to adapt and learn from new experiences.

Can brain memory capacity be improved?

Yes, brain memory capacity can be improved through various techniques such as practicing mnemonic devices, engaging in regular physical exercise, and getting enough sleep. These activities can help enhance memory formation and retention in the brain.

What factors can affect brain memory capacity?

Several factors can impact brain memory capacity, including age, genetics, physical and mental health, and lifestyle. As we age, our brain's ability to form and retrieve memories may decline due to natural changes in the brain. Certain medical conditions, such as Alzheimer's disease, can also affect memory capacity.
