# Brain memory capacity

1. Jun 25, 2011

### MathematicalPhysicist

What scientific tests can be done to measure my (or anyone else's) memory capacity?
And how do you measure the brain's memory capacity? What units do you use? Is it like a PC's RAM or SDRAM, or something else entirely?

2. Jun 25, 2011

### Ryan_m_b

Staff Emeritus
I'm afraid that we do not have a comprehensive view of how the brain stores memories. Commonly we talk about [working memory](http://en.wikipedia.org/wiki/Working_memory), but without an understanding of the physical mechanism we can't make a good comparison to our technological forms of memory.

Last edited by a moderator: May 5, 2017
3. Jun 26, 2011

### phyzguy

While we don't know for sure, it seems likely that memories in the brain are stored in the changing synaptic thresholds at the synapses that connect neurons together. If we accept this, we can make a rough estimate of the brain's memory capacity. There appear to be ~10^11 neurons in the brain, and each neuron has ~10^4 synaptic connections. Let's assume that each synaptic threshold is an analog variable with at most 10-bit accuracy (1024 discrete thresholds). Then the total memory capacity would be 10^16 bits, which is roughly 1000 terabytes. This is probably accurate to within 1-2 orders of magnitude.
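
For what it's worth, the arithmetic in the estimate above can be checked directly. The neuron and synapse counts below are the rough order-of-magnitude figures from the post, not measured values:

```python
# Back-of-envelope check of the synaptic-storage estimate above.
# All figures are the order-of-magnitude values quoted in the post.
neurons = 10**11             # ~number of neurons in the brain (assumed)
synapses_per_neuron = 10**4  # ~synaptic connections per neuron (assumed)
bits_per_synapse = 10        # 1024 distinguishable thresholds (assumed)

total_bits = neurons * synapses_per_neuron * bits_per_synapse
total_terabytes = total_bits / 8 / 10**12  # 8 bits/byte, 10^12 bytes/TB

print(f"{total_bits:.0e} bits ≈ {total_terabytes:.0f} TB")
# prints: 1e+16 bits ≈ 1250 TB  (the post's round "1000 terabytes" figure)
```

Note the result lands at 1250 TB rather than a flat 1000 TB, which is irrelevant at the stated 1-2 orders-of-magnitude precision.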

4. Jun 26, 2011

### Ryan_m_b

Staff Emeritus
Even if we were to assume this was true (and it's an assumption to say that synapses represent a set number of bits, and that they alone form memories), it does not show the memory capacity of the brain; it shows how much memory it takes to record all synapses.

To use an example: we still have no idea how vision is converted into memory. It is entirely too simplistic, and based on no good or relevant data, to suggest things like "the brain has X memory, the eye produces X bits per second, therefore the brain can hold Y seconds of vision", as some people I have met have suggested. There is no good way to approach this question at the moment. Synapses are not computer components, and to even begin to answer this question we are going to need a far better understanding of [memory consolidation](http://en.wikipedia.org/wiki/Memory_consolidation).

5. Jun 26, 2011

### phyzguy

Just because we don't know how the memories are processed or encoded doesn't preclude us from estimating how much storage capacity is available. I can be totally ignorant of the algorithms my computer uses to store information on its hard drive, but no matter what algorithms it uses, it only has a certain amount of storage capacity available. As I said, if memories are stored as synaptic weights (by no means certain, but a reasonable assumption), I maintain my estimate is a reasonable one.

6. Jun 26, 2011

### Ryan_m_b

Staff Emeritus
As I said, this is too simplistic an interpretation. There are many other processes occurring in the brain that affect neural networks, and many other structures besides synapses. This is why we cannot properly tackle this problem: we don't know what it is about the brain that creates consciousness and enables memory.

Potentially you could be entirely correct but I am unprepared to start working out or accepting numbers on the basis of premises that have not met their burden of proof.

7. Jun 26, 2011

### Staff: Mentor

Phyzguy, we don't allow personal opinions as scientific answers. Since you claim your personal theory is an acceptable answer, then you must post the scientific research published in an accepted peer reviewed journal that shows your information has been accepted by the scientific community.

8. Jun 26, 2011

### Stephen Tashi

I agree with that. It raises the interesting question of whether anyone has been able to create a mathematical model of a richly connected neural network where the weight at each synapse corresponds to a memory. To answer this, one would have to distinguish between a "memory" and a "property".

For example, in a computer memory chip, it would require many bits of information to describe how it was organized as a physical object if you started at the level of detail that gave the approximate positions of its atoms. But that sort of detail is what I think of as "properties" of the chip. Once we understand how the chip implements memory (from the perspective of what the computer user calls memory), we focus on a different level of detail to estimate it.

In animals, there is the problem of distinguishing between memory and adaptive behavior. For example, in the Pavlovian type of conditioning, animals "learn" certain behavior through experiences, but this doesn't mean they have much memory of those experiences. Does a mechanic who "remembers" how to take apart an engine do this only by recalling a collection of facts, or does he do it mainly by exercising certain patterns of behavior? As another example, which fact holds more information: your maternal grandmother's middle name, or the fact that matrix multiplication is associative?

9. Jun 26, 2011

### Stephen Tashi

We don't?
I do.

10. Jun 26, 2011

### Hells

Perhaps it only takes a few tens of bits to record a memory: an ordering of concepts that are stored in a special long-term memory.

Like a puzzle, it stores which bricks you need, but not the shape and size of each brick, and the blanks get filled in by approximation, akin to how your vision sometimes tricks you because the brain is filling in the gaps.

11. Jun 26, 2011

### Ryan_m_b

Staff Emeritus
Perhaps, though I think a few tens of bits is too low. It could also be that, since our brain remembers mnemonically, new memories don't take up much capacity because they are linked in with old ones.

However, thanks to the limitations in our knowledge described above, it's impossible to have a meaningful conversation addressing the OP's question. Anything else is just speculation.

12. Jun 26, 2011

### mishrashubham

Well, the mechanic does rely on information stored in his brain, albeit of a different kind. What you call "patterns of behaviour" is also known as muscle memory.

Please clarify if that is not what you meant.

13. Jun 26, 2011

### phyzguy

I understand and agree with the rules. This is not my personal opinion; it is a widely held view in neural research, dating back to the pioneering work of Hebb in 1949. Here are two peer-reviewed articles, but a search will find many others.

(1) Lisman, John. Proc. Natl. Acad. Sci. USA, Vol. 86, pp. 9574-9578, December 1989. Department of Biology, Brandeis University, Waltham, MA 02254.

Here's a quote from the first paragraph:

"In many types of neural network models, learning occurs by the bidirectional modification of synaptic weights according to simple activity-dependent rules that resemble the Hebb and anti-Hebb rules described below. Such networks can store multiple memories (1), develop selectivity for input patterns (2), find optimum solutions (3), and organize topographic maps (4). These theoretical results suggest that understanding how synaptic weights are stored and modified is important for understanding brain function in general, and learning and memory in particular."

(2) Brunel, Nicolas, et al. Neuron, Vol. 43, pp. 745-757, September 2, 2004. Neurophysique et Physiologie, Universite Rene Descartes, 45 rue des Saints Peres, 75270 Paris, France.

Here's a quote from the first paragraph:

"Distributed changes of synaptic efficacy are thought to underlie much of the learning and memory occurring in the brain (Hebb, 1949). The distribution of synaptic weights should thus be related to what has been learned and the manner of its learning. However, few studies have attempted to exploit this link, despite measurements of synaptic weight distributions becoming available for an increasing number of connections."
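
The "simple activity-dependent rules" mentioned in the first quote can be illustrated with a toy Hebbian update. This is a textbook sketch under made-up assumptions (the two-neuron size, learning rate, and patterns are all arbitrary illustration, not biological modeling):

```python
# Toy Hebbian rule: weights grow when pre- and post-synaptic activity
# coincide ("fire together, wire together"); an anti-Hebb rule would
# simply flip the sign of the update.
def hebb_update(weights, pre, post, rate=0.1):
    """One Hebbian step: each weight gains rate * pre_activity * post_activity."""
    return [[w + rate * x * y for x, w in zip(pre, row)]
            for y, row in zip(post, weights)]

# Start with no connections and present one input/output pairing twice.
w = [[0.0, 0.0], [0.0, 0.0]]   # w[j][i]: connection from pre-neuron i to post-neuron j
pre, post = [1, 0], [0, 1]     # input pattern and evoked output pattern
for _ in range(2):
    w = hebb_update(w, pre, post)

print(w)  # only the co-active pre[0] -> post[1] weight has grown (to 0.2)
```

The point of the quoted papers is that networks trained with rules of this shape can store multiple memories in their weight distribution, which is the premise behind the capacity estimate being debated.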

14. Jun 26, 2011

### Staff: Mentor

This does not support you.

15. Jun 26, 2011

### phyzguy

Can you kindly explain why not? My original statement was that, if memory is stored as synaptic weights, then I presented a possible calculation of how much memory can be stored in the brain. I presented two papers that clearly state that synaptic weights are thought to relate to memory storage in the brain. Quoting again, 'Distributed changes of synaptic efficacy are thought to underlie much of the learning and memory occurring in the brain (Hebb, 1949).' What is it you are objecting to? Is it the calculation of total synaptic weights? Or is it the thesis that synaptic weights are thought to be the location of memory storage?

16. Jun 26, 2011

### Staff: Mentor

Point out where they specifically showed that memory capacity in the brain can be calculated, as you claimed. I don't see it.

17. Jun 26, 2011

### Pythagorean

"Synaptic weights" are the tip of the iceberg. Firstly, they're not fixed quantities and they don't have fixed states (LTP and LTD are examples of this).

There are several modulatory aspects to neurons at the molecular level that cause slightly different protein conformations, producing slightly different micro-scale results that can have significant macro-scale consequences (this is largely the focus of pharmaceuticals nowadays: target-specific allosteric modulation).

There are also diffusive information flows, generally lateral across networks, that act as analog filters, canceling asynchronous signals across a sensory organism.

Eventually, you have to discuss Markov states when considering receptor kinetics.

All of this seems to strongly refute a digital view.
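
The receptor-kinetics point can be made concrete with a toy two-state Markov chain for a single channel (closed/open). The transition probabilities below are arbitrary illustrative numbers, not measured rate constants:

```python
import random

# Toy two-state Markov model of a receptor channel: closed <-> open.
# Per-timestep transition probabilities (arbitrary illustrative values).
P_OPEN = 0.2    # probability a closed channel opens this step (assumed)
P_CLOSE = 0.5   # probability an open channel closes this step (assumed)

def simulate(steps, seed=0):
    """Simulate one channel; return the fraction of time spent open."""
    rng = random.Random(seed)
    open_state, time_open = False, 0
    for _ in range(steps):
        if open_state:
            open_state = rng.random() >= P_CLOSE  # stays open with prob 1 - P_CLOSE
        else:
            open_state = rng.random() < P_OPEN    # opens with prob P_OPEN
        time_open += open_state
    return time_open / steps

# The long-run occupancy approaches the stationary value
# P_OPEN / (P_OPEN + P_CLOSE) ≈ 0.286 for these rates.
print(simulate(100_000))
```

The channel's state at any instant is probabilistic, which is one illustration of why a fixed bits-per-synapse accounting is hard to defend.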

18. Jun 27, 2011

### phyzguy

I admit that my calculation of potential memory capacity was speculative, although I think it was well founded. Since it appears I have run afoul of the rules, I withdraw my estimate. Apologies if I muddied the waters.