Powers of The Brain

  • Medical
  • Thread starter Nuklear
  • Start date
  • #1

Main Question or Discussion Point

The brain is supposed to be the most sophisticated piece of electronics there is. It can do things we can't dream of with our current computers.

I want to know how many of the world's most sophisticated computers it would take to equal the output of our brain's processing. Is our extreme computational power due to our internal programming, or to the fact that we have far more neurons than chips have transistors?

From what I've heard, today's most powerful computer doesn't have the brainpower of a housefly. Is it because neurons are so much smaller than transistors? Which is faster, our brain or a computer? I'm guessing our brain has far more memory.
 

Answers and Replies

  • #2
Our brain has pretty much infinite memory, considering our lifespans. There are estimates, but they're not exactly based on anything concrete; they're more guesstimates. And no one entirely understands how the brain does what it does so effectively, or even how memories are stored, so we're pretty much in the dark at the moment: we're making good progress on finding out which parts of the brain do what, but not getting very far on exactly the processes involved.

Computers are still pretty dumb when it comes down to it, but then evolution has had nigh on 4 billion years of dead ends and misadventures to end up with something this complex.

Computers are useful for doing extraordinarily complex calculations quickly, they pretty much rule in this regard, and for storing vast amounts of easily accessible information without the need for vast warehouses of paper, or for doing mind numbingly repetitive experiments, or for sorting information etc. But for most other things, particularly those involving the creative process or abstract thinking, it's better to rely on the human mind, at least for now.

Here's an interesting web site I found:

Bear in mind that guesses of this sort are by their nature very speculative, so this theorising should not be taken as SCIENCE.

http://www.totse.com/en/fringe/fringe_science/merkle1.html [Broken]

How Many Bytes in Human Memory?
by Ralph C. Merkle

(appeared in Foresight Update No. 4, 1988)

(merkle.pa@xerox.com)

Today it is commonplace to compare the human brain to a computer, and the human mind to a program running on that computer. Once seen as just a poetic metaphor, this viewpoint is now supported by most philosophers of human consciousness and most researchers in artificial intelligence. If we take this view literally, then just as we can ask how many megabytes of RAM a PC has, we should be able to ask how many megabytes (or gigabytes, or terabytes, or whatever) of memory the human brain has.

Several approximations to this number have already appeared in the literature based on 'hardware' considerations (though in the case of the human brain perhaps the term 'wetware' is more appropriate). One estimate of 10**20 bits is actually an early estimate (by Von Neumann in 'The Computer and the Brain') of all the neural impulses conducted by the brain during a lifetime. This number is almost certainly larger than the true answer. Another method is to estimate the total number of synapses, and then presume that each synapse can hold a few bits. Estimates of the number of synapses have been made in the range from 10**13 to 10**15 -- with corresponding estimates of memory capacity.

A fundamental problem with these approaches is that they rely on rather poor estimates of the raw hardware in the system. The brain is highly redundant and not well understood: the mere fact that a great mass of synapses exists does not imply that they are in fact contributing to the memory capacity. This makes the work of Thomas K. Landauer very interesting for he has entirely avoided this hardware guessing game by measuring the actual functional capacity of human memory directly ('How Much Do People Remember? Some Estimates of the Quantity of Learned Information in Long-term Memory' in Cognitive Science 10, 477- 493, 1986).

Landauer works at Bell Communications Research -- closely affiliated with Bell Labs, where the modern study of information theory was begun by C. E. Shannon to analyze the information-carrying capacity of telephone lines (a subject of great interest to a telephone company). Landauer naturally used these tools by viewing human memory as a novel 'telephone line' that carries information from the past to the future. The capacity of this 'telephone line' can be determined by measuring the information that goes in and the information that comes out -- the great power of modern information theory can be applied.

Landauer reviewed and quantitatively analyzed experiments by himself and others in which people were asked to read text; look at pictures; and hear words, short passages of music, sentences, and nonsense syllables. After delays ranging from minutes to days, the subjects were then tested to determine how much they had retained. The tests were quite sensitive (they did not merely ask 'What do you remember?'), often using true/false or multiple choice questions, in which even a vague memory of the material would allow selection of the correct choice. Often, the differential abilities of a group that had been exposed to the material and another group that had not been exposed were used: the difference in scores between the two groups was used to estimate the amount actually remembered (to control for the number of correct answers an intelligent human could guess without ever having seen the material). Because experiments by many different experimenters were summarized and analyzed, the results of the analysis are fairly robust; they are insensitive to fine details or specific conditions of one or another experiment. Finally, the amount remembered was divided by the time allotted to memorization to determine the number of bits remembered per second.

The remarkable result of this work was that human beings remembered very nearly two bits per second under ALL the experimental conditions. Visual, verbal, musical, or whatever -- two bits per second. Continued over a lifetime, this rate of memorization would produce somewhat over 10**9 bits, or a few hundred megabytes.
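The lifetime figure above is easy to reproduce with back-of-the-envelope arithmetic. A minimal sketch, assuming a round 70-year lifespan (my assumption for illustration, not a number from the paper):

```python
# Rough check of Landauer's estimate: 2 bits/second retained over a lifetime.
bits_per_second = 2
seconds_per_year = 3600 * 24 * 365                        # ~3.15e7 seconds
lifetime_bits = bits_per_second * seconds_per_year * 70   # assumed 70-year span

lifetime_megabytes = lifetime_bits / 8 / 1e6

print(lifetime_bits)       # ~4.4e9: somewhat over 10**9 bits
print(lifetime_megabytes)  # ~550: "a few hundred megabytes"
```

The exact lifespan chosen barely matters; anything from 50 to 100 years lands in the same "few hundred megabytes" ballpark.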

While this estimate is probably only accurate to within an order of magnitude, Landauer says 'We need answers at this level of accuracy to think about such questions as: What sort of storage and retrieval capacities will computers need to mimic human performance? What sort of physical unit should we expect to constitute the elements of information storage in the brain: molecular parts, synaptic junctions, whole cells, or cell-circuits? What kinds of coding and storage methods are reasonable to postulate for the neural support of human capabilities? In modeling or mimicking human intelligence, what size of memory and what efficiencies of use should we imagine we are copying? How much would a robot need to know to match a person?'

What is interesting about Landauer's estimate is its small size. Perhaps more interesting is the trend -- from Von Neumann's early and very high estimate, to the high estimates based on rough synapse counts, to a better supported and more modest estimate based on information theoretic considerations. While Landauer doesn't measure everything (he did not measure, for example, the bit rate in learning to ride a bicycle nor does his estimate even consider the size of 'working memory') his estimate of memory capacity suggests that the capabilities of the human brain are more approachable than we had thought. While this might come as a blow to our egos, it suggests that we could build a device with the skills and abilities of a human being with little more hardware than we now have -- if only we knew the correct way to organize that hardware.
http://www.springerlink.com/content/r1q573654pw65373/

Another paper, theorising on memory capacity.

[tex]10^{8432}\ \text{bits}[/tex]

Works out at

[tex]1.1641532 \times 10^{8422}\ \text{GB}[/tex]

That's quite a lot :smile:
 
  • #3
10^15 bits at 10^16 OPS is the consensus estimate of the brain's information capacity and processing rate among cog-sci and AI researchers, as pointed out above from http://www.merkle.com/brainLimits.html

10^8432 bits is 10^8342 times as much information as is contained in the observable universe down to quantum events [10^90 bits]! That paper was rather strange. BTW, the Bekenstein Bound limits the amount of information that can be contained in any region of space: configured as an ultradense, ultrahot computer, the 1 kg brain could only represent about 10^31 bits at 10^51 OPS, with every quantum observable spin of every particle harnessed as a bit (http://www.edge.org/3rd_culture/lloyd/lloyd_index.html). The amount of information that matter can represent has been well known and quantified since Maxwell and Boltzmann. I think Wang/Liu/Wang are basing their estimates on unbounded geometrical progressions of synaptic connections without taking into account the physical limits imposed by entropy. The real brain can only utilize the bits of its 10^14-10^15 synapses, so at any given time the maximum amount of human memory must be configured as a percentage of those connections only.
 
  • #4
OK, I'm going to go with these more realistic figures. But how many bits can the best supercomputer store? That would be interesting. I had a lot of trouble trying to find out; I think I got 16 terabytes of on-board memory and 162 terabytes of hard drive space, but that was from an old website.

I think he's talking about the number of connections that could be formed given x amount of time, not the amount given x lifespan? Although I'm not sure... I doubt we're talking about quantum information here?

I'm sure the complexity of the brain would make it a bit more than your figures, given that we have no idea how the brain stores memory, but let's face it, this is entirely in the realm of speculation.

Don't forget we're not just talking about any given time, but also about the longer-term store.
 
  • #5
Well, the top supercomputer in the world at the moment is the IBM BlueGene/L, which can operate at 280.6 teraflops [record]. It has 65,536 nodes, with two CPUs per node. There's 512MB of DDR RAM per node, so 65,536 × 512MB = 32TB of memory. It also has a total of 806TB of hard drive space.
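The memory figure multiplies out exactly in binary units, since both factors are powers of two. A quick sketch of that arithmetic only (variable names are mine, not from any BlueGene spec sheet):

```python
# 65,536 nodes x 512 MB of DDR RAM per node, in binary units (1 TB = 1024**2 MB)
nodes = 65536            # 2**16
mb_per_node = 512        # 2**9
total_mb = nodes * mb_per_node   # 2**25 MB
total_tb = total_mb / 1024**2    # exactly 32 TB
print(total_tb)
```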
 
  • #6
I think he's talking about the number of connections that could be formed given x amount of time, not the amount given x lifespan?

I think you are on the right track. In fact, I think they base their calculation on the possible states that a 10^15-bit system can have, which would be 2^10^15 possible configurations. [Someone with a more powerful calculator than Google will have to tell us if 10^8432 is close to 2^1000000000000000.]
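The bracketed question doesn't need a big-number calculator: it's enough to compare base-10 exponents, since log10(2^n) = n * log10(2). A short sketch of that arithmetic (names are mine):

```python
import math

# Express 2**(10**15) as a power of ten by taking its base-10 logarithm.
exp10_of_states = 1e15 * math.log10(2)   # ~3.01e14

# So 2**(10**15) = 10**(~3.01e14), while the paper's figure is 10**8432.
# It isn't close: the exponents themselves differ by a factor of over 10**10.
print(exp10_of_states)
print(exp10_of_states / 8432)
```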

If so, this is more than a potential memory capacity, as the set of possible states would by definition include every active brain state of everyone who has ever lived, in every moment of their lives; plus every possible moment of any human who could have lived in any universe; plus any and every possible mind state of any other human-level intelligence, either naturally evolved or artificially created, in every possible universe. And that would still be only a tiny subset of the possible states of 10^15 bits, which would also include a great many more useless states that don't correspond to ANY form of intelligence, or even any form of definable matter.
 
