
Medical The Blue Brain Project

  1. Jul 24, 2009 #1
    I ran across this interesting article.

    http://news.bbc.co.uk/2/hi/technology/8164060.stm

    EDIT: I think the tech writer got something wrong when he wrote 1 laptop = 1 neuron, at least in terms of Hz. I really don't know what was meant by that. A 10,000-neuron network might work for a snail. If they're considering the processors/laptops as "super neurons", the number of interconnections will still be way too small to model the brain. Perhaps someone can enlighten me.
     
    Last edited by a moderator: Aug 5, 2009
  3. Jul 24, 2009 #2

    berkeman

    Staff: Mentor

    I hope they are factoring in the advances in Functional MRI scans of the brain -- pretty interesting progress recently!

    http://www.fmri.org/fmri.htm [Broken]

     
    Last edited by a moderator: May 4, 2017
  4. Jul 24, 2009 #3
    I think they're using a lot of fMRI data in their reconstructions of neocortical columns and other forebrain structures. I also think the headline is a bit misleading. I don't think they're trying to reconstruct the whole brain; they're focusing on the neocortex, where the "highest" intellectual processing occurs. This is the brain structure that has undergone the greatest expansion relative to our recent ancestors, not to mention rats. However, many inputs from other parts of the brain (Broca's area, the hypothalamus, the thalamus, etc.) might be simulated in software, for example for language acquisition. I have some background in neuro-pharmaceutical R&D, but I don't understand the full scope of this project.

    In terms of computational issues, whole-brain modeling would involve the processing power of 10^11 neurons at about 200 Hz each, or about 2 x 10^13 Hz (2 x 10^7 MHz). Assuming 10,000 laptops at 2.5 GHz each, for a total of 2.5 x 10^13 Hz, they would seem to have enough processing power.
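    That back-of-envelope comparison is easy to check in a few lines. This is just a sketch using the round figures quoted in this post (not measured values), and it inherits the article's crude equivalence of one clock cycle to one neural event:

```python
# Rough capacity comparison using the figures quoted above.
NEURONS = 10**11        # approximate neuron count, whole human brain
NEURON_RATE_HZ = 200    # rough per-neuron firing/processing rate
LAPTOPS = 10_000
LAPTOP_HZ = 2.5e9       # 2.5 GHz per laptop

brain_hz = NEURONS * NEURON_RATE_HZ    # total neural "events" per second
cluster_hz = LAPTOPS * LAPTOP_HZ       # total clock cycles per second

print(f"brain:   {brain_hz:.1e} Hz")    # 2.0e+13
print(f"cluster: {cluster_hz:.1e} Hz")  # 2.5e+13
print(f"ratio:   {cluster_hz / brain_hz:.2f}")  # 1.25
```

    On these numbers the cluster comes out slightly ahead, which is the point being made; the caveat (raised later in the thread) is that raw Hz says nothing about interconnection density.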



    http://acceleratingfuture.com/michael/blog/2009/06/computing-power-does-matter-for-ai/
     
    Last edited by a moderator: May 4, 2017
  5. Jul 24, 2009 #4

    Borek

    Staff: Mentor

    Laptops at 5 MHz? The very first portable I ever used, back in the eighties, ran at 12 MHz...
     
  6. Jul 24, 2009 #5
    You guys are too fast. I was in the process of editing. Gimme a few minutes with this.
     
  7. Jul 24, 2009 #6
    OK, you can look now. I'm on a slow wireless network.
     
  8. Jul 25, 2009 #7

    Borek

    Staff: Mentor

    While I definitely agree that we have (or will soon have) enough computing power, it seems to me we still lack some crucial knowledge, and that makes me doubt our ability to create AI so fast.

    There is a small insect here, Ammophila sabulosa (see http://en.wikipedia.org/wiki/Sphecidae), that knows how to build a nest in the ground, how to mask it, how to find it later (and it ranges over a radius of many meters) and open it, how to paralyze a large caterpillar, how to put an egg in it, and how to transport the caterpillar back to the nest. That's all apart from knowing how to feed itself, fly, and mate. Quite an achievement when you consider that its brain has just a few million neurons. So far I have not heard that we are able to simulate it, and it should be easily doable with the computing power at hand.

    Another take on the same subject: there have been several attempts at making a robotic vacuum cleaner or lawn mower. You need AI for that, AI comparable to a small insect's. Again, we have the processing power, but we don't have robotic vacuum cleaners. (OK, I have seen them sold, but obviously they were not good enough to survive on the market.)

    So my take is that we are still missing something. I am sure we will get it sooner or later; we are just not there yet.
     
  9. Jul 25, 2009 #8
    I think the point Michael Anissimov (link, post #3) was making is that computing power has been undervalued by many critics of AI. His view of the "middle way" is combining processing power with high-resolution 'brain scans' (really fMRIs) to get the right architecture. Henry Markram is a heavyweight in this area, so while 10 years may be optimistic, I think he's on to something. I'm sure there's much more to this project than this short article even hints at. For example, the 10,000 laptops make a bad analogy (far too few interconnections between high-power nodes). Brainpower results (most believe) from the extremely dense network of interconnections between relatively low-power nodes.

    http://bluebrain.epfl.ch/

    EDIT: OK. Now I think I understand the one computer = one neuron bit. It apparently takes one laptop to store all the descriptive information for one neuron, and there are about 10,000 distinct neurons in one neocortical column, the basic repeating unit of mammalian neocortical structure.
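    Taking those numbers at face value makes the scope clear: one laptop per neuron covers exactly one column, and the whole-brain neuron count quoted in post #3 shows how far that is from a brain simulation. A sketch using only the round figures from the thread:

```python
NEURONS_PER_COLUMN = 10_000   # neurons in one neocortical column (per the article)
LAPTOPS = 10_000              # one laptop per simulated neuron
WHOLE_BRAIN_NEURONS = 10**11  # whole-brain figure quoted earlier in the thread

columns_covered = LAPTOPS // NEURONS_PER_COLUMN
columns_in_brain = WHOLE_BRAIN_NEURONS // NEURONS_PER_COLUMN

print(columns_covered)   # 1 column with the current cluster
print(columns_in_brain)  # 10000000 columns at this level of detail
```

    So the project as described is a single-column simulation; scaling it naively would take ten million times the hardware.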

    http://www.neuroinformatics2008.org/congress-movies/Henry%20Markram.flv/view [Broken]
     
    Last edited by a moderator: May 4, 2017
  10. Jul 27, 2009 #9
    I'm re-posting this link because it was added as a late edit to the previous post. It's a lecture by Henry Markram on the Blue Brain Project, lasting about 1 hr 20 min including questions.

    http://www.neuroinformatics2008.org/congress-movies/Henry%20Markram.flv/view [Broken]
     
    Last edited by a moderator: May 4, 2017
  11. Jan 14, 2011 #10
    I'm coming into this discussion late, but this is an ongoing project, so what the hay.

    I think fMRI is not important at this stage of the project, and here's why: fMRI gives you an overall picture of brain activity in a non-invasive way. What Henry Markram is trying to do is create the most accurate synthetic neurons possible. One technique he is using is patching tiny electrodes directly onto individual neurons in living rat brains to record their electrical activity. He has automated robots doing this on a large scale, sending electrical signals and recording the responses, each one patched into a different neuron type.

    I'm sure this number has changed dramatically, but I heard that each simulated neuron has 200 separate simulations running to approximate the behavior of a real neuron of the same type. This is why one consumer-level computer can only simulate one neuron. I am sure that as the simulations improve, the computing power required will drop. However, from what I have heard, Markram wants to take it to the next level and simulate at the molecular level. This would obviously increase processing requirements, because then you have to understand and simulate ion exchange, how these ions are transferred through the axon membranes, and any other molecular processes important to information exchange and storage. Since he wants to model brain diseases, even molecular processes unimportant to information exchange and storage become important. For instance, activity inside the cell nucleus becomes important in understanding Alzheimer's.
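    If each simulated neuron really does run about 200 sub-simulations, the per-column workload is easy to tally. This is a rough sketch; the 200 figure is the poster's recollection, not an official project number:

```python
NEURONS_PER_COLUMN = 10_000  # neurons per neocortical column, from the article
SIMS_PER_NEURON = 200        # sub-simulations per neuron, as recalled above

sims_per_column = NEURONS_PER_COLUMN * SIMS_PER_NEURON
print(sims_per_column)  # 2000000 concurrent sub-simulations for one column
```

    Two million concurrent sub-simulations for a single column helps explain the one-laptop-per-neuron allocation, and why a move to molecular-level modeling would multiply the requirements again.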
     