How are the electrodes in a brain machine interface placed?

  1. I have a couple of questions, actually.
    One: in what part of the brain would an interface that allowed you to access the internet be placed in the future? It's not like the brain has an expansion slot.

    Two: how do they decide where to place electrodes in current interfaces? Do they just say, "hey, this looks like the right part of the brain," and stick it in, or do they target a specific neuron?

    Three: how do they actually stimulate a neuron? Do the electrodes penetrate the cell, or do they just stimulate it with an EM field?
     
  3. Ryan_m_b

    Staff: Mentor

    You can't access the internet via your brain even with electrodes strapped to it. You could control a mouse cursor on a computer, but I doubt that's what you're asking. Anything more speculative is beyond the scope of this forum.

    Neurons are too small to target individually; instead, electrodes are placed in a specific area of the brain.
     
  4. Hmm,

    First off, these electrodes in the brain READ signals; they take their input FROM the brain.

    That means there is no way to WRITE information to the brain, so you can't be connected to the internet directly, since there is no way to write visual/auditory/touch information into the brain.

    The brain's inputs are the eyes, ears, nose, tongue, and the spinal cord.

    In order to connect the brain directly to the internet you would need to "hijack" all of those inputs, which is simply not possible with modern biotech. For one thing, we don't even know the "code" the eyes use; we have no idea how to write signals to the brain in the form of visual information. Biology does this, but how it does it is still mostly a mystery.

    Plus, hijacking the brain's inputs would likely mean your only vision would be through the machine, since your eyes would then be disconnected from your brain.
     
  5. No. Any time invasive interference with normal brain function is considered, there is a very thorough process of stimulation mapping to decide where to implant the electrodes. While we can generally delineate regions of the cortex, such as the primary motor cortex and its subareas, through stereotaxic maps, etc., there is sufficient inter-individual variability that these regions need to be mapped out electrophysiologically, typically in patients who are awake under local anesthesia.
     
  6. Here is an article in which a rat received a brain implant connected to an infrared sensor:
    http://www.nicolelislab.net/wp-cont...Rats-the-Ability-to-Touch-Infrared-Light1.pdf
    The IR information was fed into an area normally used by rats for tactile sensations from their whiskers. The experiment demonstrated that, over the course of a month, the rats were able to learn to use the new IR sense - and it suggests that this came at no cost to their ability to sense through their whiskers.

    It is difficult to determine what would be a "reasonable extrapolation" of these results, but I will take a shot at it.
    In the experiment, information was fed to the rats' brains as an analog signal, FM-encoded and updated (i.e., sampled) at 20 Hz.

    Normally, web information is presented in a GUI (Graphical User Interface), but in this case we would be presenting the data through a different medium. We can call it a Cortical User Interface (CUI).
    A key parameter of any interface is its information bandwidth - how fast you can talk to it. No real attempt was made to measure that with the rats. But if we are tapping into the human tactile sense, the ability of people to use braille gives us a clue.
    Here is an interesting website. It not only describes how many words per minute can be learned, but the site itself is intended to be braille friendly:
    https://nfb.org/Images/nfb/Publications/bm/bm99/bm990604.htm
    Braille is a six-pin code. For our brain interface, we may want to start with proven methods, so let's use six electrodes. 200 to 400 words per minute works out to about 20 to 40 characters per second (assuming five to six characters per word) - interestingly close to the 20 Hz rate used with the rats. Ideally, the data transfer rate would be under the control of our user - so however our user selects the content they are reading, they will also need to be able to control the transfer rate.

    Of course, the most obvious use for this would be for someone who cannot use braille - because they lack a sense of touch or cannot control motion. But by bypassing the sensory neurons, there is the potential for braille-like input that is easier and faster to interpret than tactile braille.
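    The back-of-envelope estimate above can be sketched in code. This is purely illustrative: the five-characters-per-word convention, the six-bit cell codes, and the framing of one braille cell per electrode "frame" are assumptions for the sketch, not anything measured in the rat experiment.

    ```python
    # Back-of-envelope model of a six-electrode "cortical braille" channel,
    # using the figures quoted above. All constants are assumptions.

    AVG_CHARS_PER_WORD = 5   # common convention behind words-per-minute figures
    ELECTRODES = 6           # one electrode per braille dot
    STIMULUS_RATE_HZ = 20    # update rate used in the rat IR experiment

    def chars_per_second(words_per_minute: float) -> float:
        """Convert a reading rate in wpm to characters per second."""
        return words_per_minute * AVG_CHARS_PER_WORD / 60

    for wpm in (200, 400):
        print(f"{wpm} wpm ~ {chars_per_second(wpm):.0f} chars/sec")
    # Both ends of the range land at the same order of magnitude as the
    # 20 Hz stimulus rate used with the rats.

    def char_to_dots(cell: int) -> list:
        """Unpack a 6-bit braille cell code into per-electrode on/off values."""
        return [(cell >> i) & 1 for i in range(ELECTRODES)]

    # e.g. braille 'a' is dot 1 only -> cell code 0b000001
    print(char_to_dots(0b000001))   # [1, 0, 0, 0, 0, 0]
    ```

    The point of the sketch is only that one braille cell fits naturally into one update across six electrodes, so a 20 Hz update rate would, in principle, support reading speeds comparable to fast tactile braille.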
     