A question about Bionic eyes and the visual cortex.

  1. Something I've long wondered about is what it would mean for our brain and visual cognition to implant bionic eyes that could process the entirety of the EM spectrum.

    What is the consensus of the scientific community on this hypothesis? Would our visual cortex instantly be able to interpret and perceive stimuli from the whole of the EM spectrum, or are there inherent limitations in our brains that better eyes could not compensate for?
  3. mfb

    Staff: Mentor

    You would need some way to connect the eye to the brain - probably via the existing neurons for visual information. In that case, you can freely choose which color should represent what, assuming you can distinguish all those neurons.
  4. So, in other words, a human could exchange their biological eyes for bionic eyes, and, if these were properly connected to the visual cortex, wake up after the procedure and instantly perceive the whole electromagnetic spectrum without confusion or seizures?

    I ask this for a simple reason: often, when we go to the cinema and are exposed to art films with rapid transitions and unconventional art direction, a considerable number of people feel nauseated or even suffer seizures.

    From what I understand, those seizures have nothing to do with our eyes, but rather with some inability of our brains to interpret such rapid camera movements and color variation.

    If overnight we were able to see the entire EM spectrum, wouldn't we suffer similar side effects?
  5. mfb

    Staff: Mentor

    Are you confused when you look at the image from a thermographic camera? It does the same thing: it captures infrared radiation (as an example) and sends visible light to your eyes; this gets translated to neuron activations in the eye, which are sent to your brain. The bionic eye would just skip the intermediate step of visible light.

    Rapid camera movements and brightness/color variations are an issue for the brain, but you could program the eye to avoid them.
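    To make the "program the eye to avoid them" idea concrete, here is a minimal sketch of one way such firmware could damp sudden brightness jumps: a low-pass filter with a per-frame rate cap. All names and parameter values are assumptions for illustration, not anything from an actual device.

```python
# Hypothetical flicker limiter for a bionic eye's output stream.
# Frame values are normalized luminance in [0, 1].

def smooth_frames(frames, alpha=0.3, max_step=0.1):
    """Limit how fast perceived brightness may change between frames."""
    out = []
    level = frames[0]
    for f in frames:
        # Exponential moving average pulls the output toward the new frame...
        target = alpha * f + (1 - alpha) * level
        # ...and the rate cap bounds the change per frame.
        step = max(-max_step, min(max_step, target - level))
        level += step
        out.append(round(level, 3))
    return out

# A sudden 0 -> 1 flash is spread over many frames instead of arriving at once:
print(smooth_frames([0.0, 1.0, 1.0, 1.0, 1.0]))
```

    The same idea applies to color channels: any transition the brain handles badly can be slowed down in the eye before it ever reaches a neuron.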
  6. But when you look at a thermographic camera, your eyes are still only collecting visible light. If all of a sudden we were able to perceive the whole EM spectrum, we would be exposed to visual stimuli from all directions; the familiar visual world we are used to would become so diluted as to be almost indistinguishable.

    By perceiving the whole EM spectrum we would see microwaves, radio waves, and everything radiating heat - trees, people, inanimate objects.

    We would not only be exposed to all forms of light sources; we would wake up to a completely different reality from the one we see with our biological eyes.

    Currently, the visible spectrum we perceive is approximately 3% of the entire EM spectrum. If we were exposed to it in its entirety, the current visible spectrum would become so diluted and bland that our present visual cognition of reality would stand out less than two almost indistinguishable shades of white.

    Is it safe to assume that such a paradigm shift in visual cognition wouldn't have tremendous implications for our brains?
  7. mfb

    Staff: Mentor

    That just depends on the program code of the eye. You would probably use different modes.

    Where does that value of 3% come from? I don't see a meaningful way to assign a percentage number to that.

    Most of the spectrum is boring - gamma rays, X-rays and hard UV are rare, while near UV would add some information. Infrared radiation is very interesting, terahertz radiation might be interesting, and for everything below that the eye cannot provide the necessary resolution to see structures.
    And that assumes a "perfect" eye - it will be hard to fit many different detectors and the corresponding electronics into such a small space.
  8. Another potential problem that's been overlooked: the relationship between a radiation's wavelength/bandwidth and sensor size. I had always understood that the longer the wavelength (or the wider the bandwidth) to be gathered, the larger the sensor required to collect it efficiently. To be a viable replacement, a synthetic or bionic eye would have to be approximately the size/diameter of a biological eye. Wouldn't this size restriction therefore be a self-limiting factor?
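    The size argument can be put in rough numbers with the standard diffraction limit: the angular resolution of an aperture scales as theta ~ 1.22 * lambda / D (Rayleigh criterion). Holding D fixed at an assumed eye-sized pupil of ~5 mm shows how quickly long wavelengths blur:

```python
import math

def angular_resolution_deg(wavelength_m, aperture_m=5e-3):
    """Diffraction-limited angular resolution in degrees (Rayleigh criterion)."""
    return math.degrees(1.22 * wavelength_m / aperture_m)

# Sample wavelengths; the aperture (5 mm pupil) is an assumed round number.
for name, lam in [("green light", 550e-9),
                  ("thermal IR", 10e-6),
                  ("terahertz", 300e-6),
                  ("1 cm microwave", 1e-2)]:
    print(f"{name:15s} {angular_resolution_deg(lam):10.4f} deg")
```

    Visible light resolves thousandths of a degree; at 1 cm microwaves the "resolution" exceeds the whole field of view, which matches the point that an eye-sized sensor simply cannot image structure at those wavelengths.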
  9. mfb

    Staff: Mentor

    Right, the spatial resolution for longer wavelengths would be worse compared to visible light. On the other hand, do you need that resolution?
  10. "Need" is a bit ambiguous here. From a technological wish-list point of view, you would presumably want the "eye" as versatile and robust as possible. Ideally an artificial eye better than the real thing. Long-focus capability would be particularly cool but this could be approached digitally to save space. This approach, however, would require greater resolution. As regards long wave radiation, a certain minimum resolution would be needed to make the eye useful, i.e., if you wanted to "see" in those wavelengths. Another problem occurs to me. The human eye, in the visible spectrum, perceives the world in terms of color (corresponding to wavelength) and the quality of light (bright versus dark, corresponding to the intensity of light). Radiation beyond the visible spectrum would require the perception and appreciation of "new" colors. This "perception and appreciation" does not, of course, take place in the eye but is a result of the brain's interpretation of stimuli generated by the eye. The eye itself, which in its natural form is really just a glorified light collection device, does very little in the way of interpreting data. In the perception of radiation beyond the visible spectrum, this could be a non-starter.
  11. mfb

    Staff: Mentor

    Well, that's not just a limitation of the receiver; it is a limitation of the emitters as well. At least indoors, you would not see anything interesting with radio waves, for example.
    For coherent radiation, it is possible to detect the direction with multiple smaller receivers inside the eye.

    Well, I think you would have some way to control your eyes with the brain, like "okay, now give me an overlay of UV shown in blue".
  12. I was imagining a seamless, all-in-one device that would allow simultaneous perception of all wavelengths. But remapping the non-visible spectrum onto the visible one - reusing our existing color perception - might be the only plausible workaround.
  13. First a technical note: We don't really get all the information there is from the visible spectrum. Instead, we see the combination of three weighted bands.

    I am not going to deal in the mechanical practicalities.

    Here is scenario 1: you only get to see one set of wavelengths at a time, but you're wired up to switch from one set to any other as easily as glancing from one room to the next. In this case, there is no hard problem. This would be just like looking through glasses that could project whatever you wanted.

    Here is scenario 2: all the sensors are wired up at the same time. This is a big problem. Much of the visual cortex is allotted on a per-"pixel" basis, so more pixels would mean more cortex is required.
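    The "three weighted bands" point above can be sketched in a few lines: the eye collapses an entire spectrum into just three numbers by weighting it with three sensitivity curves. The Gaussian bands below are illustrative stand-ins, not real cone response functions.

```python
import math

def band(center_nm, width_nm):
    """A Gaussian sensitivity curve -- a crude stand-in for a cone response."""
    return lambda w: math.exp(-((w - center_nm) / width_nm) ** 2)

# Rough S/M/L-like bands (centers and widths are invented for illustration).
BANDS = [band(440, 30), band(540, 40), band(570, 40)]

def tristimulus(spectrum):
    """Collapse a {wavelength_nm: intensity} spectrum into three band responses."""
    return tuple(round(sum(i * b(w) for w, i in spectrum.items()), 3)
                 for b in BANDS)

# However rich the input spectrum, only three numbers survive -- which is why
# physically different spectra can look identical, and why a display with
# three primaries can imitate most colors we see.
print(tristimulus({580: 1.0}))
print(tristimulus({500: 0.4, 560: 0.8, 610: 0.3}))
```

    This is the information loss the post refers to: any extra bands a bionic eye adds would either need new channels into the cortex or would have to be folded into these same three.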