
Brain: Visual vs. Auditory

  1. Jun 8, 2010 #1
    First I'd like to thank some frequent contributors to brain topics in the Physics Forums:

    (in unbiased alphabetic order)


    Some great discussions on brain topics here! Based on frequency of the word "brain" in subject headings I would say it's the most popular topic in the Medical Sciences section. Does anyone know if serious collaboration ever originated in Physics Forums? It would be pretty cool!

    getting to the point

    I find it remarkable that vision and sound are such vastly different conscious experiences, yet both are described by neuronal communication.

    I'm expecting the consensus is "No one knows why this is", so it would be helpful if anyone had links/references regarding differences between neuronal activity in visual vs. auditory systems (e.g. firing patterns, chemical properties, etc).

    Just a first guess is that perception is vastly different because signaling in the visual system occurs much faster than in the auditory system (since cells are interpreting light frequencies vs. sound frequencies). But that's a strictly intuitive argument, hence the request for more sound research. Any help would be greatly appreciated.

    Thank you.
  3. Jun 8, 2010 #2



    Staff: Mentor

    We tried a "mind and brain" forum but no one posted to it, so it was scrapped.

    Medical problems would go in Medical. Discussions of the brain and/or neurology should go in biology.
  4. Jun 8, 2010 #3
    That doesn't sound right. The signal comes either from a nerve positioned so as to be triggered by sound in a range of frequencies, or positioned so as to be triggered by light in a range of colours. But the frequency of the neural signal (I take it you mean how often a neuron fires) has nothing to do with the wave frequency that the sense corresponds to. It is nothing like a digital oscilloscope (which really does track amplitudes with fine time resolution).

    Dunno, but PF generally forbids discussion of unpublished research, frowns on highly active threads, and usually locks any topic the admins don't already understand, so it's probably only conducive to collaboration among the admins. By design, PF is never more than it was explicitly intended to be.
    Last edited: Jun 8, 2010
  5. Jun 9, 2010 #4
    The important difference doesn't seem to be neuronal activity so much as cytoarchitecture. Brodmann found that the organization of the cells was different in different dedicated areas:

    So, they have, at least, found that the "architecture" of the cells plays an important role in how the signal is experienced.
  6. Jun 9, 2010 #5


    Gold Member

    As said, in general, all brain cells react much the same and so the qualitative differences must be down to the organisation of the processing. Visual processing is retinotopic - the mapping of what the eyes see to the primary visual cortex preserves the spatial information. Aural processing is tonotopic - the way the cochlea breaks down response into frequency is preserved, so that high tones are dealt with at one end, low at the other.
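
    As an aside, the tonotopic mapping is well enough characterised to sketch numerically. Greenwood's function is a standard empirical fit relating position along the human cochlea to the frequency that position responds to (the constants below are the commonly quoted human values; this is a rough sketch of the frequency-to-place map only, not of any neural processing):

```python
import math

# Greenwood's function maps relative position along the cochlea
# (x = 0 at the apex, x = 1 at the base) to characteristic frequency
# in Hz, using the commonly quoted human constants.
A, a, k = 165.4, 2.1, 0.88

def frequency_at(x):
    """Characteristic frequency (Hz) at relative cochlear position x."""
    return A * (10 ** (a * x) - k)

def position_of(f):
    """Inverse map: the relative cochlear position tuned to frequency f (Hz)."""
    return math.log10(f / A + k) / a

# Low tones map near the apex, high tones near the base:
for f in (125, 1000, 8000):
    print(f"{f:>5} Hz -> x = {position_of(f):.2f}")
```

    Running the loop shows 125 Hz landing near the apex (x ≈ 0.10) and 8000 Hz near the base (x ≈ 0.81), i.e. the frequency information is preserved as a place code, which is what "tonotopic" means above.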

    So neurons are general components, and the processing in the brain physically preserves some of the felt structure of the stimulus, which is at least a start on explaining why similar components can generate different experiences.

    Then each stream of sensory processing actually involves hierarchies of "processing modules". For vision, you have at least 30, doing things like representing motion, colour, etc. For hearing, there are far fewer. And so vision feels like a richer experience than hearing: there are more kinds of analysis going on.

    A final point would be that what we actually experience is a mental construction. Colour for example is just an invention, not some literal transduction of the stimulus. So even though the general spatial and temporal structure of the world is being preserved in the processing, in the end, we still have a long way to go to be able to say why red IS red, or the chime of a bell is how we experience it.

    We can look at the brain circuits and predict something about the nature of the experience. This is what people are trying to do right now with things like bat echolocation - go from the neural architecture to some sense of what it is like to be a bat seeing via sound. But it does not seem likely that we can get a complete sense of explanation from this bottom-up (from neurons to experience) kind of account.
  7. Jun 11, 2010 #6


    Staff Emeritus
    Science Advisor
    Gold Member

    Processing of the two sensory systems happens in completely different parts of the brain.

    The eyes actually develop from the nervous system itself. The optic nerves (carrying the visual information from the eye) synapse in a part of the thalamus called the lateral geniculate nuclei. The post-synaptic neurons follow a few different pathways, but the primary one for visual information extends along what are called the optic radiations to the occipital lobe, the lobe of the cortex all the way at the back of the brain. (The information about light that is involved in things like pupil reflexes and synchronizing circadian rhythms follows other paths, but I don't think those are the ones you're asking about.)

    Auditory information travels along a different cranial nerve that first enters the brainstem at the cochlear nucleus (in the pons). It has a more indirect pathway than the visual system, so it does a few hops, skips and jumps through synapses of neurons in the superior olive (near the junction of pons and medulla) to the inferior colliculus in the midbrain, then up to the medial geniculate bodies of the thalamus (next to the lateral geniculate bodies where visual information first enters), and then ends up being processed in the temporal lobe of the cortex, which runs along the side and bottom of the brain.

    You can readily find pictures of the lobes of the cortex all over the internet to see where the occipital and temporal lobes are. The part of the temporal lobe that receives auditory information is right under the parietal lobe if you look it up, tucked into what you'll find labeled as the lateral fissure.

    As for your question about collaborations among PFers, those of us who are actually doing research in neuroscience are all studying vastly different subjects, and others are not actually neuroscientists but have a keen interest in the subject and have read a lot without actively doing any research in it. So, no, we're not collaborating on any projects; we just enjoy picking each others' brains from time to time. :wink:
  8. Jun 12, 2010 #7


    Gold Member

    I understand that the OP asks about differences in the experiences of sight and hearing resulting from external stimuli, but since I mentioned it recently (post #176 of the Synesthesia Thread in Medical Sciences), I understand the McGurk effect may be an example of visual stimuli affecting heard experiences.


    (I picked this version because it had a fair amount of explanation, regardless of what else is said.)
  9. Jun 12, 2010 #8


    Science Advisor

    Some very rough, exaggerated ideas, just to illustrate the range of possibilities:

    1) The brain is all the same, but the relationship between "objects in the real world" (things that you interact with via actions) and the way they can be inferred via visual or auditory signals on the retina or cochlea is different. If you wired visual signals to the auditory cortex, it would end up like visual cortex.

    I like this line of thinking about how we use audition to infer what an "object" is: http://www.staff.ncl.ac.uk/t.d.griffiths/Griffiths_NRN_2004.pdf. You can try the same sort of logic on images to see what you get; this is the sort of thing people in computer vision do: http://iris.usc.edu/Vision-Notes/bibliography/contents.html.

    2) Or maybe the brain areas really are different, so even if you sent the "same" signals to those areas, you would get different experiences.
    http://www.ncbi.nlm.nih.gov/pubmed/11694887 (this paper compares audition and touch, not audition and vision, but the idea is the same)
    Last edited by a moderator: May 4, 2017
  10. Jun 13, 2010 #9
    Hehehehe. The ferrets looked like they were having a visual experience.
  11. Jun 13, 2010 #10


    Science Advisor

    Last edited by a moderator: May 4, 2017
  12. Jun 13, 2010 #11


    Gold Member

    Last edited by a moderator: May 4, 2017
  13. Jun 13, 2010 #12
    What's it like, or what does it look like?

    Very interesting and mysterious. One of the people I met by accident at the cafe happens to be a neurologist specializing in vestibular disorders. Next time he comes in I'm going to ask if he's heard about this.
    Last edited by a moderator: May 4, 2017
  14. Jun 13, 2010 #13


    Science Advisor

    What's it feel like? I.e., did the visual stimuli look visual, or did they sound like sounds? Of course we can't know directly, but presumably the verbal responses of people who've experienced sensory substitution would be informative.
    Last edited: Jun 13, 2010
  15. Jun 13, 2010 #14


    Gold Member

    One field you might want to study in depth is neuroplasticity, and how the visual and auditory cortices can compensate for one another in the event of blindness or deafness.

    Here's a link to the Neurosciences division of the U of O where all the primary research has been done with regard to neuroplasticity:


    I've given you the publications page and recommend this PDF on that page...

    Neville, H. and Sur, M. (2009). Neuroplasticity. In M. Gazzaniga (ed), The Cognitive Neurosciences IV, MIT Press, Cambridge, pp. 89-90. [pdf]
  16. Jun 14, 2010 #15
    Yes, but in the case of the vestibular disorders the people's brains weren't literally rewired like the ferrets. Learning to use sensations on the tongue to keep your balance would, I think, be like any indirect control: watching a speedometer for example. The article mentions the analogy of having someone following you with a finger touching the center of the top of your head. If you tilt you can feel the position of the finger touch change and correct to bring it back to center. I didn't get the impression from the article that normal balance had returned, just that, for some mysterious reason, the tongue-balance effect lasted after the device was removed.
  17. Jun 14, 2010 #16
    One thing to note here is that the apparent use of the visual cortex by blind people, which will show up on a brain scan, isn't necessarily for auditory purposes. Sacks reports in Musicophilia that blind people actually have masses of unformed, elementary visual experiences. They can't see, but they have an abstract visual world, rather like synesthesia. Likewise, but also contrariwise, people who go deaf often experience musical hallucinations: the non-stop experience of vividly hearing music. Not an abstract world of sounds, but repetitious, stereotyped hallucinations. Then there is the phantom limb phenomenon, where the parts of the cortex formerly responsible for the movement and sensation of the missing limb continue to behave as if the limb were still there. The person, for all intents and purposes, feels that it is, contradicted only by the sight of it being missing and the failure of external objects to react to actions made by the phantom.

    So, it seems that dedicated brain areas actually resist performing non-dedicated functions. Plasticity is most evident when a dedicated area becomes much better at its dedicated function, as seen in the remarkable hearing of people with Williams syndrome, who are born with underdeveloped occipital lobes. There's a hardware cap on their visual abilities, but their hearing can become almost superhuman; indeed, most of them suffer from hyperacusis. Scans show exceptionally active temporal lobes.
  18. Jun 18, 2010 #17


    Gold Member

    I understand humans have multisensory perception, with benefits like added confirmation and precision. I've read that gustatory and olfactory sensations are difficult to distinguish between, and are particularly old senses, both with important benefits in the detection of food and the interpretation of its quality. Benefits of audio-visual sensory integration would include precise localisation of a stimulus.

    This book seems a good overview from 2004. Chapter 2, on page 27, is about audio-visual perception in particular:

    http://books.google.co.uk/books?id=...A#v=onepage&q=stein cell multisensory&f=false

    Parts of the book describe cross-modality, heteromodality, multisensory neurons, synaesthesia, etc. This is an example of the papers mentioned:

    I had mentioned the McGurk effect before, and understand that the YouTube video I presented described a typical response, but reactions vary and may hypothetically depend on individual modal strengths and weaknesses. I don't have that response (and can think of a reason why that may be the case).
    Last edited by a moderator: Sep 25, 2014
  19. Jun 20, 2010 #18


    Science Advisor

    Yes, it is indirect control. But what does it "feel" like? After all, the best musicians surely feel that their instruments are part of themselves, else how could they truly "speak" through them?
  20. Jun 21, 2010 #19
    I don't know where you're going with this. The OP is wondering why some populations of neurons present an "auditory" experience to consciousness and others present a "visual" experience. Becoming proficient with a musical instrument, or any similar skill, doesn't change that, that I'm aware of.
  21. Jun 21, 2010 #20


    Science Advisor

    The analogy being:

    What makes a thing feel "visual" or "auditory"?

    What makes a thing feel "external" or "part of you"?