Human ear perception of sound

  • #1
bassplayer142
How exactly does your mind tell where a sound is coming from? If you hear something next to you, you can tell whether it is on the right or the left. I can't understand how sound compression waves alone could convey that.
 
  • #2
According to this paper... "the eyes have it".

Eye-position effects in directional hearing.

The influence of gaze direction on azimuthal sound localization was investigated by presenting free-field acoustical stimuli in combination with a visual fixation task. In Experiment 1, a two-alternative forced-choice method was employed. While fixating visual targets, subjects judged whether noise bursts, presented from various directions, were perceived as being on the left or right of either a visual reference indicating straight ahead or the subjective straight-ahead direction. The psychometric functions measured with the first task shifted consistently opposite to the direction of eccentric gaze, i.e., the location of the auditory stimulus was perceived as shifted toward the direction of gaze. The mean magnitude of the shift was 4.7 degrees over a range of fixation angles up to 45 degrees on either side. Without an external reference indicating straight ahead, shifts of sound localization were inconsistent, either opposite or toward the direction of fixation in individual subjects. In Experiment 2, subjects orientated their head toward sound stimuli while fixating visual targets in various directions. As in Experiment 1, head position as a measure of sound localization shifted significantly toward the direction of eccentric gaze when a visual reference of the head median plane was present, and the results were inconsistent across subjects when it was absent. The results indicate a significant effect of gaze direction on the spatial agreement of auditory and visual perception which may be based on the superposition of distinct auditory and visual eye-position effects. The effect is in agreement with previous neurophysiological results that have suggested an incomplete neural transformation of auditory spatial coordinates from a craniocentric into an oculocentric frame of reference.

http://www.ncbi.nlm.nih.gov/sites/entrez?db=pubmed&list_uids=9331472&cmd=Retrieve&indexed=google
 
  • #3
There are two theories: phase and loudness.
It may be simply that the loudness in the left ear is greater than in the right, indicating that the sound is to the left. The tone of a familiar sound will suggest whether it is in front or behind, and the brain also uses reasoning to figure out where the source is likely to be.
This explains why it is difficult to locate low-frequency sound sources: low-frequency sound waves easily travel around corners, so the sound arrives at nearly the same volume at each ear.
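
To make the loudness idea concrete, here is a minimal sketch (not a model of the auditory system; the function names and the 1 dB threshold are my own illustrative choices) that guesses left vs. right purely from the interaural level difference of two sampled signals:

```python
import numpy as np

def rms(x):
    """Root-mean-square level of a sampled signal."""
    return np.sqrt(np.mean(x ** 2))

def side_from_level(left, right, threshold_db=1.0):
    """Guess the side of a source from the interaural level difference.

    left, right: sampled pressure signals at each ear.
    threshold_db: below this difference, call it centred (an arbitrary
    illustrative value; real just-noticeable ILDs depend on frequency).
    """
    ild_db = 20 * np.log10(rms(left) / rms(right))
    if abs(ild_db) < threshold_db:
        return "centred or ambiguous (e.g. low frequency, no head shadow)"
    return "left" if ild_db > 0 else "right"

# Toy demo: the same 2 kHz tone, at half the pressure in the right ear.
t = np.linspace(0, 0.01, 480, endpoint=False)
tone = np.sin(2 * np.pi * 2000 * t)
print(side_from_level(tone, 0.5 * tone))  # halving pressure = -6 dB -> "left"
```

Note that when the level really is equal at both ears, as in the low-frequency case just described, the comparison is uninformative, which is exactly the limitation of this cue.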

The other theory is phase. When the sounds at each ear are in phase, the sound is likely to be directly ahead, behind, above, or below. When the phase at one ear is marginally ahead, it means that that ear is nearer to the sound. ("Likely to be" because if the wavelength is too short it wouldn't work; the signals could be out of phase by more than one whole cycle.)
This theory has a difficulty: neurons can only fire about 2000 times a second at most, and the phase difference can be less than 1/1600th of a second, finer than a single neuron's firing can resolve. There are a couple of theories about how this may be possible, both similar, one by Jeffress and one by an important computer scientist called Licklider. They suggest two parallel delay lines of synapses, one having longer lengths of axon which slow down the signal from one ear, a bit like a wonky ladder. Only a pair of neurons receiving signals in phase will generate an action potential, so identifying the rung of the ladder with matched neurons at either end identifies the phase difference.
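
Here is a minimal computational sketch of that delay-line idea, assuming ideal sampled signals at each ear (this is just discretised cross-correlation over physiologically plausible delays, not a neuron model; all names and the 660 µs cap are my own choices):

```python
import numpy as np

def estimate_itd(left, right, fs, max_itd_s=660e-6):
    """Jeffress-style sketch: try each candidate delay (each "rung of
    the ladder") and keep the one where the two ear signals coincide
    best.  A positive result means the sound reached the left ear first.
    """
    max_lag = int(max_itd_s * fs)  # ~660 us: roughly the widest delay a human head produces

    def score(lag):
        # Coincidence score for "left ear leads by `lag` samples":
        # delay the left signal by `lag` and correlate it with the right.
        if lag >= 0:
            a, b = left[:len(left) - lag], right[lag:]
        else:
            a, b = left[-lag:], right[:lag]
        return np.dot(a, b)

    best = max(range(-max_lag, max_lag + 1), key=score)
    return best / fs

# Toy demo: a 500 Hz tone arriving 300 us earlier at the left ear.
fs = 48_000
t = np.arange(0, 0.05, 1 / fs)
left = np.sin(2 * np.pi * 500 * t)
right = np.sin(2 * np.pi * 500 * (t - 300e-6))
print(f"estimated ITD: {estimate_itd(left, right, fs) * 1e6:.0f} microseconds")
```

Running it prints an estimate within one sample (about 21 µs at 48 kHz) of the true 300 µs delay: the winning "rung" is the one whose built-in delay best cancels the arrival-time difference.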

I personally think the first theory about loudness is adequate to explain sound localisation.

Currently I can vaguely hear a neighbour sanding rust from his car. My first impression is that I already know the direction the sound is coming from, and my second thought is that this observation is consistent with what I can hear in both ears. He is outdoors; I am indoors near a window. If I were blindfolded and didn't know where I was or where he was, I would have difficulty identifying the direction the sound was coming from.
 
  • #4
I had a physics professor who had no [or limited] perception of sound direction due to hearing loss from artillery fire in WWII.

I joked in class that a large pair of parabolic reflectors mounted on the back of his head might help, which, luckily, he found amusing. But the point is that he believed it was a function of loudness: he thought the parabolic reflectors might work!
 
  • #5
tonyjeffs said:
There are two theories: phase and loudness. [...] I personally think the first theory about loudness is adequate to explain sound localisation.

The evidence I've seen better supports the idea that it's distortion of the sound wave that's detected. As sound waves pass around the head and are reflected off the body, they are slightly distorted, so the ear closer to the sound receives a slightly different wave pattern than the ear farther from it.

Note, however, that many studies on perception of direction using acoustics are performed in flies or other insects (it's relatively easy to stick them on a trackball and objectively record the direction they run in response to sounds of different qualities and directions, or when stimulating specific neurons), and their anatomy is distinctly different from ours, which may alter what characteristics of sound they use for direction-finding. Instead of two ears containing separate membranes (eardrums) for detecting vibration, they have an organ on the thorax shaped more like a sheet of paper folded in half, and the difference in vibration between the two halves helps them determine direction. Because these halves are SO close together, especially relative to the spacing of ears on larger animals, insects are more likely detecting phase differences than differences in loudness or spectral distortion.
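
To illustrate the distortion cue in isolation, here is a rough sketch (the moving-average "head shadow", the band edges, and all names are invented for illustration, not measured head-related transfer functions) comparing the high-to-low frequency balance at a "near" and a "far" ear:

```python
import numpy as np

def band_level(x, fs, lo, hi):
    """Energy (dB) of signal x in the band [lo, hi) Hz, via the FFT."""
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    band = (freqs >= lo) & (freqs < hi)
    return 10 * np.log10(np.sum(spectrum[band]))

# Toy "head shadow": the far ear hears the same noise burst with the
# highs attenuated (a crude stand-in for the head's filtering).
fs = 48_000
rng = np.random.default_rng(0)
near = rng.standard_normal(4800)              # broadband noise burst at the near ear
kernel = np.ones(8) / 8                       # crude low-pass = moving average
far = np.convolve(near, kernel, mode="same")  # far ear: highs smoothed away

for name, ear in [("near", near), ("far", far)]:
    tilt = band_level(ear, fs, 4000, 12000) - band_level(ear, fs, 100, 1000)
    print(f"{name} ear high/low spectral tilt: {tilt:+.1f} dB")
```

The near ear keeps its high-frequency energy while the far ear's spectrum is tilted downward, and that difference in tilt is the kind of asymmetry a brain could exploit.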
 
  • #6
The right and left ears actually process sound differently before it reaches the brain, at least according to relatively recent research. Like muscle memory, the right ear usually processes the spoken word better than the left, and the left processes things like music better than the right, as you might expect from split-brain theory. In some cases this can cause a noticeable difference in how well a person can triangulate where a sound is coming from, depending on the type of sound it is.

Another recent discovery is that the human brain functions somewhat like an FM receiver: it compares incoming signals against its own self-generated carrier waves to derive their differential. This provides a much faster and more accurate response than exhaustively analyzing a signal would. In the case of sound, the carrier wave moves from side to side in the brain; for sight, front to back; for touch, top to bottom; and so on.

In other words, there are a lot of variables involved and the last word has yet to be said on the subject.
 

1. What is the human ear's range of perception for sound?

The human ear can typically perceive sound frequencies from 20 Hz to 20,000 Hz, although this range varies from person to person and shrinks with age or hearing loss.
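
For reference (a standard back-of-the-envelope calculation, taking the speed of sound in air as roughly 343 m/s), the wavelengths at the edges of this range follow from $$\lambda = \frac{c}{f}:\qquad \lambda_{20\,\text{Hz}} = \frac{343}{20} \approx 17\ \text{m},\qquad \lambda_{20\,\text{kHz}} = \frac{343}{20\,000} \approx 1.7\ \text{cm}.$$ Wavelengths comparable to the size of the head are roughly where the loudness and phase cues discussed above trade off.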

2. How does the human ear perceive loudness?

The human ear perceives loudness through the intensity, or amplitude, of sound waves: the greater the amplitude, the louder the sound is perceived to be.
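
To put numbers on this, sound pressure level is conventionally measured in decibels relative to a reference pressure: $$L_p = 20 \log_{10}\!\left(\frac{p}{p_0}\right)\ \text{dB},\qquad p_0 = 20\ \mu\text{Pa},$$ so doubling the sound pressure adds about 6 dB.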

3. How does the human ear distinguish between different pitches of sound?

The human ear distinguishes between different pitches of sound through the frequency of sound waves. Higher frequencies correspond to higher pitches, while lower frequencies correspond to lower pitches.
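
As a concrete example, concert pitch A4 is 440 Hz; doubling the frequency to 880 Hz raises the pitch by exactly one octave (A5). In equal temperament, a note $n$ semitones above A4 has frequency $$f = 440 \times 2^{n/12}\ \text{Hz},$$ so each semitone is a frequency ratio of about 1.0595.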

4. How does the human ear localize sound?

The human ear is able to localize sound through a combination of factors, including differences in sound arrival time, sound intensity, and the shape of the ear. This allows us to determine the direction and distance of a sound source.
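
For the arrival-time cue specifically, a classic spherical-head approximation (often attributed to Woodworth) estimates the interaural time difference for a source at azimuth $\theta$ as $$\Delta t = \frac{a}{c}\left(\theta + \sin\theta\right),$$ where $a$ is the head radius and $c$ the speed of sound. Taking $a \approx 8.75$ cm and $c \approx 343$ m/s, a source directly to one side ($\theta = \pi/2$) gives $\Delta t \approx 0.66$ ms, the largest delay a human head produces.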

5. How does the human ear perceive sound quality?

The perception of sound quality is influenced by various factors, including pitch, loudness, and timbre. Timbre refers to the unique characteristics of a sound, such as its tone or quality, which allows us to distinguish between different instruments or voices.
