How do you test the accuracy of a microphone?

  • Thread starter: BeautifulLight
  • Tags: Accuracy Test
AI Thread Summary
Testing the accuracy of a microphone can be done with a signal/tone generator and an oscilloscope, measuring the output at specific frequencies and amplitudes. This allows several microphones to be compared to see how faithfully each captures the test signal. The subjective nature of sound perception complicates the definition of accuracy, however, since it depends on the listener's experience during the recording. The discussion distinguishes between the technical signal a recording captures and the personal auditory experience of the recording; both objective measurements and subjective perception matter when evaluating microphone accuracy.
BeautifulLight
With another microphone? And how do you test the accuracy of that microphone? With yet another microphone? ...and on and on we go.

I suppose you're probably thinking to yourself, "Well, that's easy, BeautifulLight, just set up an experiment using a signal/tone generator, a couple of microphones, and an oscilloscope! Use the signal/tone generator to output a unique signal, let's say X amplitude at Y frequency, 'into' each microphone. With the aid of the oscilloscope, you should be able to check that each microphone is indeed picking up X amplitude at Y frequency. If all microphones check out, then all microphones are accurate." Okay, so maybe I'm going to get some crap about which variables I should be testing, in this case frequency and amplitude, but you know what I mean :smile:
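(For what it's worth, that check translates pretty directly into software. Below is only a minimal sketch, assuming each microphone's output has already been digitized; the "captures" are synthesized sine waves standing in for real recordings, and the helper name tone_amplitude is mine, not any standard API.)

Code:
# Minimal sketch of the proposed check, done in software instead of on an
# oscilloscope screen: estimate "X amplitude at Y frequency" from each capture.
import numpy as np

def tone_amplitude(capture, sample_rate, test_freq):
    """Estimate the amplitude of a single test tone in a captured signal."""
    window = np.hanning(len(capture))
    spectrum = np.fft.rfft(capture * window)
    freqs = np.fft.rfftfreq(len(capture), d=1.0 / sample_rate)
    nearest_bin = np.argmin(np.abs(freqs - test_freq))
    # Scale so a sine of amplitude A reports roughly A.
    return 2.0 * np.abs(spectrum[nearest_bin]) / np.sum(window)

# Example: two microphones "hearing" the same 1 kHz tone from a speaker.
sample_rate = 48_000
t = np.arange(sample_rate) / sample_rate       # one second of samples
mic_a = 0.50 * np.sin(2 * np.pi * 1000 * t)    # idealized capture
mic_b = 0.45 * np.sin(2 * np.pi * 1000 * t) + 0.01 * np.random.randn(len(t))  # a bit low, noisy

for name, capture in [("mic A", mic_a), ("mic B", mic_b)]:
    print(name, round(tone_amplitude(capture, sample_rate, 1000.0), 3))

If both microphones report an amplitude close to what the speaker was driven with (and close to each other), they pass at that frequency; repeat across the audible band and you have a crude frequency-response check.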


Is this correct?


I do believe you can check the validity of a microphone using frequency, amplitude, etc. These are elements of the objective world (constructions of science) and therefore can be tested. However, sound (as with light) is a construction of a subjective perceptual process (think of pitch and how it differs from frequency), so in my opinion, the only ideal reference signals we have are the ones picked up by the two funny-looking things on either side of your head.


If you are curious, this thread arises from the question of whether or not accurately reproducing a recording is subjective. You need to be really careful how you define "recording." You can define the recording as the unique signal your microphone picked up, OR you can define it as how the song sounded to YOU during that initial recording (just assume you sat in on the recording session).



Thoughts? I know there are a few EEs on here. Maybe someone who has worked in the music industry can elaborate on what variables come into play when testing the accuracy of a microphone. *I assume signal/tone generators aren't plugged directly into microphones; of course the mic would then report X amplitude at Y frequency, because you're forcing it to. A microphone isn't a speaker, so the test signal has to reach it acoustically.
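(In practice the acoustic link usually goes generator → loudspeaker → both a calibrated reference microphone and the mic under test, and you compare the two captures. Here is a rough, hedged sketch of that substitution-style comparison; the captures are simulated rather than recorded, and the gains are made-up numbers, not measurements.)

Code:
# Rough sketch of a substitution-style comparison: the same loudspeaker plays
# each test tone, and the device under test is compared against a calibrated
# reference microphone. Captures are simulated; the gains are illustrative.
import numpy as np

def rms(signal):
    """Root-mean-square level of a captured tone."""
    return float(np.sqrt(np.mean(np.square(signal))))

def deviation_db(test_capture, ref_capture):
    """Level of the test mic relative to the reference mic, in dB."""
    return 20.0 * np.log10(rms(test_capture) / rms(ref_capture))

sample_rate = 48_000
t = np.arange(sample_rate // 10) / sample_rate   # 100 ms of samples per tone
for freq, test_gain in [(100, 1.00), (1_000, 0.98), (10_000, 0.79)]:
    tone = np.sin(2 * np.pi * freq * t)
    ref_capture = 1.00 * tone        # reference mic assumed flat
    test_capture = test_gain * tone  # device under test rolls off at the top
    print(f"{freq:>6} Hz: {deviation_db(test_capture, ref_capture):+.1f} dB")

A deviation within some tolerance (say, a couple of dB across the audible band) is typically how "accuracy" gets specified objectively, which also sidesteps the infinite regress in the opening question: eventually you compare against a reference microphone whose response has been calibrated against a physical standard (e.g. a pistonphone or acoustic calibrator), rather than against yet another ordinary microphone.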
 