Sound wave intensity (dB) and sound processing

  • #1
fog37
Hello Forum,

The intensity of a sound wave (pure tone) is proportional to the square of the pressure amplitude, i.e. ##I \propto p_0^2##, where ##p_0## is the amplitude of the pressure wave ##p(x,t) = p_0 \sin(\omega t \pm kx)##. This means that the (gauge) pressure swings above (positive) and below (negative) the local atmospheric pressure ##p_{atm}##.
The sound intensity can also be expressed in dB. In that case, the reference pressure is the pressure amplitude ##p_{min}## associated with the faintest, barely audible sound: $$I\,(\text{dB}) = 10 \log_{10} \frac{p_0^2}{p_{min}^2} = 20 \log_{10} \frac{p_0}{p_{min}}$$
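For instance, taking the standard reference ##p_{min} = 2 \times 10^{-5}## Pa (the usual threshold of hearing; the specific amplitude below is just a number I picked), a pressure amplitude 100 times larger gives $$I\,(\text{dB}) = 20 \log_{10} \frac{2 \times 10^{-3}\ \text{Pa}}{2 \times 10^{-5}\ \text{Pa}} = 20 \log_{10}(100) = 40\ \text{dB}$$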
A microphone connected to sound-processing software converts the impinging sound pressure into an analog voltage signal, which the software displays in terms of its time-varying intensity ##I##. The analog voltage signal is sampled (at a certain sampling rate) and converted to a digital signal with a certain bit depth (8, 16, 24 bits, etc.; the higher the bit depth the better, I guess).
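To make the sampling/quantization step concrete, here is a minimal Python sketch (the tone frequency, sampling rate, and bit depth are arbitrary values I picked for illustration):

```python
import numpy as np

fs = 44100          # sampling rate, Hz (assumed)
bit_depth = 16      # bits per sample (assumed)

t = np.arange(0, 0.01, 1 / fs)           # 10 ms time axis
v = 0.8 * np.sin(2 * np.pi * 1000 * t)   # "analog" voltage, normalized to [-1, 1]

# Mid-tread quantizer: round each sample to the nearest representable level
steps = 2 ** (bit_depth - 1) - 1
v_digital = np.round(v * steps) / steps

print("max quantization error:", np.max(np.abs(v - v_digital)))
```

With 16 bits the error is tiny (about 1.5e-5 of full scale), which is why a higher bit depth means a lower quantization-noise floor.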
When the software reports the sound intensity ##I## as dB values, the dB range goes from a maximum of 0 dB down to negative dB values, because the intensity is not calculated relative to the pressure amplitude ##p_{min}## of a barely audible sound but relative to the loudest (full-scale) sound pressure in the time interval.
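As a sketch of what I think the software is doing (this is my assumption, not something from its documentation), here is how a level relative to full scale, i.e. dBFS, can be computed, which is why the readings come out negative:

```python
import numpy as np

def dbfs(samples):
    """dB of a block of samples relative to a full-scale sine (0 dBFS)."""
    rms = np.sqrt(np.mean(samples ** 2))
    full_scale_rms = 1 / np.sqrt(2)   # RMS of a sine with amplitude 1
    return 20 * np.log10(rms / full_scale_rms)

t = np.arange(0, 0.1, 1 / 44100)
tone = 0.1 * np.sin(2 * np.pi * 440 * t)   # sine at 10% of full scale
print(dbfs(tone))                          # -> -20.0 dBFS
```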

How can I make sure that the dB values the software reports are the same dBs commonly used to express sound loudness (the equation above), so I can get a sense of how loud the signal truly is?
What kind of calibration is needed? I know the microphone and software have gains, etc...

thanks!
 

Answers and Replies

  • #2
tech99
Science Advisor
Gold Member
I presume the software effectively squares the analogue voltage to obtain intensity.
You have to add a fixed number of decibels to your reading to make it run upward from zero. For instance, if you take a measurement in a quiet country house at night, with no traffic or aircraft noise etc., and your reading is -110 dB, then roughly speaking you want the instrument to indicate zero dB, so you need to add 110 dB to the reading.
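To illustrate that fixed-offset idea in code (the 94 dB calibrator level and the -20 dBFS reading are made-up numbers, just for the arithmetic):

```python
REF_SPL_DB = 94.0      # known level of a reference tone, e.g. a 94 dB SPL calibrator (assumed)
READING_DBFS = -20.0   # what the software reports for that same tone (assumed)

OFFSET_DB = REF_SPL_DB - READING_DBFS   # fixed offset, 114 dB in this example

def dbfs_to_spl(dbfs_reading):
    """Convert a software reading (dB re full scale) to an absolute SPL estimate."""
    return dbfs_reading + OFFSET_DB

print(dbfs_to_spl(-40.0))   # -> 74.0 dB SPL, under these assumed numbers
```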
In order to obtain a calibration for the instrument, I think you can only compare it with a professionally calibrated one.
Notice also that some instruments use the dBA scale, which includes a weighting filter to simulate the frequency response of the ear.
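For reference, the A-weighting curve itself is a standard formula (IEC 61672); here is a quick sketch of it, not tied to any particular instrument:

```python
import numpy as np

def a_weighting_db(f):
    """Standard A-weighting in dB at frequency f (Hz), normalized to 0 dB at 1 kHz."""
    f = np.asarray(f, dtype=float)
    ra = (12194**2 * f**4) / (
        (f**2 + 20.6**2)
        * np.sqrt((f**2 + 107.7**2) * (f**2 + 737.9**2))
        * (f**2 + 12194**2)
    )
    return 20 * np.log10(ra) + 2.00

print(a_weighting_db([100, 1000, 10000]))  # ~[-19.1, 0.0, -2.5] dB
```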
 
  • #4
fog37
In general, the job of a microphone is to convert the captured pressure signal ##p(t)## into a voltage signal ##V(t)##.

For fidelity (assuming no distortion and no noise), the shape of the voltage signal ##V(t)## output by the microphone should be exactly the same as the shape of the pressure signal ##p(t)##, up to a scaling factor. That means there must be a linear relationship between pressure and voltage.
Sound systems always have an amplifier between the mic and the software to increase the amplitude of the voltage signal ##V(t)## (which may be too small), but linearity must be preserved or the voltage signal will not faithfully represent the pressure signal ##p(t)##. Is that correct? The human ear works differently and nonlinearly: pressure changes don't produce proportional changes in perceived loudness... but that is a different matter.
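A toy check of that linearity point (the sensitivity value and clipping level are invented for the example):

```python
import numpy as np

S = 0.05   # hypothetical mic sensitivity, volts per pascal
t = np.arange(0, 0.005, 1 / 44100)
p = 2.0 * np.sin(2 * np.pi * 1000 * t)      # pressure signal, Pa

v_linear = S * p                            # linear stage: same shape, just rescaled
v_clipped = np.clip(v_linear, -0.06, 0.06)  # nonlinear stage: clipping changes the shape

print(np.allclose(v_linear / S, p))   # True: undoing the gain recovers p(t) exactly
print(np.allclose(v_clipped / S, p))  # False: the waveform shape was distorted
```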
 
